CHECK: Is CUDA the right version (10)?
Creating model, this may take a second...
tracking anchors
tracking anchors
tracking anchors
tracking anchors
tracking anchors
Model: "retinanet"
__________________________________________________________________________________________________
Layer (type)                          Output Shape               Param #    Connected to
==================================================================================================
input_1 (InputLayer)                  (None, None, None, 3)      0
padding_conv1 (ZeroPadding2D)         (None, None, None, 3)      0          input_1[0][0]
conv1 (Conv2D)                        (None, None, None, 64)     9408       padding_conv1[0][0]
bn_conv1 (BatchNormalization)         (None, None, None, 64)     256        conv1[0][0]
conv1_relu (Activation)               (None, None, None, 64)     0          bn_conv1[0][0]
pool1 (MaxPooling2D)                  (None, None, None, 64)     0          conv1_relu[0][0]
res2a_branch2a (Conv2D)               (None, None, None, 64)     4096       pool1[0][0]
bn2a_branch2a (BatchNormalization)    (None, None, None, 64)     256        res2a_branch2a[0][0]
res2a_branch2a_relu (Activation)      (None, None, None, 64)     0          bn2a_branch2a[0][0]
padding2a_branch2b (ZeroPadding2D)    (None, None, None, 64)     0          res2a_branch2a_relu[0][0]
res2a_branch2b (Conv2D)               (None, None, None, 64)     36864      padding2a_branch2b[0][0]
bn2a_branch2b (BatchNormalization)    (None, None, None, 64)     256        res2a_branch2b[0][0]
res2a_branch2b_relu (Activation)      (None, None, None, 64)     0          bn2a_branch2b[0][0]
res2a_branch2c (Conv2D)               (None, None, None, 256)    16384      res2a_branch2b_relu[0][0]
res2a_branch1 (Conv2D)                (None, None, None, 256)    16384      pool1[0][0]
bn2a_branch2c (BatchNormalization)    (None, None, None, 256)    1024       res2a_branch2c[0][0]
bn2a_branch1 (BatchNormalization)     (None, None, None, 256)    1024       res2a_branch1[0][0]
res2a (Add)                           (None, None, None, 256)    0          bn2a_branch2c[0][0], bn2a_branch1[0][0]
res2a_relu (Activation)               (None, None, None, 256)    0          res2a[0][0]
res2b_branch2a (Conv2D)               (None, None, None, 64)     16384      res2a_relu[0][0]
bn2b_branch2a (BatchNormalization)    (None, None, None, 64)     256        res2b_branch2a[0][0]
res2b_branch2a_relu (Activation)      (None, None, None, 64)     0          bn2b_branch2a[0][0]
padding2b_branch2b (ZeroPadding2D)    (None, None, None, 64)     0          res2b_branch2a_relu[0][0]
res2b_branch2b (Conv2D)               (None, None, None, 64)     36864      padding2b_branch2b[0][0]
bn2b_branch2b (BatchNormalization)    (None, None, None, 64)     256        res2b_branch2b[0][0]
res2b_branch2b_relu (Activation)      (None, None, None, 64)     0          bn2b_branch2b[0][0]
res2b_branch2c (Conv2D)               (None, None, None, 256)    16384      res2b_branch2b_relu[0][0]
bn2b_branch2c (BatchNormalization)    (None, None, None, 256)    1024       res2b_branch2c[0][0]
res2b (Add)                           (None, None, None, 256)    0          bn2b_branch2c[0][0], res2a_relu[0][0]
res2b_relu (Activation)               (None, None, None, 256)    0          res2b[0][0]
res2c_branch2a (Conv2D)               (None, None, None, 64)     16384      res2b_relu[0][0]
bn2c_branch2a (BatchNormalization)    (None, None, None, 64)     256        res2c_branch2a[0][0]
res2c_branch2a_relu (Activation)      (None, None, None, 64)     0          bn2c_branch2a[0][0]
padding2c_branch2b (ZeroPadding2D)    (None, None, None, 64)     0          res2c_branch2a_relu[0][0]
res2c_branch2b (Conv2D)               (None, None, None, 64)     36864      padding2c_branch2b[0][0]
bn2c_branch2b (BatchNormalization)    (None, None, None, 64)     256        res2c_branch2b[0][0]
res2c_branch2b_relu (Activation)      (None, None, None, 64)     0          bn2c_branch2b[0][0]
res2c_branch2c (Conv2D)               (None, None, None, 256)    16384      res2c_branch2b_relu[0][0]
bn2c_branch2c (BatchNormalization)    (None, None, None, 256)    1024       res2c_branch2c[0][0]
res2c (Add)                           (None, None, None, 256)    0          bn2c_branch2c[0][0], res2b_relu[0][0]
res2c_relu (Activation)               (None, None, None, 256)    0          res2c[0][0]
res3a_branch2a (Conv2D)               (None, None, None, 128)    32768      res2c_relu[0][0]
bn3a_branch2a (BatchNormalization)    (None, None, None, 128)    512        res3a_branch2a[0][0]
res3a_branch2a_relu (Activation)      (None, None, None, 128)    0          bn3a_branch2a[0][0]
padding3a_branch2b (ZeroPadding2D)    (None, None, None, 128)    0          res3a_branch2a_relu[0][0]
res3a_branch2b (Conv2D)               (None, None, None, 128)    147456     padding3a_branch2b[0][0]
bn3a_branch2b (BatchNormalization)    (None, None, None, 128)    512        res3a_branch2b[0][0]
res3a_branch2b_relu (Activation)      (None, None, None, 128)    0          bn3a_branch2b[0][0]
res3a_branch2c (Conv2D)               (None, None, None, 512)    65536      res3a_branch2b_relu[0][0]
res3a_branch1 (Conv2D)                (None, None, None, 512)    131072     res2c_relu[0][0]
bn3a_branch2c (BatchNormalization)    (None, None, None, 512)    2048       res3a_branch2c[0][0]
bn3a_branch1 (BatchNormalization)     (None, None, None, 512)    2048       res3a_branch1[0][0]
res3a (Add)                           (None, None, None, 512)    0          bn3a_branch2c[0][0], bn3a_branch1[0][0]
res3a_relu (Activation)               (None, None, None, 512)    0          res3a[0][0]
res3b_branch2a (Conv2D)               (None, None, None, 128)    65536      res3a_relu[0][0]
bn3b_branch2a (BatchNormalization)    (None, None, None, 128)    512        res3b_branch2a[0][0]
res3b_branch2a_relu (Activation)      (None, None, None, 128)    0          bn3b_branch2a[0][0]
padding3b_branch2b (ZeroPadding2D)    (None, None, None, 128)    0          res3b_branch2a_relu[0][0]
res3b_branch2b (Conv2D)               (None, None, None, 128)    147456     padding3b_branch2b[0][0]
bn3b_branch2b (BatchNormalization)    (None, None, None, 128)    512        res3b_branch2b[0][0]
res3b_branch2b_relu (Activation)      (None, None, None, 128)    0          bn3b_branch2b[0][0]
res3b_branch2c (Conv2D)               (None, None, None, 512)    65536      res3b_branch2b_relu[0][0]
bn3b_branch2c (BatchNormalization)    (None, None, None, 512)    2048       res3b_branch2c[0][0]
res3b (Add)                           (None, None, None, 512)    0          bn3b_branch2c[0][0], res3a_relu[0][0]
res3b_relu (Activation)               (None, None, None, 512)    0          res3b[0][0]
res3c_branch2a (Conv2D)               (None, None, None, 128)    65536      res3b_relu[0][0]
bn3c_branch2a (BatchNormalization)    (None, None, None, 128)    512        res3c_branch2a[0][0]
res3c_branch2a_relu (Activation)      (None, None, None, 128)    0          bn3c_branch2a[0][0]
padding3c_branch2b (ZeroPadding2D)    (None, None, None, 128)    0          res3c_branch2a_relu[0][0]
res3c_branch2b (Conv2D)               (None, None, None, 128)    147456     padding3c_branch2b[0][0]
bn3c_branch2b (BatchNormalization)    (None, None, None, 128)    512        res3c_branch2b[0][0]
res3c_branch2b_relu (Activation)      (None, None, None, 128)    0          bn3c_branch2b[0][0]
res3c_branch2c (Conv2D)               (None, None, None, 512)    65536      res3c_branch2b_relu[0][0]
bn3c_branch2c (BatchNormalization)    (None, None, None, 512)    2048       res3c_branch2c[0][0]
res3c (Add)                           (None, None, None, 512)    0          bn3c_branch2c[0][0], res3b_relu[0][0]
res3c_relu (Activation)               (None, None, None, 512)    0          res3c[0][0]
res3d_branch2a (Conv2D)               (None, None, None, 128)    65536      res3c_relu[0][0]
bn3d_branch2a (BatchNormalization)    (None, None, None, 128)    512        res3d_branch2a[0][0]
res3d_branch2a_relu (Activation)      (None, None, None, 128)    0          bn3d_branch2a[0][0]
padding3d_branch2b (ZeroPadding2D)    (None, None, None, 128)    0          res3d_branch2a_relu[0][0]
res3d_branch2b (Conv2D)               (None, None, None, 128)    147456     padding3d_branch2b[0][0]
bn3d_branch2b (BatchNormalization)    (None, None, None, 128)    512        res3d_branch2b[0][0]
res3d_branch2b_relu (Activation)      (None, None, None, 128)    0          bn3d_branch2b[0][0]
res3d_branch2c (Conv2D)               (None, None, None, 512)    65536      res3d_branch2b_relu[0][0]
bn3d_branch2c (BatchNormalization)    (None, None, None, 512)    2048       res3d_branch2c[0][0]
res3d (Add)                           (None, None, None, 512)    0          bn3d_branch2c[0][0], res3c_relu[0][0]
res3d_relu (Activation)               (None, None, None, 512)    0          res3d[0][0]
res4a_branch2a (Conv2D)               (None, None, None, 256)    131072     res3d_relu[0][0]
bn4a_branch2a (BatchNormalization)    (None, None, None, 256)    1024       res4a_branch2a[0][0]
res4a_branch2a_relu (Activation)      (None, None, None, 256)    0          bn4a_branch2a[0][0]
padding4a_branch2b (ZeroPadding2D)    (None, None, None, 256)    0          res4a_branch2a_relu[0][0]
res4a_branch2b (Conv2D)               (None, None, None, 256)    589824     padding4a_branch2b[0][0]
bn4a_branch2b (BatchNormalization)    (None, None, None, 256)    1024       res4a_branch2b[0][0]
res4a_branch2b_relu (Activation)      (None, None, None, 256)    0          bn4a_branch2b[0][0]
res4a_branch2c (Conv2D)               (None, None, None, 1024)   262144     res4a_branch2b_relu[0][0]
res4a_branch1 (Conv2D)                (None, None, None, 1024)   524288     res3d_relu[0][0]
bn4a_branch2c (BatchNormalization)    (None, None, None, 1024)   4096       res4a_branch2c[0][0]
bn4a_branch1 (BatchNormalization)     (None, None, None, 1024)   4096       res4a_branch1[0][0]
res4a (Add)                           (None, None, None, 1024)   0          bn4a_branch2c[0][0], bn4a_branch1[0][0]
res4a_relu (Activation)               (None, None, None, 1024)   0          res4a[0][0]
res4b_branch2a (Conv2D)               (None, None, None, 256)    262144     res4a_relu[0][0]
bn4b_branch2a (BatchNormalization)    (None, None, None, 256)    1024       res4b_branch2a[0][0]
res4b_branch2a_relu (Activation)      (None, None, None, 256)    0          bn4b_branch2a[0][0]
padding4b_branch2b (ZeroPadding2D)    (None, None, None, 256)    0          res4b_branch2a_relu[0][0]
res4b_branch2b (Conv2D)               (None, None, None, 256)    589824     padding4b_branch2b[0][0]
bn4b_branch2b (BatchNormalization)    (None, None, None, 256)    1024       res4b_branch2b[0][0]
res4b_branch2b_relu (Activation)      (None, None, None, 256)    0          bn4b_branch2b[0][0]
res4b_branch2c (Conv2D)               (None, None, None, 1024)   262144     res4b_branch2b_relu[0][0]
bn4b_branch2c (BatchNormalization)    (None, None, None, 1024)   4096       res4b_branch2c[0][0]
res4b (Add)                           (None, None, None, 1024)   0          bn4b_branch2c[0][0], res4a_relu[0][0]
res4b_relu (Activation)               (None, None, None, 1024)   0          res4b[0][0]
res4c_branch2a (Conv2D)               (None, None, None, 256)    262144     res4b_relu[0][0]
bn4c_branch2a (BatchNormalization)    (None, None, None, 256)    1024       res4c_branch2a[0][0]
res4c_branch2a_relu (Activation)      (None, None, None, 256)    0          bn4c_branch2a[0][0]
padding4c_branch2b (ZeroPadding2D)    (None, None, None, 256)    0          res4c_branch2a_relu[0][0]
res4c_branch2b (Conv2D)               (None, None, None, 256)    589824     padding4c_branch2b[0][0]
bn4c_branch2b (BatchNormalization)    (None, None, None, 256)    1024       res4c_branch2b[0][0]
res4c_branch2b_relu (Activation)      (None, None, None, 256)    0          bn4c_branch2b[0][0]
res4c_branch2c (Conv2D)               (None, None, None, 1024)   262144     res4c_branch2b_relu[0][0]
bn4c_branch2c (BatchNormalization)    (None, None, None, 1024)   4096       res4c_branch2c[0][0]
res4c (Add)                           (None, None, None, 1024)   0          bn4c_branch2c[0][0], res4b_relu[0][0]
res4c_relu (Activation)               (None, None, None, 1024)   0          res4c[0][0]
res4d_branch2a (Conv2D)               (None, None, None, 256)    262144     res4c_relu[0][0]
bn4d_branch2a (BatchNormalization)    (None, None, None, 256)    1024       res4d_branch2a[0][0]
res4d_branch2a_relu (Activation)      (None, None, None, 256)    0          bn4d_branch2a[0][0]
padding4d_branch2b (ZeroPadding2D)    (None, None, None, 256)    0          res4d_branch2a_relu[0][0]
res4d_branch2b (Conv2D)               (None, None, None, 256)    589824     padding4d_branch2b[0][0]
bn4d_branch2b (BatchNormalization)    (None, None, None, 256)    1024       res4d_branch2b[0][0]
res4d_branch2b_relu (Activation)      (None, None, None, 256)    0          bn4d_branch2b[0][0]
res4d_branch2c (Conv2D)               (None, None, None, 1024)   262144     res4d_branch2b_relu[0][0]
bn4d_branch2c (BatchNormalization)    (None, None, None, 1024)   4096       res4d_branch2c[0][0]
res4d (Add)                           (None, None, None, 1024)   0          bn4d_branch2c[0][0], res4c_relu[0][0]
res4d_relu (Activation)               (None, None, None, 1024)   0          res4d[0][0]
res4e_branch2a (Conv2D)               (None, None, None, 256)    262144     res4d_relu[0][0]
bn4e_branch2a (BatchNormalization)    (None, None, None, 256)    1024       res4e_branch2a[0][0]
res4e_branch2a_relu (Activation)      (None, None, None, 256)    0          bn4e_branch2a[0][0]
padding4e_branch2b (ZeroPadding2D)    (None, None, None, 256)    0          res4e_branch2a_relu[0][0]
res4e_branch2b (Conv2D)               (None, None, None, 256)    589824     padding4e_branch2b[0][0]
bn4e_branch2b (BatchNormalization)    (None, None, None, 256)    1024       res4e_branch2b[0][0]
res4e_branch2b_relu (Activation)      (None, None, None, 256)    0          bn4e_branch2b[0][0]
res4e_branch2c (Conv2D)               (None, None, None, 1024)   262144     res4e_branch2b_relu[0][0]
bn4e_branch2c (BatchNormalization)    (None, None, None, 1024)   4096       res4e_branch2c[0][0]
res4e (Add)                           (None, None, None, 1024)   0          bn4e_branch2c[0][0], res4d_relu[0][0]
res4e_relu (Activation)               (None, None, None, 1024)   0          res4e[0][0]
res4f_branch2a (Conv2D)               (None, None, None, 256)    262144     res4e_relu[0][0]
bn4f_branch2a (BatchNormalization)    (None, None, None, 256)    1024       res4f_branch2a[0][0]
res4f_branch2a_relu (Activation)      (None, None, None, 256)    0          bn4f_branch2a[0][0]
padding4f_branch2b (ZeroPadding2D)    (None, None, None, 256)    0          res4f_branch2a_relu[0][0]
res4f_branch2b (Conv2D)               (None, None, None, 256)    589824     padding4f_branch2b[0][0]
bn4f_branch2b (BatchNormalization)    (None, None, None, 256)    1024       res4f_branch2b[0][0]
res4f_branch2b_relu (Activation)      (None, None, None, 256)    0          bn4f_branch2b[0][0]
res4f_branch2c (Conv2D)               (None, None, None, 1024)   262144     res4f_branch2b_relu[0][0]
bn4f_branch2c (BatchNormalization)    (None, None, None, 1024)   4096       res4f_branch2c[0][0]
res4f (Add)                           (None, None, None, 1024)   0          bn4f_branch2c[0][0], res4e_relu[0][0]
res4f_relu (Activation)               (None, None, None, 1024)   0          res4f[0][0]
res5a_branch2a (Conv2D)               (None, None, None, 512)    524288     res4f_relu[0][0]
bn5a_branch2a (BatchNormalization)    (None, None, None, 512)    2048       res5a_branch2a[0][0]
res5a_branch2a_relu (Activation)      (None, None, None, 512)    0          bn5a_branch2a[0][0]
padding5a_branch2b (ZeroPadding2D)    (None, None, None, 512)    0          res5a_branch2a_relu[0][0]
res5a_branch2b (Conv2D)               (None, None, None, 512)    2359296    padding5a_branch2b[0][0]
bn5a_branch2b (BatchNormalization)    (None, None, None, 512)    2048       res5a_branch2b[0][0]
res5a_branch2b_relu (Activation)      (None, None, None, 512)    0          bn5a_branch2b[0][0]
res5a_branch2c (Conv2D)               (None, None, None, 2048)   1048576    res5a_branch2b_relu[0][0]
res5a_branch1 (Conv2D)                (None, None, None, 2048)   2097152    res4f_relu[0][0]
bn5a_branch2c (BatchNormalization)    (None, None, None, 2048)   8192       res5a_branch2c[0][0]
bn5a_branch1 (BatchNormalization)     (None, None, None, 2048)   8192       res5a_branch1[0][0]
res5a (Add)                           (None, None, None, 2048)   0          bn5a_branch2c[0][0], bn5a_branch1[0][0]
res5a_relu (Activation)               (None, None, None, 2048)   0          res5a[0][0]
res5b_branch2a (Conv2D)               (None, None, None, 512)    1048576    res5a_relu[0][0]
bn5b_branch2a (BatchNormalization)    (None, None, None, 512)    2048       res5b_branch2a[0][0]
res5b_branch2a_relu (Activation)      (None, None, None, 512)    0          bn5b_branch2a[0][0]
padding5b_branch2b (ZeroPadding2D)    (None, None, None, 512)    0          res5b_branch2a_relu[0][0]
res5b_branch2b (Conv2D)               (None, None, None, 512)    2359296    padding5b_branch2b[0][0]
bn5b_branch2b (BatchNormalization)    (None, None, None, 512)    2048       res5b_branch2b[0][0]
res5b_branch2b_relu (Activation)      (None, None, None, 512)    0          bn5b_branch2b[0][0]
res5b_branch2c (Conv2D)               (None, None, None, 2048)   1048576    res5b_branch2b_relu[0][0]
bn5b_branch2c (BatchNormalization)    (None, None, None, 2048)   8192       res5b_branch2c[0][0]
res5b (Add)                           (None, None, None, 2048)   0          bn5b_branch2c[0][0], res5a_relu[0][0]
res5b_relu (Activation)               (None, None, None, 2048)   0          res5b[0][0]
res5c_branch2a (Conv2D)               (None, None, None, 512)    1048576    res5b_relu[0][0]
bn5c_branch2a (BatchNormalization)    (None, None, None, 512)    2048       res5c_branch2a[0][0]
res5c_branch2a_relu (Activation)      (None, None, None, 512)    0          bn5c_branch2a[0][0]
padding5c_branch2b (ZeroPadding2D)    (None, None, None, 512)    0          res5c_branch2a_relu[0][0]
res5c_branch2b (Conv2D)               (None, None, None, 512)    2359296    padding5c_branch2b[0][0]
bn5c_branch2b (BatchNormalization)    (None, None, None, 512)    2048       res5c_branch2b[0][0]
res5c_branch2b_relu (Activation)      (None, None, None, 512)    0          bn5c_branch2b[0][0]
res5c_branch2c (Conv2D)               (None, None, None, 2048)   1048576    res5c_branch2b_relu[0][0]
bn5c_branch2c (BatchNormalization)    (None, None, None, 2048)   8192       res5c_branch2c[0][0]
res5c (Add)                           (None, None, None, 2048)   0          bn5c_branch2c[0][0], res5b_relu[0][0]
res5c_relu (Activation)               (None, None, None, 2048)   0          res5c[0][0]
C5_reduced (Conv2D)                   (None, None, None, 256)    524544     res5c_relu[0][0]
P5_upsampled (UpsampleLike)           (None, None, None, 256)    0          C5_reduced[0][0], res4f_relu[0][0]
C4_reduced (Conv2D)                   (None, None, None, 256)    262400     res4f_relu[0][0]
P4_merged (Add)                       (None, None, None, 256)    0          P5_upsampled[0][0], C4_reduced[0][0]
P4_upsampled (UpsampleLike)           (None, None, None, 256)    0          P4_merged[0][0], res3d_relu[0][0]
C3_reduced (Conv2D)                   (None, None, None, 256)    131328     res3d_relu[0][0]
P6 (Conv2D)                           (None, None, None, 256)    4718848    res5c_relu[0][0]
P3_merged (Add)                       (None, None, None, 256)    0          P4_upsampled[0][0], C3_reduced[0][0]
C6_relu (Activation)                  (None, None, None, 256)    0          P6[0][0]
P3 (Conv2D)                           (None, None, None, 256)    590080     P3_merged[0][0]
P4 (Conv2D)                           (None, None, None, 256)    590080     P4_merged[0][0]
P5 (Conv2D)                           (None, None, None, 256)    590080     C5_reduced[0][0]
P7 (Conv2D)                           (None, None, None, 256)    590080     C6_relu[0][0]
regression_submodel (Model)           (None, None, 4)            2443300    P3[0][0], P4[0][0], P5[0][0], P6[0][0], P7[0][0]
classification_submodel (Model)       (None, None, 1)            2381065    P3[0][0], P4[0][0], P5[0][0], P6[0][0], P7[0][0]
regression (Concatenate)              (None, None, 4)            0          regression_submodel[1][0], regression_submodel[2][0], regression_submodel[3][0], regression_submodel[4][0], regression_submodel[5][0]
classification (Concatenate)          (None, None, 1)            0          classification_submodel[1][0], classification_submodel[2][0], classification_submodel[3][0], classification_submodel[4][0], classification_submodel[5][0]
==================================================================================================
Total params: 36,382,957
Trainable params: 36,276,717
Non-trainable params: 106,240
__________________________________________________________________________________________________
None
Epoch 1/150
  1/500 [..............................]
- ETA: 41:41 - loss: 18344.0664 - regression_loss: 152.3998 - classification_loss: 18191.6660
  2/500 [..............................] - ETA: 21:42 - loss: 10988.3342 - regression_loss: 220.8305 - classification_loss: 10767.5039
  3/500 [..............................] - ETA: 15:01 - loss: 8778.5438 - regression_loss: 226.3358 - classification_loss: 8552.2080
  4/500 [..............................] - ETA: 11:41 - loss: 7019.8373 - regression_loss: 247.8450 - classification_loss: 6771.9927
  5/500 [..............................] - ETA: 9:40 - loss: 6113.0339 - regression_loss: 248.2350 - classification_loss: 5864.7993
[... carriage-return progress updates for batches 6/500 through 253/500 elided; the running loss decreases steadily from ~6183 to ~618 and is dominated throughout by classification_loss ...]
254/500 [==============>...............] - ETA: 59s - loss: 615.4945 - regression_loss: 31.5156 - classification_loss: 583.9788
255/500 [==============>...............] - ETA: 59s - loss: 613.0942 - regression_loss: 31.4022 - classification_loss: 581.6920
256/500 [==============>...............]
- ETA: 59s - loss: 610.7126 - regression_loss: 31.2896 - classification_loss: 579.4230 257/500 [==============>...............] - ETA: 59s - loss: 608.3506 - regression_loss: 31.1790 - classification_loss: 577.1715 258/500 [==============>...............] - ETA: 58s - loss: 606.0060 - regression_loss: 31.0683 - classification_loss: 574.9377 259/500 [==============>...............] - ETA: 58s - loss: 603.6802 - regression_loss: 30.9592 - classification_loss: 572.7210 260/500 [==============>...............] - ETA: 58s - loss: 601.3723 - regression_loss: 30.8505 - classification_loss: 570.5217 261/500 [==============>...............] - ETA: 58s - loss: 599.0815 - regression_loss: 30.7419 - classification_loss: 568.3395 262/500 [==============>...............] - ETA: 57s - loss: 596.8077 - regression_loss: 30.6344 - classification_loss: 566.1733 263/500 [==============>...............] - ETA: 57s - loss: 594.5509 - regression_loss: 30.5277 - classification_loss: 564.0231 264/500 [==============>...............] - ETA: 57s - loss: 592.3137 - regression_loss: 30.4238 - classification_loss: 561.8899 265/500 [==============>...............] - ETA: 57s - loss: 590.0921 - regression_loss: 30.3191 - classification_loss: 559.7729 266/500 [==============>...............] - ETA: 56s - loss: 587.8868 - regression_loss: 30.2144 - classification_loss: 557.6723 267/500 [===============>..............] - ETA: 56s - loss: 585.6979 - regression_loss: 30.1114 - classification_loss: 555.5864 268/500 [===============>..............] - ETA: 56s - loss: 583.5253 - regression_loss: 30.0086 - classification_loss: 553.5165 269/500 [===============>..............] - ETA: 56s - loss: 581.3689 - regression_loss: 29.9067 - classification_loss: 551.4621 270/500 [===============>..............] - ETA: 55s - loss: 579.2298 - regression_loss: 29.8074 - classification_loss: 549.4222 271/500 [===============>..............] 
- ETA: 55s - loss: 577.1052 - regression_loss: 29.7069 - classification_loss: 547.3981 272/500 [===============>..............] - ETA: 55s - loss: 574.9970 - regression_loss: 29.6081 - classification_loss: 545.3887 273/500 [===============>..............] - ETA: 54s - loss: 572.9039 - regression_loss: 29.5099 - classification_loss: 543.3939 274/500 [===============>..............] - ETA: 54s - loss: 570.8258 - regression_loss: 29.4119 - classification_loss: 541.4137 275/500 [===============>..............] - ETA: 54s - loss: 568.7620 - regression_loss: 29.3140 - classification_loss: 539.4478 276/500 [===============>..............] - ETA: 54s - loss: 566.7150 - regression_loss: 29.2180 - classification_loss: 537.4969 277/500 [===============>..............] - ETA: 53s - loss: 564.6815 - regression_loss: 29.1218 - classification_loss: 535.5596 278/500 [===============>..............] - ETA: 53s - loss: 562.6622 - regression_loss: 29.0263 - classification_loss: 533.6359 279/500 [===============>..............] - ETA: 53s - loss: 560.6581 - regression_loss: 28.9319 - classification_loss: 531.7261 280/500 [===============>..............] - ETA: 53s - loss: 558.6689 - regression_loss: 28.8381 - classification_loss: 529.8307 281/500 [===============>..............] - ETA: 52s - loss: 556.6965 - regression_loss: 28.7453 - classification_loss: 527.9510 282/500 [===============>..............] - ETA: 52s - loss: 554.7344 - regression_loss: 28.6528 - classification_loss: 526.0814 283/500 [===============>..............] - ETA: 52s - loss: 552.7871 - regression_loss: 28.5612 - classification_loss: 524.2257 284/500 [================>.............] - ETA: 52s - loss: 550.8523 - regression_loss: 28.4696 - classification_loss: 522.3826 285/500 [================>.............] - ETA: 51s - loss: 548.9309 - regression_loss: 28.3786 - classification_loss: 520.5522 286/500 [================>.............] 
- ETA: 51s - loss: 547.0231 - regression_loss: 28.2883 - classification_loss: 518.7347 287/500 [================>.............] - ETA: 51s - loss: 545.1286 - regression_loss: 28.1986 - classification_loss: 516.9299 288/500 [================>.............] - ETA: 51s - loss: 543.2494 - regression_loss: 28.1108 - classification_loss: 515.1384 289/500 [================>.............] - ETA: 50s - loss: 541.3826 - regression_loss: 28.0230 - classification_loss: 513.3595 290/500 [================>.............] - ETA: 50s - loss: 539.5275 - regression_loss: 27.9350 - classification_loss: 511.5924 291/500 [================>.............] - ETA: 50s - loss: 537.6847 - regression_loss: 27.8477 - classification_loss: 509.8369 292/500 [================>.............] - ETA: 50s - loss: 535.8580 - regression_loss: 27.7616 - classification_loss: 508.0963 293/500 [================>.............] - ETA: 49s - loss: 534.0431 - regression_loss: 27.6766 - classification_loss: 506.3664 294/500 [================>.............] - ETA: 49s - loss: 532.2397 - regression_loss: 27.5920 - classification_loss: 504.6476 295/500 [================>.............] - ETA: 49s - loss: 530.4474 - regression_loss: 27.5074 - classification_loss: 502.9399 296/500 [================>.............] - ETA: 49s - loss: 528.6669 - regression_loss: 27.4235 - classification_loss: 501.2434 297/500 [================>.............] - ETA: 48s - loss: 526.9003 - regression_loss: 27.3415 - classification_loss: 499.5587 298/500 [================>.............] - ETA: 48s - loss: 525.1438 - regression_loss: 27.2585 - classification_loss: 497.8852 299/500 [================>.............] - ETA: 48s - loss: 523.4000 - regression_loss: 27.1768 - classification_loss: 496.2231 300/500 [=================>............] - ETA: 47s - loss: 521.6702 - regression_loss: 27.0980 - classification_loss: 494.5721 301/500 [=================>............] 
- ETA: 47s - loss: 519.9489 - regression_loss: 27.0172 - classification_loss: 492.9317 302/500 [=================>............] - ETA: 47s - loss: 518.2385 - regression_loss: 26.9367 - classification_loss: 491.3018 303/500 [=================>............] - ETA: 47s - loss: 516.5394 - regression_loss: 26.8560 - classification_loss: 489.6834 304/500 [=================>............] - ETA: 46s - loss: 514.8534 - regression_loss: 26.7779 - classification_loss: 488.0756 305/500 [=================>............] - ETA: 46s - loss: 513.1772 - regression_loss: 26.6993 - classification_loss: 486.4780 306/500 [=================>............] - ETA: 46s - loss: 511.5126 - regression_loss: 26.6219 - classification_loss: 484.8907 307/500 [=================>............] - ETA: 46s - loss: 509.8577 - regression_loss: 26.5435 - classification_loss: 483.3142 308/500 [=================>............] - ETA: 45s - loss: 508.2141 - regression_loss: 26.4664 - classification_loss: 481.7478 309/500 [=================>............] - ETA: 45s - loss: 506.5816 - regression_loss: 26.3891 - classification_loss: 480.1925 310/500 [=================>............] - ETA: 45s - loss: 504.9586 - regression_loss: 26.3123 - classification_loss: 478.6464 311/500 [=================>............] - ETA: 45s - loss: 503.3462 - regression_loss: 26.2366 - classification_loss: 477.1097 312/500 [=================>............] - ETA: 44s - loss: 501.7445 - regression_loss: 26.1614 - classification_loss: 475.5831 313/500 [=================>............] - ETA: 44s - loss: 500.1530 - regression_loss: 26.0862 - classification_loss: 474.0669 314/500 [=================>............] - ETA: 44s - loss: 498.5712 - regression_loss: 26.0112 - classification_loss: 472.5600 315/500 [=================>............] - ETA: 44s - loss: 496.9989 - regression_loss: 25.9368 - classification_loss: 471.0622 316/500 [=================>............] 
- ETA: 43s - loss: 495.4373 - regression_loss: 25.8631 - classification_loss: 469.5742 317/500 [==================>...........] - ETA: 43s - loss: 493.8855 - regression_loss: 25.7900 - classification_loss: 468.0957 318/500 [==================>...........] - ETA: 43s - loss: 492.3433 - regression_loss: 25.7175 - classification_loss: 466.6259 319/500 [==================>...........] - ETA: 43s - loss: 490.8117 - regression_loss: 25.6441 - classification_loss: 465.1677 320/500 [==================>...........] - ETA: 42s - loss: 489.2889 - regression_loss: 25.5725 - classification_loss: 463.7165 321/500 [==================>...........] - ETA: 42s - loss: 487.7749 - regression_loss: 25.5010 - classification_loss: 462.2740 322/500 [==================>...........] - ETA: 42s - loss: 486.2700 - regression_loss: 25.4295 - classification_loss: 460.8406 323/500 [==================>...........] - ETA: 42s - loss: 484.7753 - regression_loss: 25.3593 - classification_loss: 459.4160 324/500 [==================>...........] - ETA: 41s - loss: 483.2897 - regression_loss: 25.2895 - classification_loss: 458.0003 325/500 [==================>...........] - ETA: 41s - loss: 481.8130 - regression_loss: 25.2195 - classification_loss: 456.5936 326/500 [==================>...........] - ETA: 41s - loss: 480.3451 - regression_loss: 25.1495 - classification_loss: 455.1957 327/500 [==================>...........] - ETA: 41s - loss: 478.8873 - regression_loss: 25.0811 - classification_loss: 453.8062 328/500 [==================>...........] - ETA: 40s - loss: 477.4373 - regression_loss: 25.0123 - classification_loss: 452.4251 329/500 [==================>...........] - ETA: 40s - loss: 475.9964 - regression_loss: 24.9442 - classification_loss: 451.0522 330/500 [==================>...........] - ETA: 40s - loss: 474.5652 - regression_loss: 24.8771 - classification_loss: 449.6881 331/500 [==================>...........] 
- ETA: 40s - loss: 473.1435 - regression_loss: 24.8106 - classification_loss: 448.3330 332/500 [==================>...........] - ETA: 39s - loss: 471.7287 - regression_loss: 24.7438 - classification_loss: 446.9850 333/500 [==================>...........] - ETA: 39s - loss: 470.3216 - regression_loss: 24.6770 - classification_loss: 445.6447 334/500 [===================>..........] - ETA: 39s - loss: 468.9232 - regression_loss: 24.6106 - classification_loss: 444.3127 335/500 [===================>..........] - ETA: 39s - loss: 467.5340 - regression_loss: 24.5449 - classification_loss: 442.9892 336/500 [===================>..........] - ETA: 38s - loss: 466.1549 - regression_loss: 24.4807 - classification_loss: 441.6743 337/500 [===================>..........] - ETA: 38s - loss: 464.7815 - regression_loss: 24.4159 - classification_loss: 440.3656 338/500 [===================>..........] - ETA: 38s - loss: 463.4160 - regression_loss: 24.3514 - classification_loss: 439.0648 339/500 [===================>..........] - ETA: 38s - loss: 462.0587 - regression_loss: 24.2870 - classification_loss: 437.7718 340/500 [===================>..........] - ETA: 37s - loss: 460.7099 - regression_loss: 24.2236 - classification_loss: 436.4864 341/500 [===================>..........] - ETA: 37s - loss: 459.3686 - regression_loss: 24.1598 - classification_loss: 435.2088 342/500 [===================>..........] - ETA: 37s - loss: 458.0347 - regression_loss: 24.0963 - classification_loss: 433.9384 343/500 [===================>..........] - ETA: 37s - loss: 456.7086 - regression_loss: 24.0333 - classification_loss: 432.6753 344/500 [===================>..........] - ETA: 36s - loss: 455.3920 - regression_loss: 23.9717 - classification_loss: 431.4203 345/500 [===================>..........] - ETA: 36s - loss: 454.0818 - regression_loss: 23.9094 - classification_loss: 430.1723 346/500 [===================>..........] 
- ETA: 36s - loss: 452.7787 - regression_loss: 23.8475 - classification_loss: 428.9312 347/500 [===================>..........] - ETA: 36s - loss: 451.4838 - regression_loss: 23.7864 - classification_loss: 427.6974 348/500 [===================>..........] - ETA: 36s - loss: 450.1967 - regression_loss: 23.7258 - classification_loss: 426.4709 349/500 [===================>..........] - ETA: 35s - loss: 448.9164 - regression_loss: 23.6653 - classification_loss: 425.2511 350/500 [====================>.........] - ETA: 35s - loss: 447.6631 - regression_loss: 23.6064 - classification_loss: 424.0566 351/500 [====================>.........] - ETA: 35s - loss: 446.3976 - regression_loss: 23.5466 - classification_loss: 422.8510 352/500 [====================>.........] - ETA: 35s - loss: 445.1397 - regression_loss: 23.4876 - classification_loss: 421.6521 353/500 [====================>.........] - ETA: 34s - loss: 443.8884 - regression_loss: 23.4286 - classification_loss: 420.4598 354/500 [====================>.........] - ETA: 34s - loss: 442.6441 - regression_loss: 23.3698 - classification_loss: 419.2743 355/500 [====================>.........] - ETA: 34s - loss: 441.4065 - regression_loss: 23.3113 - classification_loss: 418.0952 356/500 [====================>.........] - ETA: 34s - loss: 440.1759 - regression_loss: 23.2531 - classification_loss: 416.9228 357/500 [====================>.........] - ETA: 33s - loss: 438.9521 - regression_loss: 23.1948 - classification_loss: 415.7573 358/500 [====================>.........] - ETA: 33s - loss: 437.7348 - regression_loss: 23.1367 - classification_loss: 414.5980 359/500 [====================>.........] - ETA: 33s - loss: 436.5250 - regression_loss: 23.0791 - classification_loss: 413.4458 360/500 [====================>.........] - ETA: 33s - loss: 435.3218 - regression_loss: 23.0220 - classification_loss: 412.2997 361/500 [====================>.........] 
- ETA: 32s - loss: 434.1265 - regression_loss: 22.9667 - classification_loss: 411.1597 362/500 [====================>.........] - ETA: 32s - loss: 432.9366 - regression_loss: 22.9102 - classification_loss: 410.0263 363/500 [====================>.........] - ETA: 32s - loss: 431.7542 - regression_loss: 22.8549 - classification_loss: 408.8993 364/500 [====================>.........] - ETA: 32s - loss: 430.5773 - regression_loss: 22.7989 - classification_loss: 407.7783 365/500 [====================>.........] - ETA: 31s - loss: 429.4069 - regression_loss: 22.7433 - classification_loss: 406.6635 366/500 [====================>.........] - ETA: 31s - loss: 428.2425 - regression_loss: 22.6879 - classification_loss: 405.5545 367/500 [=====================>........] - ETA: 31s - loss: 427.0846 - regression_loss: 22.6328 - classification_loss: 404.4517 368/500 [=====================>........] - ETA: 31s - loss: 425.9320 - regression_loss: 22.5771 - classification_loss: 403.3548 369/500 [=====================>........] - ETA: 30s - loss: 424.7867 - regression_loss: 22.5227 - classification_loss: 402.2639 370/500 [=====================>........] - ETA: 30s - loss: 423.6478 - regression_loss: 22.4687 - classification_loss: 401.1790 371/500 [=====================>........] - ETA: 30s - loss: 422.5147 - regression_loss: 22.4146 - classification_loss: 400.1000 372/500 [=====================>........] - ETA: 30s - loss: 421.3886 - regression_loss: 22.3609 - classification_loss: 399.0276 373/500 [=====================>........] - ETA: 29s - loss: 420.2690 - regression_loss: 22.3064 - classification_loss: 397.9625 374/500 [=====================>........] - ETA: 29s - loss: 419.1539 - regression_loss: 22.2536 - classification_loss: 396.9001 375/500 [=====================>........] - ETA: 29s - loss: 418.0447 - regression_loss: 22.2009 - classification_loss: 395.8437 376/500 [=====================>........] 
- ETA: 29s - loss: 416.9443 - regression_loss: 22.1495 - classification_loss: 394.7947 377/500 [=====================>........] - ETA: 29s - loss: 415.8469 - regression_loss: 22.0974 - classification_loss: 393.7494 378/500 [=====================>........] - ETA: 28s - loss: 414.7552 - regression_loss: 22.0455 - classification_loss: 392.7096 379/500 [=====================>........] - ETA: 28s - loss: 413.6725 - regression_loss: 21.9961 - classification_loss: 391.6763 380/500 [=====================>........] - ETA: 28s - loss: 412.5926 - regression_loss: 21.9448 - classification_loss: 390.6476 381/500 [=====================>........] - ETA: 28s - loss: 411.5186 - regression_loss: 21.8940 - classification_loss: 389.6245 382/500 [=====================>........] - ETA: 27s - loss: 410.4501 - regression_loss: 21.8433 - classification_loss: 388.6066 383/500 [=====================>........] - ETA: 27s - loss: 409.3870 - regression_loss: 21.7927 - classification_loss: 387.5942 384/500 [======================>.......] - ETA: 27s - loss: 408.3308 - regression_loss: 21.7435 - classification_loss: 386.5872 385/500 [======================>.......] - ETA: 27s - loss: 407.2815 - regression_loss: 21.6944 - classification_loss: 385.5869 386/500 [======================>.......] - ETA: 26s - loss: 406.2351 - regression_loss: 21.6452 - classification_loss: 384.5898 387/500 [======================>.......] - ETA: 26s - loss: 405.1941 - regression_loss: 21.5960 - classification_loss: 383.5980 388/500 [======================>.......] - ETA: 26s - loss: 404.1582 - regression_loss: 21.5470 - classification_loss: 382.6111 389/500 [======================>.......] - ETA: 26s - loss: 403.1275 - regression_loss: 21.4982 - classification_loss: 381.6293 390/500 [======================>.......] - ETA: 25s - loss: 402.1021 - regression_loss: 21.4496 - classification_loss: 380.6524 391/500 [======================>.......] 
- ETA: 25s - loss: 401.0826 - regression_loss: 21.4017 - classification_loss: 379.6808 392/500 [======================>.......] - ETA: 25s - loss: 400.0678 - regression_loss: 21.3534 - classification_loss: 378.7144 393/500 [======================>.......] - ETA: 25s - loss: 399.0588 - regression_loss: 21.3054 - classification_loss: 377.7533 394/500 [======================>.......] - ETA: 24s - loss: 398.0546 - regression_loss: 21.2578 - classification_loss: 376.7966 395/500 [======================>.......] - ETA: 24s - loss: 397.0545 - regression_loss: 21.2096 - classification_loss: 375.8447 396/500 [======================>.......] - ETA: 24s - loss: 396.0610 - regression_loss: 21.1628 - classification_loss: 374.8982 397/500 [======================>.......] - ETA: 24s - loss: 395.0715 - regression_loss: 21.1157 - classification_loss: 373.9557 398/500 [======================>.......] - ETA: 23s - loss: 394.0867 - regression_loss: 21.0688 - classification_loss: 373.0178 399/500 [======================>.......] - ETA: 23s - loss: 393.1099 - regression_loss: 21.0238 - classification_loss: 372.0859 400/500 [=======================>......] - ETA: 23s - loss: 392.1361 - regression_loss: 20.9783 - classification_loss: 371.1576 401/500 [=======================>......] - ETA: 23s - loss: 391.1663 - regression_loss: 20.9323 - classification_loss: 370.2339 402/500 [=======================>......] - ETA: 23s - loss: 390.2013 - regression_loss: 20.8863 - classification_loss: 369.3148 403/500 [=======================>......] - ETA: 22s - loss: 389.2419 - regression_loss: 20.8412 - classification_loss: 368.4006 404/500 [=======================>......] - ETA: 22s - loss: 388.2867 - regression_loss: 20.7958 - classification_loss: 367.4908 405/500 [=======================>......] - ETA: 22s - loss: 387.3359 - regression_loss: 20.7506 - classification_loss: 366.5851 406/500 [=======================>......] 
- ETA: 22s - loss: 386.3916 - regression_loss: 20.7058 - classification_loss: 365.6857 407/500 [=======================>......] - ETA: 21s - loss: 385.4500 - regression_loss: 20.6609 - classification_loss: 364.7890 408/500 [=======================>......] - ETA: 21s - loss: 384.5134 - regression_loss: 20.6164 - classification_loss: 363.8969 409/500 [=======================>......] - ETA: 21s - loss: 383.5812 - regression_loss: 20.5722 - classification_loss: 363.0089 410/500 [=======================>......] - ETA: 21s - loss: 382.6563 - regression_loss: 20.5283 - classification_loss: 362.1279 411/500 [=======================>......] - ETA: 20s - loss: 381.7329 - regression_loss: 20.4843 - classification_loss: 361.2485 412/500 [=======================>......] - ETA: 20s - loss: 380.8151 - regression_loss: 20.4409 - classification_loss: 360.3740 413/500 [=======================>......] - ETA: 20s - loss: 379.9050 - regression_loss: 20.3987 - classification_loss: 359.5061 414/500 [=======================>......] - ETA: 20s - loss: 378.9953 - regression_loss: 20.3555 - classification_loss: 358.6397 415/500 [=======================>......] - ETA: 19s - loss: 378.0898 - regression_loss: 20.3124 - classification_loss: 357.7773 416/500 [=======================>......] - ETA: 19s - loss: 377.1890 - regression_loss: 20.2698 - classification_loss: 356.9191 417/500 [========================>.....] - ETA: 19s - loss: 376.2941 - regression_loss: 20.2277 - classification_loss: 356.0663 418/500 [========================>.....] - ETA: 19s - loss: 375.4030 - regression_loss: 20.1863 - classification_loss: 355.2165 419/500 [========================>.....] - ETA: 18s - loss: 374.5148 - regression_loss: 20.1441 - classification_loss: 354.3706 420/500 [========================>.....] - ETA: 18s - loss: 373.6310 - regression_loss: 20.1020 - classification_loss: 353.5289 421/500 [========================>.....] 
- ETA: 18s - loss: 372.7523 - regression_loss: 20.0612 - classification_loss: 352.6909 422/500 [========================>.....] - ETA: 18s - loss: 371.8768 - regression_loss: 20.0195 - classification_loss: 351.8571 423/500 [========================>.....] - ETA: 18s - loss: 371.0054 - regression_loss: 19.9782 - classification_loss: 351.0270 424/500 [========================>.....] - ETA: 17s - loss: 370.1380 - regression_loss: 19.9369 - classification_loss: 350.2009 425/500 [========================>.....] - ETA: 17s - loss: 369.2748 - regression_loss: 19.8958 - classification_loss: 349.3788 426/500 [========================>.....] - ETA: 17s - loss: 368.4162 - regression_loss: 19.8555 - classification_loss: 348.5605 427/500 [========================>.....] - ETA: 17s - loss: 367.5622 - regression_loss: 19.8158 - classification_loss: 347.7462 428/500 [========================>.....] - ETA: 16s - loss: 366.7106 - regression_loss: 19.7751 - classification_loss: 346.9353 429/500 [========================>.....] - ETA: 16s - loss: 365.8635 - regression_loss: 19.7353 - classification_loss: 346.1281 430/500 [========================>.....] - ETA: 16s - loss: 365.0203 - regression_loss: 19.6952 - classification_loss: 345.3250 431/500 [========================>.....] - ETA: 16s - loss: 364.1809 - regression_loss: 19.6552 - classification_loss: 344.5255 432/500 [========================>.....] - ETA: 15s - loss: 363.3459 - regression_loss: 19.6158 - classification_loss: 343.7300 433/500 [========================>.....] - ETA: 15s - loss: 362.5142 - regression_loss: 19.5762 - classification_loss: 342.9379 434/500 [=========================>....] - ETA: 15s - loss: 361.6888 - regression_loss: 19.5383 - classification_loss: 342.1504 435/500 [=========================>....] - ETA: 15s - loss: 360.8651 - regression_loss: 19.4991 - classification_loss: 341.3658 436/500 [=========================>....] 
- ETA: 14s - loss: 360.0460 - regression_loss: 19.4603 - classification_loss: 340.5855 437/500 [=========================>....] - ETA: 14s - loss: 359.2300 - regression_loss: 19.4220 - classification_loss: 339.8079 438/500 [=========================>....] - ETA: 14s - loss: 358.4172 - regression_loss: 19.3834 - classification_loss: 339.0336 439/500 [=========================>....] - ETA: 14s - loss: 357.6085 - regression_loss: 19.3454 - classification_loss: 338.2629 440/500 [=========================>....] - ETA: 13s - loss: 356.8031 - regression_loss: 19.3066 - classification_loss: 337.4963 441/500 [=========================>....] - ETA: 13s - loss: 356.0022 - regression_loss: 19.2692 - classification_loss: 336.7328 442/500 [=========================>....] - ETA: 13s - loss: 355.2063 - regression_loss: 19.2330 - classification_loss: 335.9732 443/500 [=========================>....] - ETA: 13s - loss: 354.4122 - regression_loss: 19.1954 - classification_loss: 335.2166 444/500 [=========================>....] - ETA: 13s - loss: 353.6213 - regression_loss: 19.1577 - classification_loss: 334.4634 445/500 [=========================>....] - ETA: 12s - loss: 352.8339 - regression_loss: 19.1202 - classification_loss: 333.7136 446/500 [=========================>....] - ETA: 12s - loss: 352.0506 - regression_loss: 19.0831 - classification_loss: 332.9673 447/500 [=========================>....] - ETA: 12s - loss: 351.2704 - regression_loss: 19.0462 - classification_loss: 332.2241 448/500 [=========================>....] - ETA: 12s - loss: 350.4946 - regression_loss: 19.0100 - classification_loss: 331.4843 449/500 [=========================>....] - ETA: 11s - loss: 349.7207 - regression_loss: 18.9730 - classification_loss: 330.7475 450/500 [==========================>...] - ETA: 11s - loss: 348.9506 - regression_loss: 18.9362 - classification_loss: 330.0142 451/500 [==========================>...] 
(per-batch progress lines for epoch 1, batches 452-499, elided; loss declined from ~348 to ~315)
500/500 [==============================] - 116s 232ms/step - loss: 314.3984 - regression_loss: 17.3006 - classification_loss: 297.0975
1172 instances of class plum with average precision: 0.0007
mAP: 0.0007
Epoch 00001: saving model to ./training/snapshots/resnet50_pascal_01.h5
Epoch 2/150
  1/500 [..............................] - ETA: 1:57 - loss: 3.1216 - regression_loss: 2.4250 - classification_loss: 0.6966
(per-batch progress lines for epoch 2, batches 2-280, elided; classification loss spiked to ~2.2 around batch 13, then fell below 1.0)
281/500 [===============>..............] - ETA: 48s - loss: 3.4541 - regression_loss: 2.5494 - classification_loss: 0.9047
282/500 [===============>..............]
- ETA: 48s - loss: 3.4530 - regression_loss: 2.5492 - classification_loss: 0.9038 283/500 [===============>..............] - ETA: 48s - loss: 3.4516 - regression_loss: 2.5489 - classification_loss: 0.9027 284/500 [================>.............] - ETA: 47s - loss: 3.4562 - regression_loss: 2.5500 - classification_loss: 0.9062 285/500 [================>.............] - ETA: 47s - loss: 3.4545 - regression_loss: 2.5495 - classification_loss: 0.9050 286/500 [================>.............] - ETA: 47s - loss: 3.4528 - regression_loss: 2.5490 - classification_loss: 0.9039 287/500 [================>.............] - ETA: 47s - loss: 3.4526 - regression_loss: 2.5491 - classification_loss: 0.9035 288/500 [================>.............] - ETA: 47s - loss: 3.4516 - regression_loss: 2.5488 - classification_loss: 0.9028 289/500 [================>.............] - ETA: 46s - loss: 3.4615 - regression_loss: 2.5497 - classification_loss: 0.9117 290/500 [================>.............] - ETA: 46s - loss: 3.4660 - regression_loss: 2.5488 - classification_loss: 0.9172 291/500 [================>.............] - ETA: 46s - loss: 3.4645 - regression_loss: 2.5483 - classification_loss: 0.9162 292/500 [================>.............] - ETA: 46s - loss: 3.4648 - regression_loss: 2.5494 - classification_loss: 0.9154 293/500 [================>.............] - ETA: 46s - loss: 3.4640 - regression_loss: 2.5495 - classification_loss: 0.9145 294/500 [================>.............] - ETA: 45s - loss: 3.4624 - regression_loss: 2.5491 - classification_loss: 0.9133 295/500 [================>.............] - ETA: 45s - loss: 3.4616 - regression_loss: 2.5491 - classification_loss: 0.9124 296/500 [================>.............] - ETA: 45s - loss: 3.4606 - regression_loss: 2.5488 - classification_loss: 0.9118 297/500 [================>.............] - ETA: 45s - loss: 3.4600 - regression_loss: 2.5484 - classification_loss: 0.9116 298/500 [================>.............] 
- ETA: 45s - loss: 3.4592 - regression_loss: 2.5481 - classification_loss: 0.9111 299/500 [================>.............] - ETA: 44s - loss: 3.4583 - regression_loss: 2.5482 - classification_loss: 0.9101 300/500 [=================>............] - ETA: 44s - loss: 3.4578 - regression_loss: 2.5485 - classification_loss: 0.9093 301/500 [=================>............] - ETA: 44s - loss: 3.4600 - regression_loss: 2.5507 - classification_loss: 0.9094 302/500 [=================>............] - ETA: 44s - loss: 3.4590 - regression_loss: 2.5502 - classification_loss: 0.9088 303/500 [=================>............] - ETA: 44s - loss: 3.4577 - regression_loss: 2.5496 - classification_loss: 0.9081 304/500 [=================>............] - ETA: 43s - loss: 3.4600 - regression_loss: 2.5504 - classification_loss: 0.9097 305/500 [=================>............] - ETA: 43s - loss: 3.4595 - regression_loss: 2.5504 - classification_loss: 0.9092 306/500 [=================>............] - ETA: 43s - loss: 3.4582 - regression_loss: 2.5500 - classification_loss: 0.9082 307/500 [=================>............] - ETA: 43s - loss: 3.4577 - regression_loss: 2.5500 - classification_loss: 0.9077 308/500 [=================>............] - ETA: 42s - loss: 3.4568 - regression_loss: 2.5496 - classification_loss: 0.9072 309/500 [=================>............] - ETA: 42s - loss: 3.4569 - regression_loss: 2.5507 - classification_loss: 0.9062 310/500 [=================>............] - ETA: 42s - loss: 3.4575 - regression_loss: 2.5508 - classification_loss: 0.9067 311/500 [=================>............] - ETA: 42s - loss: 3.4560 - regression_loss: 2.5503 - classification_loss: 0.9057 312/500 [=================>............] - ETA: 42s - loss: 3.4549 - regression_loss: 2.5499 - classification_loss: 0.9050 313/500 [=================>............] - ETA: 41s - loss: 3.4532 - regression_loss: 2.5492 - classification_loss: 0.9040 314/500 [=================>............] 
- ETA: 41s - loss: 3.4522 - regression_loss: 2.5486 - classification_loss: 0.9036 315/500 [=================>............] - ETA: 41s - loss: 3.4519 - regression_loss: 2.5489 - classification_loss: 0.9029 316/500 [=================>............] - ETA: 41s - loss: 3.4515 - regression_loss: 2.5488 - classification_loss: 0.9027 317/500 [==================>...........] - ETA: 41s - loss: 3.4506 - regression_loss: 2.5484 - classification_loss: 0.9023 318/500 [==================>...........] - ETA: 40s - loss: 3.4495 - regression_loss: 2.5477 - classification_loss: 0.9018 319/500 [==================>...........] - ETA: 40s - loss: 3.4492 - regression_loss: 2.5474 - classification_loss: 0.9018 320/500 [==================>...........] - ETA: 40s - loss: 3.4483 - regression_loss: 2.5471 - classification_loss: 0.9012 321/500 [==================>...........] - ETA: 40s - loss: 3.4477 - regression_loss: 2.5470 - classification_loss: 0.9007 322/500 [==================>...........] - ETA: 40s - loss: 3.4470 - regression_loss: 2.5468 - classification_loss: 0.9002 323/500 [==================>...........] - ETA: 39s - loss: 3.4463 - regression_loss: 2.5465 - classification_loss: 0.8998 324/500 [==================>...........] - ETA: 39s - loss: 3.4454 - regression_loss: 2.5463 - classification_loss: 0.8991 325/500 [==================>...........] - ETA: 39s - loss: 3.4461 - regression_loss: 2.5466 - classification_loss: 0.8995 326/500 [==================>...........] - ETA: 39s - loss: 3.4456 - regression_loss: 2.5466 - classification_loss: 0.8990 327/500 [==================>...........] - ETA: 39s - loss: 3.4446 - regression_loss: 2.5464 - classification_loss: 0.8983 328/500 [==================>...........] - ETA: 38s - loss: 3.4460 - regression_loss: 2.5464 - classification_loss: 0.8995 329/500 [==================>...........] - ETA: 38s - loss: 3.4453 - regression_loss: 2.5466 - classification_loss: 0.8987 330/500 [==================>...........] 
- ETA: 38s - loss: 3.4444 - regression_loss: 2.5463 - classification_loss: 0.8980 331/500 [==================>...........] - ETA: 38s - loss: 3.4440 - regression_loss: 2.5465 - classification_loss: 0.8974 332/500 [==================>...........] - ETA: 37s - loss: 3.4457 - regression_loss: 2.5460 - classification_loss: 0.8997 333/500 [==================>...........] - ETA: 37s - loss: 3.4450 - regression_loss: 2.5452 - classification_loss: 0.8998 334/500 [===================>..........] - ETA: 37s - loss: 3.4439 - regression_loss: 2.5450 - classification_loss: 0.8989 335/500 [===================>..........] - ETA: 37s - loss: 3.4428 - regression_loss: 2.5448 - classification_loss: 0.8980 336/500 [===================>..........] - ETA: 37s - loss: 3.4423 - regression_loss: 2.5448 - classification_loss: 0.8975 337/500 [===================>..........] - ETA: 36s - loss: 3.4413 - regression_loss: 2.5443 - classification_loss: 0.8970 338/500 [===================>..........] - ETA: 36s - loss: 3.4407 - regression_loss: 2.5443 - classification_loss: 0.8965 339/500 [===================>..........] - ETA: 36s - loss: 3.4395 - regression_loss: 2.5440 - classification_loss: 0.8955 340/500 [===================>..........] - ETA: 36s - loss: 3.4386 - regression_loss: 2.5439 - classification_loss: 0.8946 341/500 [===================>..........] - ETA: 36s - loss: 3.4384 - regression_loss: 2.5445 - classification_loss: 0.8939 342/500 [===================>..........] - ETA: 35s - loss: 3.4376 - regression_loss: 2.5442 - classification_loss: 0.8934 343/500 [===================>..........] - ETA: 35s - loss: 3.4363 - regression_loss: 2.5439 - classification_loss: 0.8924 344/500 [===================>..........] - ETA: 35s - loss: 3.4372 - regression_loss: 2.5440 - classification_loss: 0.8931 345/500 [===================>..........] - ETA: 35s - loss: 3.4364 - regression_loss: 2.5437 - classification_loss: 0.8927 346/500 [===================>..........] 
- ETA: 34s - loss: 3.4357 - regression_loss: 2.5435 - classification_loss: 0.8922 347/500 [===================>..........] - ETA: 34s - loss: 3.4343 - regression_loss: 2.5432 - classification_loss: 0.8911 348/500 [===================>..........] - ETA: 34s - loss: 3.4332 - regression_loss: 2.5428 - classification_loss: 0.8904 349/500 [===================>..........] - ETA: 34s - loss: 3.4324 - regression_loss: 2.5426 - classification_loss: 0.8898 350/500 [====================>.........] - ETA: 34s - loss: 3.4317 - regression_loss: 2.5424 - classification_loss: 0.8894 351/500 [====================>.........] - ETA: 33s - loss: 3.4308 - regression_loss: 2.5421 - classification_loss: 0.8887 352/500 [====================>.........] - ETA: 33s - loss: 3.4293 - regression_loss: 2.5418 - classification_loss: 0.8876 353/500 [====================>.........] - ETA: 33s - loss: 3.4309 - regression_loss: 2.5436 - classification_loss: 0.8874 354/500 [====================>.........] - ETA: 33s - loss: 3.4297 - regression_loss: 2.5431 - classification_loss: 0.8866 355/500 [====================>.........] - ETA: 32s - loss: 3.4290 - regression_loss: 2.5428 - classification_loss: 0.8862 356/500 [====================>.........] - ETA: 32s - loss: 3.4282 - regression_loss: 2.5425 - classification_loss: 0.8857 357/500 [====================>.........] - ETA: 32s - loss: 3.4278 - regression_loss: 2.5425 - classification_loss: 0.8852 358/500 [====================>.........] - ETA: 32s - loss: 3.4276 - regression_loss: 2.5422 - classification_loss: 0.8854 359/500 [====================>.........] - ETA: 32s - loss: 3.4268 - regression_loss: 2.5419 - classification_loss: 0.8849 360/500 [====================>.........] - ETA: 31s - loss: 3.4265 - regression_loss: 2.5422 - classification_loss: 0.8842 361/500 [====================>.........] - ETA: 31s - loss: 3.4255 - regression_loss: 2.5418 - classification_loss: 0.8838 362/500 [====================>.........] 
- ETA: 31s - loss: 3.4248 - regression_loss: 2.5415 - classification_loss: 0.8833 363/500 [====================>.........] - ETA: 31s - loss: 3.4232 - regression_loss: 2.5404 - classification_loss: 0.8828 364/500 [====================>.........] - ETA: 31s - loss: 3.4237 - regression_loss: 2.5405 - classification_loss: 0.8831 365/500 [====================>.........] - ETA: 30s - loss: 3.4232 - regression_loss: 2.5408 - classification_loss: 0.8824 366/500 [====================>.........] - ETA: 30s - loss: 3.4222 - regression_loss: 2.5406 - classification_loss: 0.8816 367/500 [=====================>........] - ETA: 30s - loss: 3.4219 - regression_loss: 2.5405 - classification_loss: 0.8814 368/500 [=====================>........] - ETA: 30s - loss: 3.4209 - regression_loss: 2.5402 - classification_loss: 0.8807 369/500 [=====================>........] - ETA: 29s - loss: 3.4202 - regression_loss: 2.5398 - classification_loss: 0.8803 370/500 [=====================>........] - ETA: 29s - loss: 3.4189 - regression_loss: 2.5394 - classification_loss: 0.8794 371/500 [=====================>........] - ETA: 29s - loss: 3.4178 - regression_loss: 2.5390 - classification_loss: 0.8788 372/500 [=====================>........] - ETA: 29s - loss: 3.4182 - regression_loss: 2.5391 - classification_loss: 0.8791 373/500 [=====================>........] - ETA: 29s - loss: 3.4179 - regression_loss: 2.5392 - classification_loss: 0.8787 374/500 [=====================>........] - ETA: 28s - loss: 3.4244 - regression_loss: 2.5397 - classification_loss: 0.8847 375/500 [=====================>........] - ETA: 28s - loss: 3.4234 - regression_loss: 2.5394 - classification_loss: 0.8840 376/500 [=====================>........] - ETA: 28s - loss: 3.4237 - regression_loss: 2.5400 - classification_loss: 0.8837 377/500 [=====================>........] - ETA: 28s - loss: 3.4228 - regression_loss: 2.5398 - classification_loss: 0.8830 378/500 [=====================>........] 
- ETA: 27s - loss: 3.4216 - regression_loss: 2.5393 - classification_loss: 0.8823 379/500 [=====================>........] - ETA: 27s - loss: 3.4206 - regression_loss: 2.5390 - classification_loss: 0.8816 380/500 [=====================>........] - ETA: 27s - loss: 3.4201 - regression_loss: 2.5389 - classification_loss: 0.8812 381/500 [=====================>........] - ETA: 27s - loss: 3.4201 - regression_loss: 2.5391 - classification_loss: 0.8810 382/500 [=====================>........] - ETA: 27s - loss: 3.4193 - regression_loss: 2.5390 - classification_loss: 0.8803 383/500 [=====================>........] - ETA: 26s - loss: 3.4188 - regression_loss: 2.5393 - classification_loss: 0.8795 384/500 [======================>.......] - ETA: 26s - loss: 3.4182 - regression_loss: 2.5390 - classification_loss: 0.8792 385/500 [======================>.......] - ETA: 26s - loss: 3.4191 - regression_loss: 2.5387 - classification_loss: 0.8804 386/500 [======================>.......] - ETA: 26s - loss: 3.4182 - regression_loss: 2.5381 - classification_loss: 0.8801 387/500 [======================>.......] - ETA: 25s - loss: 3.4179 - regression_loss: 2.5380 - classification_loss: 0.8799 388/500 [======================>.......] - ETA: 25s - loss: 3.4175 - regression_loss: 2.5377 - classification_loss: 0.8797 389/500 [======================>.......] - ETA: 25s - loss: 3.4171 - regression_loss: 2.5376 - classification_loss: 0.8794 390/500 [======================>.......] - ETA: 25s - loss: 3.4163 - regression_loss: 2.5378 - classification_loss: 0.8785 391/500 [======================>.......] - ETA: 24s - loss: 3.4159 - regression_loss: 2.5381 - classification_loss: 0.8779 392/500 [======================>.......] - ETA: 24s - loss: 3.4167 - regression_loss: 2.5388 - classification_loss: 0.8779 393/500 [======================>.......] - ETA: 24s - loss: 3.4191 - regression_loss: 2.5396 - classification_loss: 0.8795 394/500 [======================>.......] 
- ETA: 24s - loss: 3.4182 - regression_loss: 2.5392 - classification_loss: 0.8791 395/500 [======================>.......] - ETA: 24s - loss: 3.4183 - regression_loss: 2.5397 - classification_loss: 0.8785 396/500 [======================>.......] - ETA: 23s - loss: 3.4176 - regression_loss: 2.5395 - classification_loss: 0.8781 397/500 [======================>.......] - ETA: 23s - loss: 3.4181 - regression_loss: 2.5404 - classification_loss: 0.8778 398/500 [======================>.......] - ETA: 23s - loss: 3.4185 - regression_loss: 2.5405 - classification_loss: 0.8780 399/500 [======================>.......] - ETA: 23s - loss: 3.4199 - regression_loss: 2.5412 - classification_loss: 0.8787 400/500 [=======================>......] - ETA: 22s - loss: 3.4187 - regression_loss: 2.5408 - classification_loss: 0.8778 401/500 [=======================>......] - ETA: 22s - loss: 3.4176 - regression_loss: 2.5402 - classification_loss: 0.8774 402/500 [=======================>......] - ETA: 22s - loss: 3.4165 - regression_loss: 2.5398 - classification_loss: 0.8768 403/500 [=======================>......] - ETA: 22s - loss: 3.4162 - regression_loss: 2.5402 - classification_loss: 0.8761 404/500 [=======================>......] - ETA: 22s - loss: 3.4158 - regression_loss: 2.5402 - classification_loss: 0.8756 405/500 [=======================>......] - ETA: 21s - loss: 3.4154 - regression_loss: 2.5402 - classification_loss: 0.8752 406/500 [=======================>......] - ETA: 21s - loss: 3.4147 - regression_loss: 2.5398 - classification_loss: 0.8748 407/500 [=======================>......] - ETA: 21s - loss: 3.4142 - regression_loss: 2.5397 - classification_loss: 0.8745 408/500 [=======================>......] - ETA: 21s - loss: 3.4141 - regression_loss: 2.5400 - classification_loss: 0.8741 409/500 [=======================>......] - ETA: 20s - loss: 3.4138 - regression_loss: 2.5397 - classification_loss: 0.8741 410/500 [=======================>......] 
- ETA: 20s - loss: 3.4129 - regression_loss: 2.5394 - classification_loss: 0.8735 411/500 [=======================>......] - ETA: 20s - loss: 3.4122 - regression_loss: 2.5393 - classification_loss: 0.8729 412/500 [=======================>......] - ETA: 20s - loss: 3.4121 - regression_loss: 2.5386 - classification_loss: 0.8735 413/500 [=======================>......] - ETA: 20s - loss: 3.4126 - regression_loss: 2.5390 - classification_loss: 0.8736 414/500 [=======================>......] - ETA: 19s - loss: 3.4120 - regression_loss: 2.5390 - classification_loss: 0.8730 415/500 [=======================>......] - ETA: 19s - loss: 3.4116 - regression_loss: 2.5388 - classification_loss: 0.8728 416/500 [=======================>......] - ETA: 19s - loss: 3.4111 - regression_loss: 2.5388 - classification_loss: 0.8723 417/500 [========================>.....] - ETA: 19s - loss: 3.4106 - regression_loss: 2.5386 - classification_loss: 0.8720 418/500 [========================>.....] - ETA: 18s - loss: 3.4098 - regression_loss: 2.5384 - classification_loss: 0.8714 419/500 [========================>.....] - ETA: 18s - loss: 3.4089 - regression_loss: 2.5382 - classification_loss: 0.8707 420/500 [========================>.....] - ETA: 18s - loss: 3.4083 - regression_loss: 2.5382 - classification_loss: 0.8700 421/500 [========================>.....] - ETA: 18s - loss: 3.4086 - regression_loss: 2.5388 - classification_loss: 0.8698 422/500 [========================>.....] - ETA: 18s - loss: 3.4082 - regression_loss: 2.5387 - classification_loss: 0.8695 423/500 [========================>.....] - ETA: 17s - loss: 3.4073 - regression_loss: 2.5386 - classification_loss: 0.8687 424/500 [========================>.....] - ETA: 17s - loss: 3.4068 - regression_loss: 2.5388 - classification_loss: 0.8680 425/500 [========================>.....] - ETA: 17s - loss: 3.4065 - regression_loss: 2.5380 - classification_loss: 0.8685 426/500 [========================>.....] 
- ETA: 17s - loss: 3.4055 - regression_loss: 2.5377 - classification_loss: 0.8678 427/500 [========================>.....] - ETA: 16s - loss: 3.4049 - regression_loss: 2.5374 - classification_loss: 0.8674 428/500 [========================>.....] - ETA: 16s - loss: 3.4050 - regression_loss: 2.5371 - classification_loss: 0.8678 429/500 [========================>.....] - ETA: 16s - loss: 3.4049 - regression_loss: 2.5372 - classification_loss: 0.8677 430/500 [========================>.....] - ETA: 16s - loss: 3.4053 - regression_loss: 2.5372 - classification_loss: 0.8681 431/500 [========================>.....] - ETA: 15s - loss: 3.4047 - regression_loss: 2.5369 - classification_loss: 0.8678 432/500 [========================>.....] - ETA: 15s - loss: 3.4040 - regression_loss: 2.5368 - classification_loss: 0.8672 433/500 [========================>.....] - ETA: 15s - loss: 3.4035 - regression_loss: 2.5367 - classification_loss: 0.8669 434/500 [=========================>....] - ETA: 15s - loss: 3.4036 - regression_loss: 2.5369 - classification_loss: 0.8667 435/500 [=========================>....] - ETA: 15s - loss: 3.4027 - regression_loss: 2.5365 - classification_loss: 0.8662 436/500 [=========================>....] - ETA: 14s - loss: 3.4020 - regression_loss: 2.5363 - classification_loss: 0.8657 437/500 [=========================>....] - ETA: 14s - loss: 3.4026 - regression_loss: 2.5369 - classification_loss: 0.8657 438/500 [=========================>....] - ETA: 14s - loss: 3.4019 - regression_loss: 2.5364 - classification_loss: 0.8655 439/500 [=========================>....] - ETA: 14s - loss: 3.4013 - regression_loss: 2.5362 - classification_loss: 0.8650 440/500 [=========================>....] - ETA: 13s - loss: 3.4014 - regression_loss: 2.5372 - classification_loss: 0.8642 441/500 [=========================>....] - ETA: 13s - loss: 3.4011 - regression_loss: 2.5369 - classification_loss: 0.8642 442/500 [=========================>....] 
- ETA: 13s - loss: 3.4016 - regression_loss: 2.5377 - classification_loss: 0.8639 443/500 [=========================>....] - ETA: 13s - loss: 3.4010 - regression_loss: 2.5376 - classification_loss: 0.8634 444/500 [=========================>....] - ETA: 12s - loss: 3.4018 - regression_loss: 2.5385 - classification_loss: 0.8633 445/500 [=========================>....] - ETA: 12s - loss: 3.4013 - regression_loss: 2.5385 - classification_loss: 0.8628 446/500 [=========================>....] - ETA: 12s - loss: 3.4027 - regression_loss: 2.5383 - classification_loss: 0.8644 447/500 [=========================>....] - ETA: 12s - loss: 3.4022 - regression_loss: 2.5380 - classification_loss: 0.8642 448/500 [=========================>....] - ETA: 12s - loss: 3.4015 - regression_loss: 2.5382 - classification_loss: 0.8633 449/500 [=========================>....] - ETA: 11s - loss: 3.4009 - regression_loss: 2.5379 - classification_loss: 0.8631 450/500 [==========================>...] - ETA: 11s - loss: 3.4006 - regression_loss: 2.5377 - classification_loss: 0.8629 451/500 [==========================>...] - ETA: 11s - loss: 3.4003 - regression_loss: 2.5376 - classification_loss: 0.8627 452/500 [==========================>...] - ETA: 11s - loss: 3.4000 - regression_loss: 2.5375 - classification_loss: 0.8625 453/500 [==========================>...] - ETA: 10s - loss: 3.4000 - regression_loss: 2.5377 - classification_loss: 0.8623 454/500 [==========================>...] - ETA: 10s - loss: 3.4008 - regression_loss: 2.5384 - classification_loss: 0.8623 455/500 [==========================>...] - ETA: 10s - loss: 3.4003 - regression_loss: 2.5383 - classification_loss: 0.8620 456/500 [==========================>...] - ETA: 10s - loss: 3.4000 - regression_loss: 2.5381 - classification_loss: 0.8619 457/500 [==========================>...] - ETA: 9s - loss: 3.3997 - regression_loss: 2.5380 - classification_loss: 0.8617  458/500 [==========================>...] 
- ETA: 9s - loss: 3.4002 - regression_loss: 2.5389 - classification_loss: 0.8614 459/500 [==========================>...] - ETA: 9s - loss: 3.4002 - regression_loss: 2.5391 - classification_loss: 0.8611 460/500 [==========================>...] - ETA: 9s - loss: 3.3997 - regression_loss: 2.5389 - classification_loss: 0.8607 461/500 [==========================>...] - ETA: 9s - loss: 3.3993 - regression_loss: 2.5387 - classification_loss: 0.8606 462/500 [==========================>...] - ETA: 8s - loss: 3.3990 - regression_loss: 2.5392 - classification_loss: 0.8598 463/500 [==========================>...] - ETA: 8s - loss: 3.3985 - regression_loss: 2.5390 - classification_loss: 0.8594 464/500 [==========================>...] - ETA: 8s - loss: 3.3985 - regression_loss: 2.5396 - classification_loss: 0.8590 465/500 [==========================>...] - ETA: 8s - loss: 3.3993 - regression_loss: 2.5404 - classification_loss: 0.8589 466/500 [==========================>...] - ETA: 7s - loss: 3.3984 - regression_loss: 2.5398 - classification_loss: 0.8586 467/500 [===========================>..] - ETA: 7s - loss: 3.3985 - regression_loss: 2.5399 - classification_loss: 0.8586 468/500 [===========================>..] - ETA: 7s - loss: 3.3976 - regression_loss: 2.5395 - classification_loss: 0.8581 469/500 [===========================>..] - ETA: 7s - loss: 3.3976 - regression_loss: 2.5398 - classification_loss: 0.8578 470/500 [===========================>..] - ETA: 6s - loss: 3.3968 - regression_loss: 2.5394 - classification_loss: 0.8573 471/500 [===========================>..] - ETA: 6s - loss: 3.3959 - regression_loss: 2.5391 - classification_loss: 0.8568 472/500 [===========================>..] - ETA: 6s - loss: 3.3970 - regression_loss: 2.5391 - classification_loss: 0.8579 473/500 [===========================>..] - ETA: 6s - loss: 3.4001 - regression_loss: 2.5394 - classification_loss: 0.8606 474/500 [===========================>..] 
- ETA: 6s - loss: 3.3995 - regression_loss: 2.5393 - classification_loss: 0.8602 475/500 [===========================>..] - ETA: 5s - loss: 3.4004 - regression_loss: 2.5398 - classification_loss: 0.8606 476/500 [===========================>..] - ETA: 5s - loss: 3.3996 - regression_loss: 2.5395 - classification_loss: 0.8601 477/500 [===========================>..] - ETA: 5s - loss: 3.3995 - regression_loss: 2.5395 - classification_loss: 0.8600 478/500 [===========================>..] - ETA: 5s - loss: 3.3992 - regression_loss: 2.5399 - classification_loss: 0.8593 479/500 [===========================>..] - ETA: 4s - loss: 3.3990 - regression_loss: 2.5399 - classification_loss: 0.8591 480/500 [===========================>..] - ETA: 4s - loss: 3.3985 - regression_loss: 2.5398 - classification_loss: 0.8587 481/500 [===========================>..] - ETA: 4s - loss: 3.3980 - regression_loss: 2.5397 - classification_loss: 0.8584 482/500 [===========================>..] - ETA: 4s - loss: 3.3972 - regression_loss: 2.5394 - classification_loss: 0.8578 483/500 [===========================>..] - ETA: 3s - loss: 3.3969 - regression_loss: 2.5393 - classification_loss: 0.8576 484/500 [============================>.] - ETA: 3s - loss: 3.3965 - regression_loss: 2.5391 - classification_loss: 0.8574 485/500 [============================>.] - ETA: 3s - loss: 3.3960 - regression_loss: 2.5387 - classification_loss: 0.8572 486/500 [============================>.] - ETA: 3s - loss: 3.3962 - regression_loss: 2.5385 - classification_loss: 0.8577 487/500 [============================>.] - ETA: 3s - loss: 3.3957 - regression_loss: 2.5382 - classification_loss: 0.8575 488/500 [============================>.] - ETA: 2s - loss: 3.3960 - regression_loss: 2.5384 - classification_loss: 0.8576 489/500 [============================>.] - ETA: 2s - loss: 3.3957 - regression_loss: 2.5384 - classification_loss: 0.8573 490/500 [============================>.] 
500/500 [==============================] - 117s 234ms/step - loss: 3.3915 - regression_loss: 2.5370 - classification_loss: 0.8546
1172 instances of class plum with average precision: 0.0052
mAP: 0.0052
Epoch 00002: saving model to ./training/snapshots/resnet50_pascal_02.h5
Epoch 3/150
[per-step progress-bar output truncated: through step 325/500 the running loss moved from 3.0881 (step 1) to 3.2787, with regression_loss ~2.50 and classification_loss ~0.78; ETA fell from 2:01 to 0:44]
- ETA: 44s - loss: 3.2783 - regression_loss: 2.5020 - classification_loss: 0.7763 326/500 [==================>...........] - ETA: 43s - loss: 3.2785 - regression_loss: 2.5029 - classification_loss: 0.7755 327/500 [==================>...........] - ETA: 43s - loss: 3.2780 - regression_loss: 2.5026 - classification_loss: 0.7754 328/500 [==================>...........] - ETA: 43s - loss: 3.2777 - regression_loss: 2.5025 - classification_loss: 0.7752 329/500 [==================>...........] - ETA: 43s - loss: 3.2776 - regression_loss: 2.5026 - classification_loss: 0.7750 330/500 [==================>...........] - ETA: 42s - loss: 3.2766 - regression_loss: 2.5021 - classification_loss: 0.7745 331/500 [==================>...........] - ETA: 42s - loss: 3.2758 - regression_loss: 2.5018 - classification_loss: 0.7740 332/500 [==================>...........] - ETA: 42s - loss: 3.2749 - regression_loss: 2.5015 - classification_loss: 0.7735 333/500 [==================>...........] - ETA: 42s - loss: 3.2753 - regression_loss: 2.5016 - classification_loss: 0.7737 334/500 [===================>..........] - ETA: 41s - loss: 3.2746 - regression_loss: 2.5013 - classification_loss: 0.7733 335/500 [===================>..........] - ETA: 41s - loss: 3.2747 - regression_loss: 2.5011 - classification_loss: 0.7735 336/500 [===================>..........] - ETA: 41s - loss: 3.2760 - regression_loss: 2.5018 - classification_loss: 0.7742 337/500 [===================>..........] - ETA: 41s - loss: 3.2756 - regression_loss: 2.5016 - classification_loss: 0.7740 338/500 [===================>..........] - ETA: 40s - loss: 3.2751 - regression_loss: 2.5015 - classification_loss: 0.7736 339/500 [===================>..........] - ETA: 40s - loss: 3.2742 - regression_loss: 2.5012 - classification_loss: 0.7730 340/500 [===================>..........] - ETA: 40s - loss: 3.2736 - regression_loss: 2.5010 - classification_loss: 0.7726 341/500 [===================>..........] 
- ETA: 40s - loss: 3.2728 - regression_loss: 2.5006 - classification_loss: 0.7722 342/500 [===================>..........] - ETA: 39s - loss: 3.2730 - regression_loss: 2.5005 - classification_loss: 0.7725 343/500 [===================>..........] - ETA: 39s - loss: 3.2725 - regression_loss: 2.5003 - classification_loss: 0.7722 344/500 [===================>..........] - ETA: 39s - loss: 3.2722 - regression_loss: 2.5001 - classification_loss: 0.7720 345/500 [===================>..........] - ETA: 39s - loss: 3.2718 - regression_loss: 2.5001 - classification_loss: 0.7717 346/500 [===================>..........] - ETA: 38s - loss: 3.2707 - regression_loss: 2.4997 - classification_loss: 0.7710 347/500 [===================>..........] - ETA: 38s - loss: 3.2696 - regression_loss: 2.4991 - classification_loss: 0.7705 348/500 [===================>..........] - ETA: 38s - loss: 3.2689 - regression_loss: 2.4991 - classification_loss: 0.7698 349/500 [===================>..........] - ETA: 38s - loss: 3.2685 - regression_loss: 2.4992 - classification_loss: 0.7694 350/500 [====================>.........] - ETA: 37s - loss: 3.2714 - regression_loss: 2.5002 - classification_loss: 0.7712 351/500 [====================>.........] - ETA: 37s - loss: 3.2708 - regression_loss: 2.5000 - classification_loss: 0.7708 352/500 [====================>.........] - ETA: 37s - loss: 3.2703 - regression_loss: 2.4992 - classification_loss: 0.7711 353/500 [====================>.........] - ETA: 37s - loss: 3.2695 - regression_loss: 2.4989 - classification_loss: 0.7707 354/500 [====================>.........] - ETA: 36s - loss: 3.2689 - regression_loss: 2.4987 - classification_loss: 0.7702 355/500 [====================>.........] - ETA: 36s - loss: 3.2681 - regression_loss: 2.4983 - classification_loss: 0.7698 356/500 [====================>.........] - ETA: 36s - loss: 3.2675 - regression_loss: 2.4979 - classification_loss: 0.7696 357/500 [====================>.........] 
- ETA: 36s - loss: 3.2666 - regression_loss: 2.4975 - classification_loss: 0.7691 358/500 [====================>.........] - ETA: 35s - loss: 3.2655 - regression_loss: 2.4971 - classification_loss: 0.7684 359/500 [====================>.........] - ETA: 35s - loss: 3.2675 - regression_loss: 2.4978 - classification_loss: 0.7698 360/500 [====================>.........] - ETA: 35s - loss: 3.2670 - regression_loss: 2.4976 - classification_loss: 0.7695 361/500 [====================>.........] - ETA: 35s - loss: 3.2666 - regression_loss: 2.4973 - classification_loss: 0.7693 362/500 [====================>.........] - ETA: 34s - loss: 3.2668 - regression_loss: 2.4976 - classification_loss: 0.7692 363/500 [====================>.........] - ETA: 34s - loss: 3.2668 - regression_loss: 2.4978 - classification_loss: 0.7690 364/500 [====================>.........] - ETA: 34s - loss: 3.2666 - regression_loss: 2.4980 - classification_loss: 0.7686 365/500 [====================>.........] - ETA: 34s - loss: 3.2685 - regression_loss: 2.4974 - classification_loss: 0.7711 366/500 [====================>.........] - ETA: 33s - loss: 3.2687 - regression_loss: 2.4976 - classification_loss: 0.7711 367/500 [=====================>........] - ETA: 33s - loss: 3.2718 - regression_loss: 2.4978 - classification_loss: 0.7741 368/500 [=====================>........] - ETA: 33s - loss: 3.2721 - regression_loss: 2.4979 - classification_loss: 0.7743 369/500 [=====================>........] - ETA: 33s - loss: 3.2724 - regression_loss: 2.4977 - classification_loss: 0.7748 370/500 [=====================>........] - ETA: 32s - loss: 3.2720 - regression_loss: 2.4977 - classification_loss: 0.7743 371/500 [=====================>........] - ETA: 32s - loss: 3.2704 - regression_loss: 2.4963 - classification_loss: 0.7741 372/500 [=====================>........] - ETA: 32s - loss: 3.2701 - regression_loss: 2.4958 - classification_loss: 0.7743 373/500 [=====================>........] 
- ETA: 31s - loss: 3.2696 - regression_loss: 2.4954 - classification_loss: 0.7742 374/500 [=====================>........] - ETA: 31s - loss: 3.2693 - regression_loss: 2.4950 - classification_loss: 0.7742 375/500 [=====================>........] - ETA: 31s - loss: 3.2693 - regression_loss: 2.4948 - classification_loss: 0.7746 376/500 [=====================>........] - ETA: 31s - loss: 3.2692 - regression_loss: 2.4946 - classification_loss: 0.7746 377/500 [=====================>........] - ETA: 30s - loss: 3.2689 - regression_loss: 2.4942 - classification_loss: 0.7748 378/500 [=====================>........] - ETA: 30s - loss: 3.2681 - regression_loss: 2.4938 - classification_loss: 0.7743 379/500 [=====================>........] - ETA: 30s - loss: 3.2674 - regression_loss: 2.4933 - classification_loss: 0.7741 380/500 [=====================>........] - ETA: 30s - loss: 3.2672 - regression_loss: 2.4933 - classification_loss: 0.7740 381/500 [=====================>........] - ETA: 29s - loss: 3.2681 - regression_loss: 2.4945 - classification_loss: 0.7736 382/500 [=====================>........] - ETA: 29s - loss: 3.2675 - regression_loss: 2.4942 - classification_loss: 0.7733 383/500 [=====================>........] - ETA: 29s - loss: 3.2670 - regression_loss: 2.4940 - classification_loss: 0.7731 384/500 [======================>.......] - ETA: 29s - loss: 3.2666 - regression_loss: 2.4937 - classification_loss: 0.7730 385/500 [======================>.......] - ETA: 28s - loss: 3.2668 - regression_loss: 2.4939 - classification_loss: 0.7730 386/500 [======================>.......] - ETA: 28s - loss: 3.2663 - regression_loss: 2.4936 - classification_loss: 0.7727 387/500 [======================>.......] - ETA: 28s - loss: 3.2670 - regression_loss: 2.4943 - classification_loss: 0.7727 388/500 [======================>.......] - ETA: 28s - loss: 3.2666 - regression_loss: 2.4941 - classification_loss: 0.7725 389/500 [======================>.......] 
- ETA: 27s - loss: 3.2667 - regression_loss: 2.4942 - classification_loss: 0.7725 390/500 [======================>.......] - ETA: 27s - loss: 3.2664 - regression_loss: 2.4943 - classification_loss: 0.7720 391/500 [======================>.......] - ETA: 27s - loss: 3.2659 - regression_loss: 2.4945 - classification_loss: 0.7714 392/500 [======================>.......] - ETA: 27s - loss: 3.2662 - regression_loss: 2.4943 - classification_loss: 0.7718 393/500 [======================>.......] - ETA: 26s - loss: 3.2658 - regression_loss: 2.4944 - classification_loss: 0.7715 394/500 [======================>.......] - ETA: 26s - loss: 3.2652 - regression_loss: 2.4943 - classification_loss: 0.7709 395/500 [======================>.......] - ETA: 26s - loss: 3.2655 - regression_loss: 2.4951 - classification_loss: 0.7704 396/500 [======================>.......] - ETA: 26s - loss: 3.2644 - regression_loss: 2.4946 - classification_loss: 0.7698 397/500 [======================>.......] - ETA: 25s - loss: 3.2680 - regression_loss: 2.4957 - classification_loss: 0.7723 398/500 [======================>.......] - ETA: 25s - loss: 3.2675 - regression_loss: 2.4956 - classification_loss: 0.7719 399/500 [======================>.......] - ETA: 25s - loss: 3.2667 - regression_loss: 2.4954 - classification_loss: 0.7713 400/500 [=======================>......] - ETA: 25s - loss: 3.2667 - regression_loss: 2.4948 - classification_loss: 0.7719 401/500 [=======================>......] - ETA: 24s - loss: 3.2666 - regression_loss: 2.4948 - classification_loss: 0.7718 402/500 [=======================>......] - ETA: 24s - loss: 3.2654 - regression_loss: 2.4940 - classification_loss: 0.7714 403/500 [=======================>......] - ETA: 24s - loss: 3.2646 - regression_loss: 2.4936 - classification_loss: 0.7709 404/500 [=======================>......] - ETA: 24s - loss: 3.2642 - regression_loss: 2.4937 - classification_loss: 0.7706 405/500 [=======================>......] 
- ETA: 23s - loss: 3.2640 - regression_loss: 2.4937 - classification_loss: 0.7704 406/500 [=======================>......] - ETA: 23s - loss: 3.2628 - regression_loss: 2.4930 - classification_loss: 0.7698 407/500 [=======================>......] - ETA: 23s - loss: 3.2619 - regression_loss: 2.4924 - classification_loss: 0.7695 408/500 [=======================>......] - ETA: 23s - loss: 3.2632 - regression_loss: 2.4941 - classification_loss: 0.7691 409/500 [=======================>......] - ETA: 22s - loss: 3.2627 - regression_loss: 2.4939 - classification_loss: 0.7688 410/500 [=======================>......] - ETA: 22s - loss: 3.2628 - regression_loss: 2.4939 - classification_loss: 0.7690 411/500 [=======================>......] - ETA: 22s - loss: 3.2645 - regression_loss: 2.4956 - classification_loss: 0.7689 412/500 [=======================>......] - ETA: 22s - loss: 3.2637 - regression_loss: 2.4956 - classification_loss: 0.7681 413/500 [=======================>......] - ETA: 21s - loss: 3.2632 - regression_loss: 2.4953 - classification_loss: 0.7679 414/500 [=======================>......] - ETA: 21s - loss: 3.2625 - regression_loss: 2.4953 - classification_loss: 0.7673 415/500 [=======================>......] - ETA: 21s - loss: 3.2614 - regression_loss: 2.4947 - classification_loss: 0.7667 416/500 [=======================>......] - ETA: 21s - loss: 3.2609 - regression_loss: 2.4942 - classification_loss: 0.7667 417/500 [========================>.....] - ETA: 20s - loss: 3.2606 - regression_loss: 2.4942 - classification_loss: 0.7664 418/500 [========================>.....] - ETA: 20s - loss: 3.2603 - regression_loss: 2.4941 - classification_loss: 0.7662 419/500 [========================>.....] - ETA: 20s - loss: 3.2600 - regression_loss: 2.4939 - classification_loss: 0.7661 420/500 [========================>.....] - ETA: 20s - loss: 3.2593 - regression_loss: 2.4935 - classification_loss: 0.7659 421/500 [========================>.....] 
- ETA: 19s - loss: 3.2587 - regression_loss: 2.4932 - classification_loss: 0.7655 422/500 [========================>.....] - ETA: 19s - loss: 3.2585 - regression_loss: 2.4933 - classification_loss: 0.7652 423/500 [========================>.....] - ETA: 19s - loss: 3.2581 - regression_loss: 2.4931 - classification_loss: 0.7649 424/500 [========================>.....] - ETA: 19s - loss: 3.2576 - regression_loss: 2.4929 - classification_loss: 0.7646 425/500 [========================>.....] - ETA: 18s - loss: 3.2567 - regression_loss: 2.4924 - classification_loss: 0.7643 426/500 [========================>.....] - ETA: 18s - loss: 3.2569 - regression_loss: 2.4930 - classification_loss: 0.7639 427/500 [========================>.....] - ETA: 18s - loss: 3.2561 - regression_loss: 2.4926 - classification_loss: 0.7635 428/500 [========================>.....] - ETA: 18s - loss: 3.2556 - regression_loss: 2.4924 - classification_loss: 0.7632 429/500 [========================>.....] - ETA: 17s - loss: 3.2592 - regression_loss: 2.4923 - classification_loss: 0.7668 430/500 [========================>.....] - ETA: 17s - loss: 3.2629 - regression_loss: 2.4932 - classification_loss: 0.7697 431/500 [========================>.....] - ETA: 17s - loss: 3.2628 - regression_loss: 2.4927 - classification_loss: 0.7700 432/500 [========================>.....] - ETA: 17s - loss: 3.2628 - regression_loss: 2.4929 - classification_loss: 0.7699 433/500 [========================>.....] - ETA: 16s - loss: 3.2620 - regression_loss: 2.4925 - classification_loss: 0.7695 434/500 [=========================>....] - ETA: 16s - loss: 3.2622 - regression_loss: 2.4929 - classification_loss: 0.7693 435/500 [=========================>....] - ETA: 16s - loss: 3.2622 - regression_loss: 2.4929 - classification_loss: 0.7693 436/500 [=========================>....] - ETA: 16s - loss: 3.2608 - regression_loss: 2.4924 - classification_loss: 0.7684 437/500 [=========================>....] 
- ETA: 15s - loss: 3.2607 - regression_loss: 2.4927 - classification_loss: 0.7680 438/500 [=========================>....] - ETA: 15s - loss: 3.2599 - regression_loss: 2.4924 - classification_loss: 0.7675 439/500 [=========================>....] - ETA: 15s - loss: 3.2597 - regression_loss: 2.4924 - classification_loss: 0.7673 440/500 [=========================>....] - ETA: 15s - loss: 3.2606 - regression_loss: 2.4925 - classification_loss: 0.7681 441/500 [=========================>....] - ETA: 14s - loss: 3.2603 - regression_loss: 2.4920 - classification_loss: 0.7684 442/500 [=========================>....] - ETA: 14s - loss: 3.2596 - regression_loss: 2.4915 - classification_loss: 0.7681 443/500 [=========================>....] - ETA: 14s - loss: 3.2590 - regression_loss: 2.4914 - classification_loss: 0.7676 444/500 [=========================>....] - ETA: 14s - loss: 3.2585 - regression_loss: 2.4911 - classification_loss: 0.7673 445/500 [=========================>....] - ETA: 13s - loss: 3.2568 - regression_loss: 2.4895 - classification_loss: 0.7673 446/500 [=========================>....] - ETA: 13s - loss: 3.2564 - regression_loss: 2.4893 - classification_loss: 0.7671 447/500 [=========================>....] - ETA: 13s - loss: 3.2561 - regression_loss: 2.4892 - classification_loss: 0.7668 448/500 [=========================>....] - ETA: 13s - loss: 3.2564 - regression_loss: 2.4890 - classification_loss: 0.7675 449/500 [=========================>....] - ETA: 12s - loss: 3.2560 - regression_loss: 2.4888 - classification_loss: 0.7672 450/500 [==========================>...] - ETA: 12s - loss: 3.2550 - regression_loss: 2.4882 - classification_loss: 0.7668 451/500 [==========================>...] - ETA: 12s - loss: 3.2549 - regression_loss: 2.4884 - classification_loss: 0.7666 452/500 [==========================>...] - ETA: 12s - loss: 3.2551 - regression_loss: 2.4884 - classification_loss: 0.7666 453/500 [==========================>...] 
- ETA: 11s - loss: 3.2549 - regression_loss: 2.4883 - classification_loss: 0.7666 454/500 [==========================>...] - ETA: 11s - loss: 3.2549 - regression_loss: 2.4881 - classification_loss: 0.7668 455/500 [==========================>...] - ETA: 11s - loss: 3.2553 - regression_loss: 2.4892 - classification_loss: 0.7661 456/500 [==========================>...] - ETA: 11s - loss: 3.2558 - regression_loss: 2.4898 - classification_loss: 0.7660 457/500 [==========================>...] - ETA: 10s - loss: 3.2551 - regression_loss: 2.4894 - classification_loss: 0.7656 458/500 [==========================>...] - ETA: 10s - loss: 3.2541 - regression_loss: 2.4886 - classification_loss: 0.7655 459/500 [==========================>...] - ETA: 10s - loss: 3.2540 - regression_loss: 2.4885 - classification_loss: 0.7654 460/500 [==========================>...] - ETA: 10s - loss: 3.2545 - regression_loss: 2.4883 - classification_loss: 0.7662 461/500 [==========================>...] - ETA: 9s - loss: 3.2550 - regression_loss: 2.4886 - classification_loss: 0.7663  462/500 [==========================>...] - ETA: 9s - loss: 3.2542 - regression_loss: 2.4885 - classification_loss: 0.7657 463/500 [==========================>...] - ETA: 9s - loss: 3.2540 - regression_loss: 2.4884 - classification_loss: 0.7656 464/500 [==========================>...] - ETA: 9s - loss: 3.2537 - regression_loss: 2.4883 - classification_loss: 0.7654 465/500 [==========================>...] - ETA: 8s - loss: 3.2530 - regression_loss: 2.4881 - classification_loss: 0.7649 466/500 [==========================>...] - ETA: 8s - loss: 3.2521 - regression_loss: 2.4877 - classification_loss: 0.7644 467/500 [===========================>..] - ETA: 8s - loss: 3.2526 - regression_loss: 2.4881 - classification_loss: 0.7645 468/500 [===========================>..] - ETA: 8s - loss: 3.2518 - regression_loss: 2.4879 - classification_loss: 0.7639 469/500 [===========================>..] 
- ETA: 7s - loss: 3.2510 - regression_loss: 2.4875 - classification_loss: 0.7634 470/500 [===========================>..] - ETA: 7s - loss: 3.2589 - regression_loss: 2.4882 - classification_loss: 0.7708 471/500 [===========================>..] - ETA: 7s - loss: 3.2580 - regression_loss: 2.4877 - classification_loss: 0.7702 472/500 [===========================>..] - ETA: 7s - loss: 3.2613 - regression_loss: 2.4882 - classification_loss: 0.7731 473/500 [===========================>..] - ETA: 6s - loss: 3.2632 - regression_loss: 2.4889 - classification_loss: 0.7743 474/500 [===========================>..] - ETA: 6s - loss: 3.2638 - regression_loss: 2.4893 - classification_loss: 0.7744 475/500 [===========================>..] - ETA: 6s - loss: 3.2633 - regression_loss: 2.4892 - classification_loss: 0.7741 476/500 [===========================>..] - ETA: 6s - loss: 3.2623 - regression_loss: 2.4886 - classification_loss: 0.7737 477/500 [===========================>..] - ETA: 5s - loss: 3.2614 - regression_loss: 2.4882 - classification_loss: 0.7732 478/500 [===========================>..] - ETA: 5s - loss: 3.2607 - regression_loss: 2.4877 - classification_loss: 0.7730 479/500 [===========================>..] - ETA: 5s - loss: 3.2605 - regression_loss: 2.4878 - classification_loss: 0.7727 480/500 [===========================>..] - ETA: 5s - loss: 3.2601 - regression_loss: 2.4877 - classification_loss: 0.7724 481/500 [===========================>..] - ETA: 4s - loss: 3.2648 - regression_loss: 2.4899 - classification_loss: 0.7749 482/500 [===========================>..] - ETA: 4s - loss: 3.2635 - regression_loss: 2.4892 - classification_loss: 0.7743 483/500 [===========================>..] - ETA: 4s - loss: 3.2636 - regression_loss: 2.4896 - classification_loss: 0.7740 484/500 [============================>.] - ETA: 4s - loss: 3.2628 - regression_loss: 2.4892 - classification_loss: 0.7736 485/500 [============================>.] 
- ETA: 3s - loss: 3.2629 - regression_loss: 2.4893 - classification_loss: 0.7736 486/500 [============================>.] - ETA: 3s - loss: 3.2624 - regression_loss: 2.4891 - classification_loss: 0.7733 487/500 [============================>.] - ETA: 3s - loss: 3.2616 - regression_loss: 2.4886 - classification_loss: 0.7729 488/500 [============================>.] - ETA: 3s - loss: 3.2619 - regression_loss: 2.4893 - classification_loss: 0.7726 489/500 [============================>.] - ETA: 2s - loss: 3.2620 - regression_loss: 2.4897 - classification_loss: 0.7724 490/500 [============================>.] - ETA: 2s - loss: 3.2614 - regression_loss: 2.4892 - classification_loss: 0.7722 491/500 [============================>.] - ETA: 2s - loss: 3.2610 - regression_loss: 2.4890 - classification_loss: 0.7721 492/500 [============================>.] - ETA: 2s - loss: 3.2601 - regression_loss: 2.4886 - classification_loss: 0.7715 493/500 [============================>.] - ETA: 1s - loss: 3.2598 - regression_loss: 2.4885 - classification_loss: 0.7713 494/500 [============================>.] - ETA: 1s - loss: 3.2594 - regression_loss: 2.4885 - classification_loss: 0.7709 495/500 [============================>.] - ETA: 1s - loss: 3.2586 - regression_loss: 2.4882 - classification_loss: 0.7704 496/500 [============================>.] - ETA: 1s - loss: 3.2585 - regression_loss: 2.4882 - classification_loss: 0.7703 497/500 [============================>.] - ETA: 0s - loss: 3.2617 - regression_loss: 2.4886 - classification_loss: 0.7731 498/500 [============================>.] - ETA: 0s - loss: 3.2623 - regression_loss: 2.4886 - classification_loss: 0.7737 499/500 [============================>.] 
500/500 [==============================] - 126s 252ms/step - loss: 3.2615 - regression_loss: 2.4884 - classification_loss: 0.7731
1172 instances of class plum with average precision: 0.0562
mAP: 0.0562
Epoch 00003: saving model to ./training/snapshots/resnet50_pascal_03.h5
Epoch 4/150
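As the epoch summary above shows, the reported total loss is the sum of the regression and classification terms. A minimal sketch of parsing such a summary line and checking that decomposition (the line text is taken from the log above; the regex is an assumption about its fixed `name: value` layout):

```python
import re

# Final epoch-3 summary line, copied from the log above
line = ("500/500 [==============================] - 126s 252ms/step "
        "- loss: 3.2615 - regression_loss: 2.4884 - classification_loss: 0.7731")

# Pull every "name: float" pair into a dict of metrics
metrics = {k: float(v) for k, v in re.findall(r"(\w+): (\d+\.\d+)", line)}

# Total loss equals regression + classification (within print rounding)
assert abs(metrics["loss"]
           - (metrics["regression_loss"] + metrics["classification_loss"])) < 1e-3
print(metrics["loss"])  # 3.2615
```

The same pattern applies to every per-batch update in the log, which is why the intermediate progress lines carry no information beyond the final 500/500 summary.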
- ETA: 1:41 - loss: 3.1737 - regression_loss: 2.4446 - classification_loss: 0.7290 95/500 [====>.........................] - ETA: 1:41 - loss: 3.1716 - regression_loss: 2.4435 - classification_loss: 0.7281 96/500 [====>.........................] - ETA: 1:41 - loss: 3.1692 - regression_loss: 2.4429 - classification_loss: 0.7263 97/500 [====>.........................] - ETA: 1:40 - loss: 3.1673 - regression_loss: 2.4423 - classification_loss: 0.7250 98/500 [====>.........................] - ETA: 1:40 - loss: 3.1697 - regression_loss: 2.4428 - classification_loss: 0.7269 99/500 [====>.........................] - ETA: 1:40 - loss: 3.1652 - regression_loss: 2.4401 - classification_loss: 0.7251 100/500 [=====>........................] - ETA: 1:40 - loss: 3.1623 - regression_loss: 2.4394 - classification_loss: 0.7228 101/500 [=====>........................] - ETA: 1:39 - loss: 3.1627 - regression_loss: 2.4410 - classification_loss: 0.7218 102/500 [=====>........................] - ETA: 1:39 - loss: 3.1598 - regression_loss: 2.4395 - classification_loss: 0.7203 103/500 [=====>........................] - ETA: 1:39 - loss: 3.1621 - regression_loss: 2.4416 - classification_loss: 0.7205 104/500 [=====>........................] - ETA: 1:39 - loss: 3.1583 - regression_loss: 2.4394 - classification_loss: 0.7189 105/500 [=====>........................] - ETA: 1:38 - loss: 3.1581 - regression_loss: 2.4393 - classification_loss: 0.7189 106/500 [=====>........................] - ETA: 1:38 - loss: 3.1549 - regression_loss: 2.4382 - classification_loss: 0.7167 107/500 [=====>........................] - ETA: 1:38 - loss: 3.1544 - regression_loss: 2.4390 - classification_loss: 0.7155 108/500 [=====>........................] - ETA: 1:38 - loss: 3.1553 - regression_loss: 2.4395 - classification_loss: 0.7158 109/500 [=====>........................] - ETA: 1:38 - loss: 3.1558 - regression_loss: 2.4420 - classification_loss: 0.7139 110/500 [=====>........................] 
- ETA: 1:37 - loss: 3.1531 - regression_loss: 2.4408 - classification_loss: 0.7123 111/500 [=====>........................] - ETA: 1:37 - loss: 3.1551 - regression_loss: 2.4432 - classification_loss: 0.7119 112/500 [=====>........................] - ETA: 1:37 - loss: 3.1561 - regression_loss: 2.4454 - classification_loss: 0.7108 113/500 [=====>........................] - ETA: 1:37 - loss: 3.1558 - regression_loss: 2.4452 - classification_loss: 0.7107 114/500 [=====>........................] - ETA: 1:36 - loss: 3.1554 - regression_loss: 2.4453 - classification_loss: 0.7102 115/500 [=====>........................] - ETA: 1:36 - loss: 3.1558 - regression_loss: 2.4453 - classification_loss: 0.7105 116/500 [=====>........................] - ETA: 1:36 - loss: 3.1536 - regression_loss: 2.4438 - classification_loss: 0.7098 117/500 [======>.......................] - ETA: 1:36 - loss: 3.1531 - regression_loss: 2.4441 - classification_loss: 0.7090 118/500 [======>.......................] - ETA: 1:35 - loss: 3.1556 - regression_loss: 2.4464 - classification_loss: 0.7092 119/500 [======>.......................] - ETA: 1:35 - loss: 3.1528 - regression_loss: 2.4447 - classification_loss: 0.7081 120/500 [======>.......................] - ETA: 1:35 - loss: 3.1509 - regression_loss: 2.4442 - classification_loss: 0.7067 121/500 [======>.......................] - ETA: 1:35 - loss: 3.1467 - regression_loss: 2.4416 - classification_loss: 0.7051 122/500 [======>.......................] - ETA: 1:34 - loss: 3.1483 - regression_loss: 2.4419 - classification_loss: 0.7065 123/500 [======>.......................] - ETA: 1:34 - loss: 3.1454 - regression_loss: 2.4403 - classification_loss: 0.7051 124/500 [======>.......................] - ETA: 1:34 - loss: 3.1480 - regression_loss: 2.4408 - classification_loss: 0.7072 125/500 [======>.......................] - ETA: 1:34 - loss: 3.1463 - regression_loss: 2.4404 - classification_loss: 0.7059 126/500 [======>.......................] 
- ETA: 1:33 - loss: 3.1445 - regression_loss: 2.4400 - classification_loss: 0.7046 127/500 [======>.......................] - ETA: 1:33 - loss: 3.1423 - regression_loss: 2.4388 - classification_loss: 0.7036 128/500 [======>.......................] - ETA: 1:33 - loss: 3.1451 - regression_loss: 2.4368 - classification_loss: 0.7083 129/500 [======>.......................] - ETA: 1:33 - loss: 3.1395 - regression_loss: 2.4328 - classification_loss: 0.7067 130/500 [======>.......................] - ETA: 1:32 - loss: 3.1377 - regression_loss: 2.4327 - classification_loss: 0.7050 131/500 [======>.......................] - ETA: 1:32 - loss: 3.1348 - regression_loss: 2.4311 - classification_loss: 0.7036 132/500 [======>.......................] - ETA: 1:32 - loss: 3.1347 - regression_loss: 2.4320 - classification_loss: 0.7027 133/500 [======>.......................] - ETA: 1:32 - loss: 3.1363 - regression_loss: 2.4338 - classification_loss: 0.7025 134/500 [=======>......................] - ETA: 1:31 - loss: 3.1343 - regression_loss: 2.4324 - classification_loss: 0.7019 135/500 [=======>......................] - ETA: 1:31 - loss: 3.1341 - regression_loss: 2.4333 - classification_loss: 0.7008 136/500 [=======>......................] - ETA: 1:31 - loss: 3.1336 - regression_loss: 2.4330 - classification_loss: 0.7006 137/500 [=======>......................] - ETA: 1:31 - loss: 3.1324 - regression_loss: 2.4327 - classification_loss: 0.6997 138/500 [=======>......................] - ETA: 1:30 - loss: 3.1246 - regression_loss: 2.4267 - classification_loss: 0.6979 139/500 [=======>......................] - ETA: 1:30 - loss: 3.1230 - regression_loss: 2.4263 - classification_loss: 0.6967 140/500 [=======>......................] - ETA: 1:30 - loss: 3.1207 - regression_loss: 2.4234 - classification_loss: 0.6973 141/500 [=======>......................] - ETA: 1:30 - loss: 3.1244 - regression_loss: 2.4280 - classification_loss: 0.6963 142/500 [=======>......................] 
- ETA: 1:29 - loss: 3.1234 - regression_loss: 2.4279 - classification_loss: 0.6955 143/500 [=======>......................] - ETA: 1:29 - loss: 3.1200 - regression_loss: 2.4248 - classification_loss: 0.6952 144/500 [=======>......................] - ETA: 1:29 - loss: 3.1185 - regression_loss: 2.4238 - classification_loss: 0.6947 145/500 [=======>......................] - ETA: 1:29 - loss: 3.1183 - regression_loss: 2.4246 - classification_loss: 0.6937 146/500 [=======>......................] - ETA: 1:29 - loss: 3.1167 - regression_loss: 2.4239 - classification_loss: 0.6927 147/500 [=======>......................] - ETA: 1:28 - loss: 3.1476 - regression_loss: 2.4244 - classification_loss: 0.7233 148/500 [=======>......................] - ETA: 1:28 - loss: 3.1465 - regression_loss: 2.4238 - classification_loss: 0.7226 149/500 [=======>......................] - ETA: 1:28 - loss: 3.1509 - regression_loss: 2.4271 - classification_loss: 0.7237 150/500 [========>.....................] - ETA: 1:27 - loss: 3.1502 - regression_loss: 2.4270 - classification_loss: 0.7232 151/500 [========>.....................] - ETA: 1:27 - loss: 3.1477 - regression_loss: 2.4256 - classification_loss: 0.7221 152/500 [========>.....................] - ETA: 1:27 - loss: 3.1472 - regression_loss: 2.4256 - classification_loss: 0.7216 153/500 [========>.....................] - ETA: 1:27 - loss: 3.1460 - regression_loss: 2.4257 - classification_loss: 0.7204 154/500 [========>.....................] - ETA: 1:27 - loss: 3.1457 - regression_loss: 2.4261 - classification_loss: 0.7196 155/500 [========>.....................] - ETA: 1:26 - loss: 3.1453 - regression_loss: 2.4265 - classification_loss: 0.7188 156/500 [========>.....................] - ETA: 1:26 - loss: 3.1421 - regression_loss: 2.4248 - classification_loss: 0.7173 157/500 [========>.....................] - ETA: 1:26 - loss: 3.1392 - regression_loss: 2.4233 - classification_loss: 0.7159 158/500 [========>.....................] 
- ETA: 1:26 - loss: 3.1374 - regression_loss: 2.4224 - classification_loss: 0.7149 159/500 [========>.....................] - ETA: 1:25 - loss: 3.1363 - regression_loss: 2.4220 - classification_loss: 0.7143 160/500 [========>.....................] - ETA: 1:25 - loss: 3.1455 - regression_loss: 2.4230 - classification_loss: 0.7225 161/500 [========>.....................] - ETA: 1:25 - loss: 3.1465 - regression_loss: 2.4242 - classification_loss: 0.7223 162/500 [========>.....................] - ETA: 1:25 - loss: 3.1498 - regression_loss: 2.4262 - classification_loss: 0.7236 163/500 [========>.....................] - ETA: 1:24 - loss: 3.1509 - regression_loss: 2.4267 - classification_loss: 0.7242 164/500 [========>.....................] - ETA: 1:24 - loss: 3.1550 - regression_loss: 2.4296 - classification_loss: 0.7254 165/500 [========>.....................] - ETA: 1:24 - loss: 3.1520 - regression_loss: 2.4283 - classification_loss: 0.7236 166/500 [========>.....................] - ETA: 1:24 - loss: 3.1497 - regression_loss: 2.4276 - classification_loss: 0.7221 167/500 [=========>....................] - ETA: 1:23 - loss: 3.1506 - regression_loss: 2.4283 - classification_loss: 0.7223 168/500 [=========>....................] - ETA: 1:23 - loss: 3.1482 - regression_loss: 2.4270 - classification_loss: 0.7212 169/500 [=========>....................] - ETA: 1:23 - loss: 3.1471 - regression_loss: 2.4269 - classification_loss: 0.7202 170/500 [=========>....................] - ETA: 1:23 - loss: 3.1458 - regression_loss: 2.4262 - classification_loss: 0.7195 171/500 [=========>....................] - ETA: 1:22 - loss: 3.1476 - regression_loss: 2.4278 - classification_loss: 0.7197 172/500 [=========>....................] - ETA: 1:22 - loss: 3.1470 - regression_loss: 2.4262 - classification_loss: 0.7208 173/500 [=========>....................] - ETA: 1:22 - loss: 3.1474 - regression_loss: 2.4269 - classification_loss: 0.7205 174/500 [=========>....................] 
- ETA: 1:22 - loss: 3.1496 - regression_loss: 2.4285 - classification_loss: 0.7211 175/500 [=========>....................] - ETA: 1:21 - loss: 3.1480 - regression_loss: 2.4278 - classification_loss: 0.7203 176/500 [=========>....................] - ETA: 1:21 - loss: 3.1473 - regression_loss: 2.4273 - classification_loss: 0.7200 177/500 [=========>....................] - ETA: 1:21 - loss: 3.1475 - regression_loss: 2.4272 - classification_loss: 0.7203 178/500 [=========>....................] - ETA: 1:21 - loss: 3.1468 - regression_loss: 2.4272 - classification_loss: 0.7196 179/500 [=========>....................] - ETA: 1:20 - loss: 3.1443 - regression_loss: 2.4263 - classification_loss: 0.7180 180/500 [=========>....................] - ETA: 1:20 - loss: 3.1450 - regression_loss: 2.4277 - classification_loss: 0.7173 181/500 [=========>....................] - ETA: 1:20 - loss: 3.1450 - regression_loss: 2.4279 - classification_loss: 0.7172 182/500 [=========>....................] - ETA: 1:20 - loss: 3.1440 - regression_loss: 2.4274 - classification_loss: 0.7166 183/500 [=========>....................] - ETA: 1:19 - loss: 3.1439 - regression_loss: 2.4279 - classification_loss: 0.7160 184/500 [==========>...................] - ETA: 1:19 - loss: 3.1463 - regression_loss: 2.4297 - classification_loss: 0.7166 185/500 [==========>...................] - ETA: 1:19 - loss: 3.1445 - regression_loss: 2.4290 - classification_loss: 0.7155 186/500 [==========>...................] - ETA: 1:19 - loss: 3.1427 - regression_loss: 2.4283 - classification_loss: 0.7145 187/500 [==========>...................] - ETA: 1:18 - loss: 3.1433 - regression_loss: 2.4294 - classification_loss: 0.7139 188/500 [==========>...................] - ETA: 1:18 - loss: 3.1427 - regression_loss: 2.4297 - classification_loss: 0.7130 189/500 [==========>...................] - ETA: 1:18 - loss: 3.1486 - regression_loss: 2.4294 - classification_loss: 0.7193 190/500 [==========>...................] 
- ETA: 1:18 - loss: 3.1492 - regression_loss: 2.4303 - classification_loss: 0.7189 191/500 [==========>...................] - ETA: 1:17 - loss: 3.1508 - regression_loss: 2.4322 - classification_loss: 0.7186 192/500 [==========>...................] - ETA: 1:17 - loss: 3.1496 - regression_loss: 2.4314 - classification_loss: 0.7182 193/500 [==========>...................] - ETA: 1:17 - loss: 3.1497 - regression_loss: 2.4321 - classification_loss: 0.7176 194/500 [==========>...................] - ETA: 1:17 - loss: 3.1488 - regression_loss: 2.4319 - classification_loss: 0.7169 195/500 [==========>...................] - ETA: 1:16 - loss: 3.1484 - regression_loss: 2.4324 - classification_loss: 0.7160 196/500 [==========>...................] - ETA: 1:16 - loss: 3.1476 - regression_loss: 2.4325 - classification_loss: 0.7150 197/500 [==========>...................] - ETA: 1:16 - loss: 3.1475 - regression_loss: 2.4333 - classification_loss: 0.7142 198/500 [==========>...................] - ETA: 1:16 - loss: 3.1470 - regression_loss: 2.4332 - classification_loss: 0.7139 199/500 [==========>...................] - ETA: 1:15 - loss: 3.1464 - regression_loss: 2.4329 - classification_loss: 0.7135 200/500 [===========>..................] - ETA: 1:15 - loss: 3.1453 - regression_loss: 2.4325 - classification_loss: 0.7128 201/500 [===========>..................] - ETA: 1:15 - loss: 3.1443 - regression_loss: 2.4320 - classification_loss: 0.7123 202/500 [===========>..................] - ETA: 1:15 - loss: 3.1426 - regression_loss: 2.4311 - classification_loss: 0.7115 203/500 [===========>..................] - ETA: 1:14 - loss: 3.1433 - regression_loss: 2.4318 - classification_loss: 0.7115 204/500 [===========>..................] - ETA: 1:14 - loss: 3.1418 - regression_loss: 2.4310 - classification_loss: 0.7108 205/500 [===========>..................] - ETA: 1:14 - loss: 3.1452 - regression_loss: 2.4312 - classification_loss: 0.7140 206/500 [===========>..................] 
- ETA: 1:14 - loss: 3.1444 - regression_loss: 2.4319 - classification_loss: 0.7124 207/500 [===========>..................] - ETA: 1:13 - loss: 3.1479 - regression_loss: 2.4327 - classification_loss: 0.7152 208/500 [===========>..................] - ETA: 1:13 - loss: 3.1473 - regression_loss: 2.4327 - classification_loss: 0.7146 209/500 [===========>..................] - ETA: 1:13 - loss: 3.1515 - regression_loss: 2.4314 - classification_loss: 0.7201 210/500 [===========>..................] - ETA: 1:13 - loss: 3.1516 - regression_loss: 2.4323 - classification_loss: 0.7192 211/500 [===========>..................] - ETA: 1:12 - loss: 3.1536 - regression_loss: 2.4325 - classification_loss: 0.7211 212/500 [===========>..................] - ETA: 1:12 - loss: 3.1541 - regression_loss: 2.4324 - classification_loss: 0.7217 213/500 [===========>..................] - ETA: 1:12 - loss: 3.1535 - regression_loss: 2.4324 - classification_loss: 0.7211 214/500 [===========>..................] - ETA: 1:12 - loss: 3.1530 - regression_loss: 2.4326 - classification_loss: 0.7203 215/500 [===========>..................] - ETA: 1:11 - loss: 3.1518 - regression_loss: 2.4318 - classification_loss: 0.7201 216/500 [===========>..................] - ETA: 1:11 - loss: 3.1512 - regression_loss: 2.4318 - classification_loss: 0.7194 217/500 [============>.................] - ETA: 1:11 - loss: 3.1491 - regression_loss: 2.4305 - classification_loss: 0.7187 218/500 [============>.................] - ETA: 1:11 - loss: 3.1484 - regression_loss: 2.4301 - classification_loss: 0.7183 219/500 [============>.................] - ETA: 1:10 - loss: 3.1472 - regression_loss: 2.4291 - classification_loss: 0.7181 220/500 [============>.................] - ETA: 1:10 - loss: 3.1471 - regression_loss: 2.4292 - classification_loss: 0.7179 221/500 [============>.................] - ETA: 1:10 - loss: 3.1451 - regression_loss: 2.4282 - classification_loss: 0.7169 222/500 [============>.................] 
- ETA: 1:10 - loss: 3.1439 - regression_loss: 2.4277 - classification_loss: 0.7162 223/500 [============>.................] - ETA: 1:09 - loss: 3.1431 - regression_loss: 2.4278 - classification_loss: 0.7153 224/500 [============>.................] - ETA: 1:09 - loss: 3.1421 - regression_loss: 2.4274 - classification_loss: 0.7147 225/500 [============>.................] - ETA: 1:09 - loss: 3.1423 - regression_loss: 2.4277 - classification_loss: 0.7146 226/500 [============>.................] - ETA: 1:09 - loss: 3.1445 - regression_loss: 2.4305 - classification_loss: 0.7140 227/500 [============>.................] - ETA: 1:08 - loss: 3.1449 - regression_loss: 2.4318 - classification_loss: 0.7131 228/500 [============>.................] - ETA: 1:08 - loss: 3.1450 - regression_loss: 2.4315 - classification_loss: 0.7136 229/500 [============>.................] - ETA: 1:08 - loss: 3.1440 - regression_loss: 2.4312 - classification_loss: 0.7128 230/500 [============>.................] - ETA: 1:08 - loss: 3.1428 - regression_loss: 2.4304 - classification_loss: 0.7124 231/500 [============>.................] - ETA: 1:07 - loss: 3.1422 - regression_loss: 2.4306 - classification_loss: 0.7117 232/500 [============>.................] - ETA: 1:07 - loss: 3.1422 - regression_loss: 2.4305 - classification_loss: 0.7116 233/500 [============>.................] - ETA: 1:07 - loss: 3.1427 - regression_loss: 2.4315 - classification_loss: 0.7111 234/500 [=============>................] - ETA: 1:07 - loss: 3.1397 - regression_loss: 2.4304 - classification_loss: 0.7093 235/500 [=============>................] - ETA: 1:06 - loss: 3.1407 - regression_loss: 2.4322 - classification_loss: 0.7085 236/500 [=============>................] - ETA: 1:06 - loss: 3.1409 - regression_loss: 2.4326 - classification_loss: 0.7082 237/500 [=============>................] - ETA: 1:06 - loss: 3.1523 - regression_loss: 2.4339 - classification_loss: 0.7184 238/500 [=============>................] 
- ETA: 1:06 - loss: 3.1518 - regression_loss: 2.4336 - classification_loss: 0.7182 239/500 [=============>................] - ETA: 1:05 - loss: 3.1521 - regression_loss: 2.4339 - classification_loss: 0.7182 240/500 [=============>................] - ETA: 1:05 - loss: 3.1522 - regression_loss: 2.4348 - classification_loss: 0.7174 241/500 [=============>................] - ETA: 1:05 - loss: 3.1515 - regression_loss: 2.4343 - classification_loss: 0.7172 242/500 [=============>................] - ETA: 1:05 - loss: 3.1521 - regression_loss: 2.4339 - classification_loss: 0.7182 243/500 [=============>................] - ETA: 1:04 - loss: 3.1518 - regression_loss: 2.4339 - classification_loss: 0.7179 244/500 [=============>................] - ETA: 1:04 - loss: 3.1516 - regression_loss: 2.4346 - classification_loss: 0.7170 245/500 [=============>................] - ETA: 1:04 - loss: 3.1520 - regression_loss: 2.4341 - classification_loss: 0.7179 246/500 [=============>................] - ETA: 1:03 - loss: 3.1517 - regression_loss: 2.4340 - classification_loss: 0.7177 247/500 [=============>................] - ETA: 1:03 - loss: 3.1503 - regression_loss: 2.4336 - classification_loss: 0.7167 248/500 [=============>................] - ETA: 1:03 - loss: 3.1490 - regression_loss: 2.4330 - classification_loss: 0.7160 249/500 [=============>................] - ETA: 1:03 - loss: 3.1484 - regression_loss: 2.4331 - classification_loss: 0.7153 250/500 [==============>...............] - ETA: 1:02 - loss: 3.1468 - regression_loss: 2.4324 - classification_loss: 0.7144 251/500 [==============>...............] - ETA: 1:02 - loss: 3.1473 - regression_loss: 2.4330 - classification_loss: 0.7144 252/500 [==============>...............] - ETA: 1:02 - loss: 3.1465 - regression_loss: 2.4327 - classification_loss: 0.7138 253/500 [==============>...............] - ETA: 1:02 - loss: 3.1493 - regression_loss: 2.4354 - classification_loss: 0.7139 254/500 [==============>...............] 
- ETA: 1:01 - loss: 3.1486 - regression_loss: 2.4351 - classification_loss: 0.7134 255/500 [==============>...............] - ETA: 1:01 - loss: 3.1480 - regression_loss: 2.4349 - classification_loss: 0.7131 256/500 [==============>...............] - ETA: 1:01 - loss: 3.1477 - regression_loss: 2.4346 - classification_loss: 0.7130 257/500 [==============>...............] - ETA: 1:01 - loss: 3.1467 - regression_loss: 2.4341 - classification_loss: 0.7127 258/500 [==============>...............] - ETA: 1:00 - loss: 3.1670 - regression_loss: 2.4388 - classification_loss: 0.7281 259/500 [==============>...............] - ETA: 1:00 - loss: 3.1653 - regression_loss: 2.4382 - classification_loss: 0.7271 260/500 [==============>...............] - ETA: 1:00 - loss: 3.1666 - regression_loss: 2.4396 - classification_loss: 0.7271 261/500 [==============>...............] - ETA: 1:00 - loss: 3.1641 - regression_loss: 2.4379 - classification_loss: 0.7262 262/500 [==============>...............] - ETA: 59s - loss: 3.1638 - regression_loss: 2.4381 - classification_loss: 0.7257  263/500 [==============>...............] - ETA: 59s - loss: 3.1622 - regression_loss: 2.4370 - classification_loss: 0.7252 264/500 [==============>...............] - ETA: 59s - loss: 3.1645 - regression_loss: 2.4379 - classification_loss: 0.7266 265/500 [==============>...............] - ETA: 59s - loss: 3.1641 - regression_loss: 2.4376 - classification_loss: 0.7265 266/500 [==============>...............] - ETA: 58s - loss: 3.1625 - regression_loss: 2.4365 - classification_loss: 0.7260 267/500 [===============>..............] - ETA: 58s - loss: 3.1625 - regression_loss: 2.4367 - classification_loss: 0.7257 268/500 [===============>..............] - ETA: 58s - loss: 3.1620 - regression_loss: 2.4365 - classification_loss: 0.7256 269/500 [===============>..............] - ETA: 58s - loss: 3.1611 - regression_loss: 2.4360 - classification_loss: 0.7251 270/500 [===============>..............] 
- ETA: 57s - loss: 3.1607 - regression_loss: 2.4361 - classification_loss: 0.7246 271/500 [===============>..............] - ETA: 57s - loss: 3.1592 - regression_loss: 2.4355 - classification_loss: 0.7237 272/500 [===============>..............] - ETA: 57s - loss: 3.1581 - regression_loss: 2.4352 - classification_loss: 0.7230 273/500 [===============>..............] - ETA: 57s - loss: 3.1578 - regression_loss: 2.4350 - classification_loss: 0.7228 274/500 [===============>..............] - ETA: 56s - loss: 3.1597 - regression_loss: 2.4371 - classification_loss: 0.7226 275/500 [===============>..............] - ETA: 56s - loss: 3.1591 - regression_loss: 2.4352 - classification_loss: 0.7239 276/500 [===============>..............] - ETA: 56s - loss: 3.1581 - regression_loss: 2.4348 - classification_loss: 0.7233 277/500 [===============>..............] - ETA: 56s - loss: 3.1578 - regression_loss: 2.4353 - classification_loss: 0.7225 278/500 [===============>..............] - ETA: 55s - loss: 3.1551 - regression_loss: 2.4334 - classification_loss: 0.7217 279/500 [===============>..............] - ETA: 55s - loss: 3.1552 - regression_loss: 2.4340 - classification_loss: 0.7212 280/500 [===============>..............] - ETA: 55s - loss: 3.1549 - regression_loss: 2.4336 - classification_loss: 0.7213 281/500 [===============>..............] - ETA: 55s - loss: 3.1541 - regression_loss: 2.4337 - classification_loss: 0.7204 282/500 [===============>..............] - ETA: 54s - loss: 3.1523 - regression_loss: 2.4326 - classification_loss: 0.7197 283/500 [===============>..............] - ETA: 54s - loss: 3.1512 - regression_loss: 2.4322 - classification_loss: 0.7191 284/500 [================>.............] - ETA: 54s - loss: 3.1557 - regression_loss: 2.4329 - classification_loss: 0.7228 285/500 [================>.............] - ETA: 54s - loss: 3.1553 - regression_loss: 2.4330 - classification_loss: 0.7223 286/500 [================>.............] 
- ETA: 53s - loss: 3.1545 - regression_loss: 2.4328 - classification_loss: 0.7217 287/500 [================>.............] - ETA: 53s - loss: 3.1538 - regression_loss: 2.4328 - classification_loss: 0.7210 288/500 [================>.............] - ETA: 53s - loss: 3.1537 - regression_loss: 2.4328 - classification_loss: 0.7210 289/500 [================>.............] - ETA: 53s - loss: 3.1528 - regression_loss: 2.4328 - classification_loss: 0.7200 290/500 [================>.............] - ETA: 52s - loss: 3.1530 - regression_loss: 2.4339 - classification_loss: 0.7191 291/500 [================>.............] - ETA: 52s - loss: 3.1532 - regression_loss: 2.4341 - classification_loss: 0.7191 292/500 [================>.............] - ETA: 52s - loss: 3.1523 - regression_loss: 2.4335 - classification_loss: 0.7188 293/500 [================>.............] - ETA: 52s - loss: 3.1514 - regression_loss: 2.4329 - classification_loss: 0.7184 294/500 [================>.............] - ETA: 51s - loss: 3.1520 - regression_loss: 2.4343 - classification_loss: 0.7177 295/500 [================>.............] - ETA: 51s - loss: 3.1515 - regression_loss: 2.4342 - classification_loss: 0.7173 296/500 [================>.............] - ETA: 51s - loss: 3.1517 - regression_loss: 2.4346 - classification_loss: 0.7171 297/500 [================>.............] - ETA: 51s - loss: 3.1504 - regression_loss: 2.4338 - classification_loss: 0.7166 298/500 [================>.............] - ETA: 50s - loss: 3.1497 - regression_loss: 2.4338 - classification_loss: 0.7160 299/500 [================>.............] - ETA: 50s - loss: 3.1497 - regression_loss: 2.4344 - classification_loss: 0.7154 300/500 [=================>............] - ETA: 50s - loss: 3.1490 - regression_loss: 2.4338 - classification_loss: 0.7152 301/500 [=================>............] - ETA: 50s - loss: 3.1476 - regression_loss: 2.4328 - classification_loss: 0.7148 302/500 [=================>............] 
[... per-batch progress for epoch 4 (batches 303-494) trimmed: loss drifts from ~3.15 down to ~3.13, with regression_loss ~2.43-2.41 and classification_loss ~0.71-0.72 ...]
500/500 [==============================] - 126s 252ms/step - loss: 3.1279 - regression_loss: 2.4091 - classification_loss: 0.7188
1172 instances of class plum with average precision: 0.0793
mAP: 0.0793
Epoch 00004: saving model to ./training/snapshots/resnet50_pascal_04.h5
Epoch 5/150
[... per-batch progress for epoch 5 (batches 10-137) trimmed: loss fluctuates between ~2.93 and ~3.21 as the running averages settle ...]
- ETA: 1:31 - loss: 3.1338 - regression_loss: 2.3528 - classification_loss: 0.7810 138/500 [=======>......................] - ETA: 1:30 - loss: 3.1335 - regression_loss: 2.3534 - classification_loss: 0.7802 139/500 [=======>......................] - ETA: 1:30 - loss: 3.1319 - regression_loss: 2.3538 - classification_loss: 0.7782 140/500 [=======>......................] - ETA: 1:30 - loss: 3.1304 - regression_loss: 2.3537 - classification_loss: 0.7767 141/500 [=======>......................] - ETA: 1:30 - loss: 3.1300 - regression_loss: 2.3537 - classification_loss: 0.7762 142/500 [=======>......................] - ETA: 1:29 - loss: 3.1242 - regression_loss: 2.3508 - classification_loss: 0.7734 143/500 [=======>......................] - ETA: 1:29 - loss: 3.1235 - regression_loss: 2.3515 - classification_loss: 0.7719 144/500 [=======>......................] - ETA: 1:29 - loss: 3.1231 - regression_loss: 2.3519 - classification_loss: 0.7712 145/500 [=======>......................] - ETA: 1:29 - loss: 3.1192 - regression_loss: 2.3499 - classification_loss: 0.7692 146/500 [=======>......................] - ETA: 1:28 - loss: 3.1173 - regression_loss: 2.3494 - classification_loss: 0.7679 147/500 [=======>......................] - ETA: 1:28 - loss: 3.1152 - regression_loss: 2.3487 - classification_loss: 0.7665 148/500 [=======>......................] - ETA: 1:28 - loss: 3.1162 - regression_loss: 2.3506 - classification_loss: 0.7656 149/500 [=======>......................] - ETA: 1:28 - loss: 3.1174 - regression_loss: 2.3526 - classification_loss: 0.7648 150/500 [========>.....................] - ETA: 1:28 - loss: 3.1131 - regression_loss: 2.3505 - classification_loss: 0.7626 151/500 [========>.....................] - ETA: 1:27 - loss: 3.1114 - regression_loss: 2.3499 - classification_loss: 0.7614 152/500 [========>.....................] - ETA: 1:27 - loss: 3.1081 - regression_loss: 2.3482 - classification_loss: 0.7599 153/500 [========>.....................] 
- ETA: 1:27 - loss: 3.1082 - regression_loss: 2.3493 - classification_loss: 0.7590 154/500 [========>.....................] - ETA: 1:27 - loss: 3.1071 - regression_loss: 2.3498 - classification_loss: 0.7573 155/500 [========>.....................] - ETA: 1:26 - loss: 3.1031 - regression_loss: 2.3478 - classification_loss: 0.7553 156/500 [========>.....................] - ETA: 1:26 - loss: 3.1035 - regression_loss: 2.3499 - classification_loss: 0.7537 157/500 [========>.....................] - ETA: 1:26 - loss: 3.1023 - regression_loss: 2.3504 - classification_loss: 0.7519 158/500 [========>.....................] - ETA: 1:26 - loss: 3.1001 - regression_loss: 2.3493 - classification_loss: 0.7507 159/500 [========>.....................] - ETA: 1:25 - loss: 3.0978 - regression_loss: 2.3486 - classification_loss: 0.7493 160/500 [========>.....................] - ETA: 1:25 - loss: 3.0977 - regression_loss: 2.3489 - classification_loss: 0.7487 161/500 [========>.....................] - ETA: 1:25 - loss: 3.1038 - regression_loss: 2.3515 - classification_loss: 0.7523 162/500 [========>.....................] - ETA: 1:25 - loss: 3.1028 - regression_loss: 2.3514 - classification_loss: 0.7514 163/500 [========>.....................] - ETA: 1:24 - loss: 3.0994 - regression_loss: 2.3503 - classification_loss: 0.7491 164/500 [========>.....................] - ETA: 1:24 - loss: 3.1005 - regression_loss: 2.3514 - classification_loss: 0.7491 165/500 [========>.....................] - ETA: 1:24 - loss: 3.0996 - regression_loss: 2.3522 - classification_loss: 0.7475 166/500 [========>.....................] - ETA: 1:24 - loss: 3.0979 - regression_loss: 2.3520 - classification_loss: 0.7459 167/500 [=========>....................] - ETA: 1:23 - loss: 3.0934 - regression_loss: 2.3484 - classification_loss: 0.7450 168/500 [=========>....................] - ETA: 1:23 - loss: 3.0918 - regression_loss: 2.3480 - classification_loss: 0.7438 169/500 [=========>....................] 
- ETA: 1:23 - loss: 3.0900 - regression_loss: 2.3474 - classification_loss: 0.7427 170/500 [=========>....................] - ETA: 1:23 - loss: 3.0882 - regression_loss: 2.3469 - classification_loss: 0.7413 171/500 [=========>....................] - ETA: 1:22 - loss: 3.0869 - regression_loss: 2.3465 - classification_loss: 0.7404 172/500 [=========>....................] - ETA: 1:22 - loss: 3.0837 - regression_loss: 2.3454 - classification_loss: 0.7383 173/500 [=========>....................] - ETA: 1:22 - loss: 3.0841 - regression_loss: 2.3443 - classification_loss: 0.7397 174/500 [=========>....................] - ETA: 1:22 - loss: 3.0824 - regression_loss: 2.3440 - classification_loss: 0.7384 175/500 [=========>....................] - ETA: 1:21 - loss: 3.0807 - regression_loss: 2.3421 - classification_loss: 0.7386 176/500 [=========>....................] - ETA: 1:21 - loss: 3.0837 - regression_loss: 2.3453 - classification_loss: 0.7383 177/500 [=========>....................] - ETA: 1:21 - loss: 3.0820 - regression_loss: 2.3447 - classification_loss: 0.7373 178/500 [=========>....................] - ETA: 1:21 - loss: 3.0804 - regression_loss: 2.3440 - classification_loss: 0.7365 179/500 [=========>....................] - ETA: 1:20 - loss: 3.0852 - regression_loss: 2.3456 - classification_loss: 0.7396 180/500 [=========>....................] - ETA: 1:20 - loss: 3.0832 - regression_loss: 2.3445 - classification_loss: 0.7387 181/500 [=========>....................] - ETA: 1:20 - loss: 3.0804 - regression_loss: 2.3432 - classification_loss: 0.7372 182/500 [=========>....................] - ETA: 1:20 - loss: 3.0789 - regression_loss: 2.3427 - classification_loss: 0.7362 183/500 [=========>....................] - ETA: 1:19 - loss: 3.0771 - regression_loss: 2.3421 - classification_loss: 0.7349 184/500 [==========>...................] - ETA: 1:19 - loss: 3.0734 - regression_loss: 2.3396 - classification_loss: 0.7338 185/500 [==========>...................] 
- ETA: 1:19 - loss: 3.0711 - regression_loss: 2.3384 - classification_loss: 0.7327 186/500 [==========>...................] - ETA: 1:19 - loss: 3.0710 - regression_loss: 2.3382 - classification_loss: 0.7328 187/500 [==========>...................] - ETA: 1:18 - loss: 3.0709 - regression_loss: 2.3389 - classification_loss: 0.7320 188/500 [==========>...................] - ETA: 1:18 - loss: 3.0747 - regression_loss: 2.3431 - classification_loss: 0.7316 189/500 [==========>...................] - ETA: 1:18 - loss: 3.0745 - regression_loss: 2.3432 - classification_loss: 0.7313 190/500 [==========>...................] - ETA: 1:18 - loss: 3.0731 - regression_loss: 2.3429 - classification_loss: 0.7302 191/500 [==========>...................] - ETA: 1:17 - loss: 3.0735 - regression_loss: 2.3430 - classification_loss: 0.7304 192/500 [==========>...................] - ETA: 1:17 - loss: 3.0723 - regression_loss: 2.3429 - classification_loss: 0.7294 193/500 [==========>...................] - ETA: 1:17 - loss: 3.0764 - regression_loss: 2.3461 - classification_loss: 0.7303 194/500 [==========>...................] - ETA: 1:17 - loss: 3.0756 - regression_loss: 2.3461 - classification_loss: 0.7295 195/500 [==========>...................] - ETA: 1:16 - loss: 3.0750 - regression_loss: 2.3466 - classification_loss: 0.7284 196/500 [==========>...................] - ETA: 1:16 - loss: 3.0724 - regression_loss: 2.3447 - classification_loss: 0.7277 197/500 [==========>...................] - ETA: 1:16 - loss: 3.0707 - regression_loss: 2.3441 - classification_loss: 0.7266 198/500 [==========>...................] - ETA: 1:16 - loss: 3.0689 - regression_loss: 2.3429 - classification_loss: 0.7260 199/500 [==========>...................] - ETA: 1:15 - loss: 3.0684 - regression_loss: 2.3427 - classification_loss: 0.7257 200/500 [===========>..................] - ETA: 1:15 - loss: 3.0678 - regression_loss: 2.3428 - classification_loss: 0.7250 201/500 [===========>..................] 
- ETA: 1:15 - loss: 3.0606 - regression_loss: 2.3378 - classification_loss: 0.7229 202/500 [===========>..................] - ETA: 1:15 - loss: 3.0592 - regression_loss: 2.3375 - classification_loss: 0.7217 203/500 [===========>..................] - ETA: 1:14 - loss: 3.0585 - regression_loss: 2.3379 - classification_loss: 0.7206 204/500 [===========>..................] - ETA: 1:14 - loss: 3.0565 - regression_loss: 2.3368 - classification_loss: 0.7197 205/500 [===========>..................] - ETA: 1:14 - loss: 3.0546 - regression_loss: 2.3357 - classification_loss: 0.7189 206/500 [===========>..................] - ETA: 1:14 - loss: 3.0570 - regression_loss: 2.3378 - classification_loss: 0.7192 207/500 [===========>..................] - ETA: 1:13 - loss: 3.0532 - regression_loss: 2.3356 - classification_loss: 0.7176 208/500 [===========>..................] - ETA: 1:13 - loss: 3.0518 - regression_loss: 2.3340 - classification_loss: 0.7178 209/500 [===========>..................] - ETA: 1:13 - loss: 3.0508 - regression_loss: 2.3343 - classification_loss: 0.7165 210/500 [===========>..................] - ETA: 1:13 - loss: 3.0525 - regression_loss: 2.3370 - classification_loss: 0.7155 211/500 [===========>..................] - ETA: 1:12 - loss: 3.0515 - regression_loss: 2.3362 - classification_loss: 0.7153 212/500 [===========>..................] - ETA: 1:12 - loss: 3.0485 - regression_loss: 2.3347 - classification_loss: 0.7139 213/500 [===========>..................] - ETA: 1:12 - loss: 3.0491 - regression_loss: 2.3353 - classification_loss: 0.7137 214/500 [===========>..................] - ETA: 1:12 - loss: 3.0493 - regression_loss: 2.3360 - classification_loss: 0.7133 215/500 [===========>..................] - ETA: 1:11 - loss: 3.0479 - regression_loss: 2.3354 - classification_loss: 0.7125 216/500 [===========>..................] - ETA: 1:11 - loss: 3.0469 - regression_loss: 2.3345 - classification_loss: 0.7123 217/500 [============>.................] 
- ETA: 1:11 - loss: 3.0498 - regression_loss: 2.3379 - classification_loss: 0.7119 218/500 [============>.................] - ETA: 1:11 - loss: 3.0490 - regression_loss: 2.3379 - classification_loss: 0.7111 219/500 [============>.................] - ETA: 1:10 - loss: 3.0478 - regression_loss: 2.3372 - classification_loss: 0.7106 220/500 [============>.................] - ETA: 1:10 - loss: 3.0465 - regression_loss: 2.3366 - classification_loss: 0.7098 221/500 [============>.................] - ETA: 1:10 - loss: 3.0440 - regression_loss: 2.3355 - classification_loss: 0.7085 222/500 [============>.................] - ETA: 1:10 - loss: 3.0430 - regression_loss: 2.3350 - classification_loss: 0.7080 223/500 [============>.................] - ETA: 1:09 - loss: 3.0411 - regression_loss: 2.3341 - classification_loss: 0.7070 224/500 [============>.................] - ETA: 1:09 - loss: 3.0418 - regression_loss: 2.3348 - classification_loss: 0.7071 225/500 [============>.................] - ETA: 1:09 - loss: 3.0417 - regression_loss: 2.3348 - classification_loss: 0.7069 226/500 [============>.................] - ETA: 1:09 - loss: 3.0401 - regression_loss: 2.3340 - classification_loss: 0.7061 227/500 [============>.................] - ETA: 1:08 - loss: 3.0410 - regression_loss: 2.3342 - classification_loss: 0.7068 228/500 [============>.................] - ETA: 1:08 - loss: 3.0389 - regression_loss: 2.3331 - classification_loss: 0.7058 229/500 [============>.................] - ETA: 1:08 - loss: 3.0381 - regression_loss: 2.3328 - classification_loss: 0.7053 230/500 [============>.................] - ETA: 1:08 - loss: 3.0379 - regression_loss: 2.3335 - classification_loss: 0.7045 231/500 [============>.................] - ETA: 1:07 - loss: 3.0379 - regression_loss: 2.3338 - classification_loss: 0.7041 232/500 [============>.................] - ETA: 1:07 - loss: 3.0369 - regression_loss: 2.3337 - classification_loss: 0.7032 233/500 [============>.................] 
- ETA: 1:07 - loss: 3.0367 - regression_loss: 2.3343 - classification_loss: 0.7025 234/500 [=============>................] - ETA: 1:07 - loss: 3.0361 - regression_loss: 2.3327 - classification_loss: 0.7034 235/500 [=============>................] - ETA: 1:06 - loss: 3.0362 - regression_loss: 2.3335 - classification_loss: 0.7027 236/500 [=============>................] - ETA: 1:06 - loss: 3.0358 - regression_loss: 2.3337 - classification_loss: 0.7021 237/500 [=============>................] - ETA: 1:06 - loss: 3.0343 - regression_loss: 2.3328 - classification_loss: 0.7015 238/500 [=============>................] - ETA: 1:06 - loss: 3.0337 - regression_loss: 2.3330 - classification_loss: 0.7007 239/500 [=============>................] - ETA: 1:05 - loss: 3.0344 - regression_loss: 2.3339 - classification_loss: 0.7005 240/500 [=============>................] - ETA: 1:05 - loss: 3.0339 - regression_loss: 2.3337 - classification_loss: 0.7002 241/500 [=============>................] - ETA: 1:05 - loss: 3.0336 - regression_loss: 2.3341 - classification_loss: 0.6995 242/500 [=============>................] - ETA: 1:05 - loss: 3.0325 - regression_loss: 2.3342 - classification_loss: 0.6983 243/500 [=============>................] - ETA: 1:04 - loss: 3.0324 - regression_loss: 2.3350 - classification_loss: 0.6974 244/500 [=============>................] - ETA: 1:04 - loss: 3.0338 - regression_loss: 2.3362 - classification_loss: 0.6976 245/500 [=============>................] - ETA: 1:04 - loss: 3.0326 - regression_loss: 2.3354 - classification_loss: 0.6971 246/500 [=============>................] - ETA: 1:04 - loss: 3.0338 - regression_loss: 2.3363 - classification_loss: 0.6976 247/500 [=============>................] - ETA: 1:03 - loss: 3.0333 - regression_loss: 2.3361 - classification_loss: 0.6972 248/500 [=============>................] - ETA: 1:03 - loss: 3.0333 - regression_loss: 2.3370 - classification_loss: 0.6963 249/500 [=============>................] 
- ETA: 1:03 - loss: 3.0326 - regression_loss: 2.3367 - classification_loss: 0.6959 250/500 [==============>...............] - ETA: 1:03 - loss: 3.0320 - regression_loss: 2.3363 - classification_loss: 0.6957 251/500 [==============>...............] - ETA: 1:02 - loss: 3.0316 - regression_loss: 2.3364 - classification_loss: 0.6952 252/500 [==============>...............] - ETA: 1:02 - loss: 3.0309 - regression_loss: 2.3361 - classification_loss: 0.6949 253/500 [==============>...............] - ETA: 1:02 - loss: 3.0384 - regression_loss: 2.3343 - classification_loss: 0.7041 254/500 [==============>...............] - ETA: 1:02 - loss: 3.0384 - regression_loss: 2.3344 - classification_loss: 0.7040 255/500 [==============>...............] - ETA: 1:01 - loss: 3.0384 - regression_loss: 2.3348 - classification_loss: 0.7036 256/500 [==============>...............] - ETA: 1:01 - loss: 3.0381 - regression_loss: 2.3351 - classification_loss: 0.7030 257/500 [==============>...............] - ETA: 1:01 - loss: 3.0377 - regression_loss: 2.3351 - classification_loss: 0.7026 258/500 [==============>...............] - ETA: 1:00 - loss: 3.0360 - regression_loss: 2.3342 - classification_loss: 0.7017 259/500 [==============>...............] - ETA: 1:00 - loss: 3.0339 - regression_loss: 2.3331 - classification_loss: 0.7009 260/500 [==============>...............] - ETA: 1:00 - loss: 3.0322 - regression_loss: 2.3321 - classification_loss: 0.7001 261/500 [==============>...............] - ETA: 1:00 - loss: 3.0307 - regression_loss: 2.3314 - classification_loss: 0.6993 262/500 [==============>...............] - ETA: 59s - loss: 3.0294 - regression_loss: 2.3308 - classification_loss: 0.6986  263/500 [==============>...............] - ETA: 59s - loss: 3.0317 - regression_loss: 2.3330 - classification_loss: 0.6987 264/500 [==============>...............] - ETA: 59s - loss: 3.0312 - regression_loss: 2.3330 - classification_loss: 0.6982 265/500 [==============>...............] 
- ETA: 59s - loss: 3.0306 - regression_loss: 2.3328 - classification_loss: 0.6978 266/500 [==============>...............] - ETA: 58s - loss: 3.0285 - regression_loss: 2.3317 - classification_loss: 0.6968 267/500 [===============>..............] - ETA: 58s - loss: 3.0271 - regression_loss: 2.3308 - classification_loss: 0.6963 268/500 [===============>..............] - ETA: 58s - loss: 3.0255 - regression_loss: 2.3298 - classification_loss: 0.6957 269/500 [===============>..............] - ETA: 58s - loss: 3.0261 - regression_loss: 2.3305 - classification_loss: 0.6956 270/500 [===============>..............] - ETA: 57s - loss: 3.0254 - regression_loss: 2.3301 - classification_loss: 0.6953 271/500 [===============>..............] - ETA: 57s - loss: 3.0256 - regression_loss: 2.3307 - classification_loss: 0.6948 272/500 [===============>..............] - ETA: 57s - loss: 3.0249 - regression_loss: 2.3306 - classification_loss: 0.6943 273/500 [===============>..............] - ETA: 57s - loss: 3.0267 - regression_loss: 2.3324 - classification_loss: 0.6944 274/500 [===============>..............] - ETA: 56s - loss: 3.0260 - regression_loss: 2.3325 - classification_loss: 0.6934 275/500 [===============>..............] - ETA: 56s - loss: 3.0247 - regression_loss: 2.3310 - classification_loss: 0.6936 276/500 [===============>..............] - ETA: 56s - loss: 3.0251 - regression_loss: 2.3306 - classification_loss: 0.6945 277/500 [===============>..............] - ETA: 56s - loss: 3.0289 - regression_loss: 2.3323 - classification_loss: 0.6966 278/500 [===============>..............] - ETA: 55s - loss: 3.0269 - regression_loss: 2.3312 - classification_loss: 0.6957 279/500 [===============>..............] - ETA: 55s - loss: 3.0252 - regression_loss: 2.3298 - classification_loss: 0.6953 280/500 [===============>..............] - ETA: 55s - loss: 3.0243 - regression_loss: 2.3294 - classification_loss: 0.6950 281/500 [===============>..............] 
- ETA: 55s - loss: 3.0239 - regression_loss: 2.3291 - classification_loss: 0.6948 282/500 [===============>..............] - ETA: 54s - loss: 3.0232 - regression_loss: 2.3290 - classification_loss: 0.6942 283/500 [===============>..............] - ETA: 54s - loss: 3.0227 - regression_loss: 2.3293 - classification_loss: 0.6934 284/500 [================>.............] - ETA: 54s - loss: 3.0246 - regression_loss: 2.3311 - classification_loss: 0.6935 285/500 [================>.............] - ETA: 54s - loss: 3.0217 - regression_loss: 2.3294 - classification_loss: 0.6924 286/500 [================>.............] - ETA: 53s - loss: 3.0223 - regression_loss: 2.3299 - classification_loss: 0.6924 287/500 [================>.............] - ETA: 53s - loss: 3.0220 - regression_loss: 2.3299 - classification_loss: 0.6921 288/500 [================>.............] - ETA: 53s - loss: 3.0218 - regression_loss: 2.3298 - classification_loss: 0.6920 289/500 [================>.............] - ETA: 53s - loss: 3.0229 - regression_loss: 2.3309 - classification_loss: 0.6920 290/500 [================>.............] - ETA: 52s - loss: 3.0228 - regression_loss: 2.3310 - classification_loss: 0.6918 291/500 [================>.............] - ETA: 52s - loss: 3.0236 - regression_loss: 2.3316 - classification_loss: 0.6920 292/500 [================>.............] - ETA: 52s - loss: 3.0234 - regression_loss: 2.3315 - classification_loss: 0.6919 293/500 [================>.............] - ETA: 52s - loss: 3.0315 - regression_loss: 2.3367 - classification_loss: 0.6947 294/500 [================>.............] - ETA: 51s - loss: 3.0308 - regression_loss: 2.3368 - classification_loss: 0.6940 295/500 [================>.............] - ETA: 51s - loss: 3.0312 - regression_loss: 2.3373 - classification_loss: 0.6938 296/500 [================>.............] - ETA: 51s - loss: 3.0279 - regression_loss: 2.3342 - classification_loss: 0.6938 297/500 [================>.............] 
- ETA: 51s - loss: 3.0268 - regression_loss: 2.3338 - classification_loss: 0.6930 298/500 [================>.............] - ETA: 50s - loss: 3.0269 - regression_loss: 2.3339 - classification_loss: 0.6930 299/500 [================>.............] - ETA: 50s - loss: 3.0252 - regression_loss: 2.3326 - classification_loss: 0.6926 300/500 [=================>............] - ETA: 50s - loss: 3.0240 - regression_loss: 2.3323 - classification_loss: 0.6918 301/500 [=================>............] - ETA: 50s - loss: 3.0242 - regression_loss: 2.3328 - classification_loss: 0.6914 302/500 [=================>............] - ETA: 49s - loss: 3.0232 - regression_loss: 2.3324 - classification_loss: 0.6908 303/500 [=================>............] - ETA: 49s - loss: 3.0218 - regression_loss: 2.3315 - classification_loss: 0.6903 304/500 [=================>............] - ETA: 49s - loss: 3.0200 - regression_loss: 2.3298 - classification_loss: 0.6902 305/500 [=================>............] - ETA: 49s - loss: 3.0192 - regression_loss: 2.3295 - classification_loss: 0.6897 306/500 [=================>............] - ETA: 48s - loss: 3.0200 - regression_loss: 2.3306 - classification_loss: 0.6894 307/500 [=================>............] - ETA: 48s - loss: 3.0209 - regression_loss: 2.3320 - classification_loss: 0.6890 308/500 [=================>............] - ETA: 48s - loss: 3.0203 - regression_loss: 2.3319 - classification_loss: 0.6885 309/500 [=================>............] - ETA: 48s - loss: 3.0192 - regression_loss: 2.3312 - classification_loss: 0.6879 310/500 [=================>............] - ETA: 47s - loss: 3.0187 - regression_loss: 2.3311 - classification_loss: 0.6876 311/500 [=================>............] - ETA: 47s - loss: 3.0175 - regression_loss: 2.3305 - classification_loss: 0.6870 312/500 [=================>............] - ETA: 47s - loss: 3.0182 - regression_loss: 2.3309 - classification_loss: 0.6873 313/500 [=================>............] 
- ETA: 47s - loss: 3.0171 - regression_loss: 2.3304 - classification_loss: 0.6867 314/500 [=================>............] - ETA: 46s - loss: 3.0159 - regression_loss: 2.3298 - classification_loss: 0.6861 315/500 [=================>............] - ETA: 46s - loss: 3.0160 - regression_loss: 2.3303 - classification_loss: 0.6858 316/500 [=================>............] - ETA: 46s - loss: 3.0153 - regression_loss: 2.3301 - classification_loss: 0.6852 317/500 [==================>...........] - ETA: 46s - loss: 3.0154 - regression_loss: 2.3306 - classification_loss: 0.6848 318/500 [==================>...........] - ETA: 45s - loss: 3.0164 - regression_loss: 2.3316 - classification_loss: 0.6848 319/500 [==================>...........] - ETA: 45s - loss: 3.0171 - regression_loss: 2.3317 - classification_loss: 0.6854 320/500 [==================>...........] - ETA: 45s - loss: 3.0162 - regression_loss: 2.3314 - classification_loss: 0.6848 321/500 [==================>...........] - ETA: 45s - loss: 3.0178 - regression_loss: 2.3324 - classification_loss: 0.6854 322/500 [==================>...........] - ETA: 44s - loss: 3.0169 - regression_loss: 2.3322 - classification_loss: 0.6847 323/500 [==================>...........] - ETA: 44s - loss: 3.0151 - regression_loss: 2.3306 - classification_loss: 0.6845 324/500 [==================>...........] - ETA: 44s - loss: 3.0158 - regression_loss: 2.3314 - classification_loss: 0.6845 325/500 [==================>...........] - ETA: 44s - loss: 3.0139 - regression_loss: 2.3302 - classification_loss: 0.6837 326/500 [==================>...........] - ETA: 43s - loss: 3.0132 - regression_loss: 2.3300 - classification_loss: 0.6832 327/500 [==================>...........] - ETA: 43s - loss: 3.0125 - regression_loss: 2.3297 - classification_loss: 0.6828 328/500 [==================>...........] - ETA: 43s - loss: 3.0129 - regression_loss: 2.3302 - classification_loss: 0.6826 329/500 [==================>...........] 
- ETA: 43s - loss: 3.0123 - regression_loss: 2.3300 - classification_loss: 0.6822 330/500 [==================>...........] - ETA: 42s - loss: 3.0114 - regression_loss: 2.3296 - classification_loss: 0.6817 331/500 [==================>...........] - ETA: 42s - loss: 3.0100 - regression_loss: 2.3288 - classification_loss: 0.6812 332/500 [==================>...........] - ETA: 42s - loss: 3.0095 - regression_loss: 2.3289 - classification_loss: 0.6806 333/500 [==================>...........] - ETA: 42s - loss: 3.0095 - regression_loss: 2.3291 - classification_loss: 0.6804 334/500 [===================>..........] - ETA: 41s - loss: 3.0084 - regression_loss: 2.3284 - classification_loss: 0.6800 335/500 [===================>..........] - ETA: 41s - loss: 3.0098 - regression_loss: 2.3298 - classification_loss: 0.6800 336/500 [===================>..........] - ETA: 41s - loss: 3.0085 - regression_loss: 2.3290 - classification_loss: 0.6795 337/500 [===================>..........] - ETA: 41s - loss: 3.0081 - regression_loss: 2.3289 - classification_loss: 0.6793 338/500 [===================>..........] - ETA: 40s - loss: 3.0049 - regression_loss: 2.3267 - classification_loss: 0.6781 339/500 [===================>..........] - ETA: 40s - loss: 3.0052 - regression_loss: 2.3273 - classification_loss: 0.6779 340/500 [===================>..........] - ETA: 40s - loss: 3.0042 - regression_loss: 2.3266 - classification_loss: 0.6776 341/500 [===================>..........] - ETA: 40s - loss: 3.0030 - regression_loss: 2.3259 - classification_loss: 0.6771 342/500 [===================>..........] - ETA: 39s - loss: 3.0019 - regression_loss: 2.3252 - classification_loss: 0.6767 343/500 [===================>..........] - ETA: 39s - loss: 3.0003 - regression_loss: 2.3242 - classification_loss: 0.6761 344/500 [===================>..........] - ETA: 39s - loss: 2.9997 - regression_loss: 2.3241 - classification_loss: 0.6756 345/500 [===================>..........] 
346/500 [===================>..........] - ETA: 38s - loss: 2.9991 - regression_loss: 2.3238 - classification_loss: 0.6753
...
400/500 [=======================>......] - ETA: 25s - loss: 2.9977 - regression_loss: 2.3215 - classification_loss: 0.6762
...
450/500 [==========================>...] - ETA: 12s - loss: 2.9807 - regression_loss: 2.3183 - classification_loss: 0.6624
...
500/500 [==============================] - 126s 252ms/step - loss: 2.9674 - regression_loss: 2.3152 - classification_loss: 0.6522
1172 instances of class plum with average precision: 0.1424
mAP: 0.1424
Epoch 00005: saving model to ./training/snapshots/resnet50_pascal_05.h5
Epoch 6/150
  1/500 [..............................] - ETA: 1:57 - loss: 2.7253 - regression_loss: 2.2452 - classification_loss: 0.4801
...
 50/500 [==>...........................] - ETA: 1:52 - loss: 2.8976 - regression_loss: 2.2807 - classification_loss: 0.6169
...
100/500 [=====>........................] - ETA: 1:40 - loss: 2.9073 - regression_loss: 2.2887 - classification_loss: 0.6186
...
150/500 [========>.....................] - ETA: 1:28 - loss: 2.8847 - regression_loss: 2.2785 - classification_loss: 0.6062
...
180/500 [=========>....................]
- ETA: 1:20 - loss: 2.8576 - regression_loss: 2.2577 - classification_loss: 0.5999 181/500 [=========>....................] - ETA: 1:20 - loss: 2.8553 - regression_loss: 2.2563 - classification_loss: 0.5989 182/500 [=========>....................] - ETA: 1:20 - loss: 2.8555 - regression_loss: 2.2570 - classification_loss: 0.5985 183/500 [=========>....................] - ETA: 1:19 - loss: 2.8569 - regression_loss: 2.2578 - classification_loss: 0.5991 184/500 [==========>...................] - ETA: 1:19 - loss: 2.8577 - regression_loss: 2.2586 - classification_loss: 0.5991 185/500 [==========>...................] - ETA: 1:19 - loss: 2.8559 - regression_loss: 2.2574 - classification_loss: 0.5985 186/500 [==========>...................] - ETA: 1:19 - loss: 2.8562 - regression_loss: 2.2575 - classification_loss: 0.5987 187/500 [==========>...................] - ETA: 1:18 - loss: 2.8590 - regression_loss: 2.2586 - classification_loss: 0.6005 188/500 [==========>...................] - ETA: 1:18 - loss: 2.8583 - regression_loss: 2.2586 - classification_loss: 0.5997 189/500 [==========>...................] - ETA: 1:18 - loss: 2.8573 - regression_loss: 2.2580 - classification_loss: 0.5993 190/500 [==========>...................] - ETA: 1:18 - loss: 2.8583 - regression_loss: 2.2597 - classification_loss: 0.5986 191/500 [==========>...................] - ETA: 1:17 - loss: 2.8574 - regression_loss: 2.2594 - classification_loss: 0.5980 192/500 [==========>...................] - ETA: 1:17 - loss: 2.8573 - regression_loss: 2.2595 - classification_loss: 0.5978 193/500 [==========>...................] - ETA: 1:17 - loss: 2.8568 - regression_loss: 2.2592 - classification_loss: 0.5976 194/500 [==========>...................] - ETA: 1:17 - loss: 2.8552 - regression_loss: 2.2582 - classification_loss: 0.5970 195/500 [==========>...................] - ETA: 1:16 - loss: 2.8541 - regression_loss: 2.2571 - classification_loss: 0.5970 196/500 [==========>...................] 
- ETA: 1:16 - loss: 2.8536 - regression_loss: 2.2572 - classification_loss: 0.5964 197/500 [==========>...................] - ETA: 1:16 - loss: 2.8541 - regression_loss: 2.2574 - classification_loss: 0.5967 198/500 [==========>...................] - ETA: 1:16 - loss: 2.8505 - regression_loss: 2.2557 - classification_loss: 0.5948 199/500 [==========>...................] - ETA: 1:15 - loss: 2.8490 - regression_loss: 2.2539 - classification_loss: 0.5951 200/500 [===========>..................] - ETA: 1:15 - loss: 2.8508 - regression_loss: 2.2550 - classification_loss: 0.5958 201/500 [===========>..................] - ETA: 1:15 - loss: 2.8499 - regression_loss: 2.2541 - classification_loss: 0.5958 202/500 [===========>..................] - ETA: 1:15 - loss: 2.8527 - regression_loss: 2.2567 - classification_loss: 0.5960 203/500 [===========>..................] - ETA: 1:14 - loss: 2.8513 - regression_loss: 2.2560 - classification_loss: 0.5953 204/500 [===========>..................] - ETA: 1:14 - loss: 2.8514 - regression_loss: 2.2561 - classification_loss: 0.5953 205/500 [===========>..................] - ETA: 1:14 - loss: 2.8511 - regression_loss: 2.2564 - classification_loss: 0.5947 206/500 [===========>..................] - ETA: 1:14 - loss: 2.8487 - regression_loss: 2.2546 - classification_loss: 0.5941 207/500 [===========>..................] - ETA: 1:13 - loss: 2.8486 - regression_loss: 2.2543 - classification_loss: 0.5943 208/500 [===========>..................] - ETA: 1:13 - loss: 2.8614 - regression_loss: 2.2594 - classification_loss: 0.6020 209/500 [===========>..................] - ETA: 1:13 - loss: 2.8611 - regression_loss: 2.2594 - classification_loss: 0.6017 210/500 [===========>..................] - ETA: 1:13 - loss: 2.8608 - regression_loss: 2.2596 - classification_loss: 0.6012 211/500 [===========>..................] - ETA: 1:12 - loss: 2.8589 - regression_loss: 2.2580 - classification_loss: 0.6009 212/500 [===========>..................] 
- ETA: 1:12 - loss: 2.8652 - regression_loss: 2.2572 - classification_loss: 0.6079 213/500 [===========>..................] - ETA: 1:12 - loss: 2.8730 - regression_loss: 2.2582 - classification_loss: 0.6149 214/500 [===========>..................] - ETA: 1:12 - loss: 2.8732 - regression_loss: 2.2585 - classification_loss: 0.6147 215/500 [===========>..................] - ETA: 1:11 - loss: 2.8714 - regression_loss: 2.2576 - classification_loss: 0.6138 216/500 [===========>..................] - ETA: 1:11 - loss: 2.8721 - regression_loss: 2.2581 - classification_loss: 0.6140 217/500 [============>.................] - ETA: 1:11 - loss: 2.8715 - regression_loss: 2.2579 - classification_loss: 0.6136 218/500 [============>.................] - ETA: 1:11 - loss: 2.8711 - regression_loss: 2.2575 - classification_loss: 0.6136 219/500 [============>.................] - ETA: 1:10 - loss: 2.8707 - regression_loss: 2.2574 - classification_loss: 0.6133 220/500 [============>.................] - ETA: 1:10 - loss: 2.8721 - regression_loss: 2.2585 - classification_loss: 0.6136 221/500 [============>.................] - ETA: 1:10 - loss: 2.8727 - regression_loss: 2.2593 - classification_loss: 0.6133 222/500 [============>.................] - ETA: 1:10 - loss: 2.8716 - regression_loss: 2.2587 - classification_loss: 0.6130 223/500 [============>.................] - ETA: 1:09 - loss: 2.8710 - regression_loss: 2.2587 - classification_loss: 0.6123 224/500 [============>.................] - ETA: 1:09 - loss: 2.8703 - regression_loss: 2.2585 - classification_loss: 0.6118 225/500 [============>.................] - ETA: 1:09 - loss: 2.8687 - regression_loss: 2.2577 - classification_loss: 0.6110 226/500 [============>.................] - ETA: 1:09 - loss: 2.8685 - regression_loss: 2.2579 - classification_loss: 0.6105 227/500 [============>.................] - ETA: 1:08 - loss: 2.8661 - regression_loss: 2.2550 - classification_loss: 0.6111 228/500 [============>.................] 
- ETA: 1:08 - loss: 2.8662 - regression_loss: 2.2555 - classification_loss: 0.6107 229/500 [============>.................] - ETA: 1:08 - loss: 2.8665 - regression_loss: 2.2558 - classification_loss: 0.6106 230/500 [============>.................] - ETA: 1:08 - loss: 2.8658 - regression_loss: 2.2557 - classification_loss: 0.6101 231/500 [============>.................] - ETA: 1:07 - loss: 2.8665 - regression_loss: 2.2564 - classification_loss: 0.6102 232/500 [============>.................] - ETA: 1:07 - loss: 2.8712 - regression_loss: 2.2601 - classification_loss: 0.6111 233/500 [============>.................] - ETA: 1:07 - loss: 2.8696 - regression_loss: 2.2592 - classification_loss: 0.6104 234/500 [=============>................] - ETA: 1:07 - loss: 2.8688 - regression_loss: 2.2587 - classification_loss: 0.6101 235/500 [=============>................] - ETA: 1:06 - loss: 2.8705 - regression_loss: 2.2605 - classification_loss: 0.6099 236/500 [=============>................] - ETA: 1:06 - loss: 2.8716 - regression_loss: 2.2618 - classification_loss: 0.6098 237/500 [=============>................] - ETA: 1:06 - loss: 2.8721 - regression_loss: 2.2625 - classification_loss: 0.6096 238/500 [=============>................] - ETA: 1:06 - loss: 2.8714 - regression_loss: 2.2624 - classification_loss: 0.6090 239/500 [=============>................] - ETA: 1:05 - loss: 2.8722 - regression_loss: 2.2630 - classification_loss: 0.6092 240/500 [=============>................] - ETA: 1:05 - loss: 2.8724 - regression_loss: 2.2635 - classification_loss: 0.6089 241/500 [=============>................] - ETA: 1:05 - loss: 2.8711 - regression_loss: 2.2628 - classification_loss: 0.6083 242/500 [=============>................] - ETA: 1:05 - loss: 2.8695 - regression_loss: 2.2619 - classification_loss: 0.6076 243/500 [=============>................] - ETA: 1:04 - loss: 2.8706 - regression_loss: 2.2625 - classification_loss: 0.6081 244/500 [=============>................] 
- ETA: 1:04 - loss: 2.8714 - regression_loss: 2.2631 - classification_loss: 0.6083 245/500 [=============>................] - ETA: 1:04 - loss: 2.8767 - regression_loss: 2.2644 - classification_loss: 0.6123 246/500 [=============>................] - ETA: 1:04 - loss: 2.8751 - regression_loss: 2.2634 - classification_loss: 0.6117 247/500 [=============>................] - ETA: 1:03 - loss: 2.8742 - regression_loss: 2.2626 - classification_loss: 0.6117 248/500 [=============>................] - ETA: 1:03 - loss: 2.8768 - regression_loss: 2.2637 - classification_loss: 0.6131 249/500 [=============>................] - ETA: 1:03 - loss: 2.8757 - regression_loss: 2.2632 - classification_loss: 0.6126 250/500 [==============>...............] - ETA: 1:03 - loss: 2.8750 - regression_loss: 2.2631 - classification_loss: 0.6119 251/500 [==============>...............] - ETA: 1:02 - loss: 2.8737 - regression_loss: 2.2624 - classification_loss: 0.6113 252/500 [==============>...............] - ETA: 1:02 - loss: 2.8728 - regression_loss: 2.2619 - classification_loss: 0.6109 253/500 [==============>...............] - ETA: 1:02 - loss: 2.8740 - regression_loss: 2.2613 - classification_loss: 0.6127 254/500 [==============>...............] - ETA: 1:02 - loss: 2.8721 - regression_loss: 2.2594 - classification_loss: 0.6128 255/500 [==============>...............] - ETA: 1:01 - loss: 2.8684 - regression_loss: 2.2564 - classification_loss: 0.6119 256/500 [==============>...............] - ETA: 1:01 - loss: 2.8664 - regression_loss: 2.2551 - classification_loss: 0.6113 257/500 [==============>...............] - ETA: 1:01 - loss: 2.8658 - regression_loss: 2.2547 - classification_loss: 0.6111 258/500 [==============>...............] - ETA: 1:01 - loss: 2.8650 - regression_loss: 2.2543 - classification_loss: 0.6107 259/500 [==============>...............] - ETA: 1:00 - loss: 2.8649 - regression_loss: 2.2542 - classification_loss: 0.6107 260/500 [==============>...............] 
- ETA: 1:00 - loss: 2.8671 - regression_loss: 2.2559 - classification_loss: 0.6112 261/500 [==============>...............] - ETA: 1:00 - loss: 2.8674 - regression_loss: 2.2563 - classification_loss: 0.6111 262/500 [==============>...............] - ETA: 1:00 - loss: 2.8691 - regression_loss: 2.2569 - classification_loss: 0.6123 263/500 [==============>...............] - ETA: 59s - loss: 2.8695 - regression_loss: 2.2573 - classification_loss: 0.6122  264/500 [==============>...............] - ETA: 59s - loss: 2.8703 - regression_loss: 2.2576 - classification_loss: 0.6127 265/500 [==============>...............] - ETA: 59s - loss: 2.8696 - regression_loss: 2.2574 - classification_loss: 0.6122 266/500 [==============>...............] - ETA: 58s - loss: 2.8686 - regression_loss: 2.2566 - classification_loss: 0.6120 267/500 [===============>..............] - ETA: 58s - loss: 2.8700 - regression_loss: 2.2588 - classification_loss: 0.6112 268/500 [===============>..............] - ETA: 58s - loss: 2.8691 - regression_loss: 2.2582 - classification_loss: 0.6109 269/500 [===============>..............] - ETA: 58s - loss: 2.8686 - regression_loss: 2.2579 - classification_loss: 0.6107 270/500 [===============>..............] - ETA: 57s - loss: 2.8681 - regression_loss: 2.2579 - classification_loss: 0.6102 271/500 [===============>..............] - ETA: 57s - loss: 2.8681 - regression_loss: 2.2585 - classification_loss: 0.6095 272/500 [===============>..............] - ETA: 57s - loss: 2.8665 - regression_loss: 2.2577 - classification_loss: 0.6088 273/500 [===============>..............] - ETA: 57s - loss: 2.8649 - regression_loss: 2.2567 - classification_loss: 0.6082 274/500 [===============>..............] - ETA: 56s - loss: 2.8648 - regression_loss: 2.2565 - classification_loss: 0.6083 275/500 [===============>..............] - ETA: 56s - loss: 2.8656 - regression_loss: 2.2573 - classification_loss: 0.6083 276/500 [===============>..............] 
- ETA: 56s - loss: 2.8653 - regression_loss: 2.2568 - classification_loss: 0.6085 277/500 [===============>..............] - ETA: 56s - loss: 2.8678 - regression_loss: 2.2588 - classification_loss: 0.6090 278/500 [===============>..............] - ETA: 55s - loss: 2.8677 - regression_loss: 2.2590 - classification_loss: 0.6088 279/500 [===============>..............] - ETA: 55s - loss: 2.8679 - regression_loss: 2.2591 - classification_loss: 0.6088 280/500 [===============>..............] - ETA: 55s - loss: 2.8674 - regression_loss: 2.2588 - classification_loss: 0.6086 281/500 [===============>..............] - ETA: 55s - loss: 2.8659 - regression_loss: 2.2577 - classification_loss: 0.6082 282/500 [===============>..............] - ETA: 54s - loss: 2.8652 - regression_loss: 2.2574 - classification_loss: 0.6078 283/500 [===============>..............] - ETA: 54s - loss: 2.8646 - regression_loss: 2.2573 - classification_loss: 0.6073 284/500 [================>.............] - ETA: 54s - loss: 2.8641 - regression_loss: 2.2570 - classification_loss: 0.6071 285/500 [================>.............] - ETA: 54s - loss: 2.8635 - regression_loss: 2.2569 - classification_loss: 0.6066 286/500 [================>.............] - ETA: 53s - loss: 2.8633 - regression_loss: 2.2569 - classification_loss: 0.6063 287/500 [================>.............] - ETA: 53s - loss: 2.8610 - regression_loss: 2.2552 - classification_loss: 0.6058 288/500 [================>.............] - ETA: 53s - loss: 2.8608 - regression_loss: 2.2554 - classification_loss: 0.6054 289/500 [================>.............] - ETA: 53s - loss: 2.8595 - regression_loss: 2.2544 - classification_loss: 0.6051 290/500 [================>.............] - ETA: 52s - loss: 2.8596 - regression_loss: 2.2543 - classification_loss: 0.6053 291/500 [================>.............] - ETA: 52s - loss: 2.8598 - regression_loss: 2.2545 - classification_loss: 0.6053 292/500 [================>.............] 
- ETA: 52s - loss: 2.8582 - regression_loss: 2.2532 - classification_loss: 0.6050 293/500 [================>.............] - ETA: 52s - loss: 2.8570 - regression_loss: 2.2527 - classification_loss: 0.6043 294/500 [================>.............] - ETA: 51s - loss: 2.8566 - regression_loss: 2.2525 - classification_loss: 0.6042 295/500 [================>.............] - ETA: 51s - loss: 2.8566 - regression_loss: 2.2525 - classification_loss: 0.6041 296/500 [================>.............] - ETA: 51s - loss: 2.8559 - regression_loss: 2.2522 - classification_loss: 0.6037 297/500 [================>.............] - ETA: 51s - loss: 2.8554 - regression_loss: 2.2520 - classification_loss: 0.6033 298/500 [================>.............] - ETA: 50s - loss: 2.8516 - regression_loss: 2.2491 - classification_loss: 0.6025 299/500 [================>.............] - ETA: 50s - loss: 2.8525 - regression_loss: 2.2502 - classification_loss: 0.6023 300/500 [=================>............] - ETA: 50s - loss: 2.8521 - regression_loss: 2.2500 - classification_loss: 0.6021 301/500 [=================>............] - ETA: 50s - loss: 2.8516 - regression_loss: 2.2500 - classification_loss: 0.6016 302/500 [=================>............] - ETA: 49s - loss: 2.8530 - regression_loss: 2.2510 - classification_loss: 0.6020 303/500 [=================>............] - ETA: 49s - loss: 2.8524 - regression_loss: 2.2509 - classification_loss: 0.6015 304/500 [=================>............] - ETA: 49s - loss: 2.8524 - regression_loss: 2.2512 - classification_loss: 0.6012 305/500 [=================>............] - ETA: 49s - loss: 2.8518 - regression_loss: 2.2507 - classification_loss: 0.6011 306/500 [=================>............] - ETA: 48s - loss: 2.8507 - regression_loss: 2.2501 - classification_loss: 0.6007 307/500 [=================>............] - ETA: 48s - loss: 2.8502 - regression_loss: 2.2488 - classification_loss: 0.6013 308/500 [=================>............] 
- ETA: 48s - loss: 2.8498 - regression_loss: 2.2485 - classification_loss: 0.6013 309/500 [=================>............] - ETA: 48s - loss: 2.8498 - regression_loss: 2.2484 - classification_loss: 0.6015 310/500 [=================>............] - ETA: 47s - loss: 2.8493 - regression_loss: 2.2483 - classification_loss: 0.6010 311/500 [=================>............] - ETA: 47s - loss: 2.8487 - regression_loss: 2.2482 - classification_loss: 0.6005 312/500 [=================>............] - ETA: 47s - loss: 2.8492 - regression_loss: 2.2489 - classification_loss: 0.6003 313/500 [=================>............] - ETA: 47s - loss: 2.8508 - regression_loss: 2.2496 - classification_loss: 0.6012 314/500 [=================>............] - ETA: 46s - loss: 2.8514 - regression_loss: 2.2503 - classification_loss: 0.6011 315/500 [=================>............] - ETA: 46s - loss: 2.8535 - regression_loss: 2.2515 - classification_loss: 0.6020 316/500 [=================>............] - ETA: 46s - loss: 2.8531 - regression_loss: 2.2511 - classification_loss: 0.6019 317/500 [==================>...........] - ETA: 46s - loss: 2.8539 - regression_loss: 2.2515 - classification_loss: 0.6024 318/500 [==================>...........] - ETA: 45s - loss: 2.8557 - regression_loss: 2.2533 - classification_loss: 0.6024 319/500 [==================>...........] - ETA: 45s - loss: 2.8539 - regression_loss: 2.2521 - classification_loss: 0.6018 320/500 [==================>...........] - ETA: 45s - loss: 2.8534 - regression_loss: 2.2520 - classification_loss: 0.6014 321/500 [==================>...........] - ETA: 45s - loss: 2.8503 - regression_loss: 2.2498 - classification_loss: 0.6005 322/500 [==================>...........] - ETA: 44s - loss: 2.8499 - regression_loss: 2.2501 - classification_loss: 0.5997 323/500 [==================>...........] - ETA: 44s - loss: 2.8492 - regression_loss: 2.2493 - classification_loss: 0.5999 324/500 [==================>...........] 
- ETA: 44s - loss: 2.8516 - regression_loss: 2.2512 - classification_loss: 0.6003 325/500 [==================>...........] - ETA: 44s - loss: 2.8511 - regression_loss: 2.2510 - classification_loss: 0.6001 326/500 [==================>...........] - ETA: 43s - loss: 2.8509 - regression_loss: 2.2511 - classification_loss: 0.5999 327/500 [==================>...........] - ETA: 43s - loss: 2.8504 - regression_loss: 2.2508 - classification_loss: 0.5996 328/500 [==================>...........] - ETA: 43s - loss: 2.8506 - regression_loss: 2.2507 - classification_loss: 0.5999 329/500 [==================>...........] - ETA: 43s - loss: 2.8495 - regression_loss: 2.2499 - classification_loss: 0.5995 330/500 [==================>...........] - ETA: 42s - loss: 2.8510 - regression_loss: 2.2512 - classification_loss: 0.5997 331/500 [==================>...........] - ETA: 42s - loss: 2.8488 - regression_loss: 2.2493 - classification_loss: 0.5995 332/500 [==================>...........] - ETA: 42s - loss: 2.8482 - regression_loss: 2.2491 - classification_loss: 0.5991 333/500 [==================>...........] - ETA: 41s - loss: 2.8481 - regression_loss: 2.2495 - classification_loss: 0.5986 334/500 [===================>..........] - ETA: 41s - loss: 2.8489 - regression_loss: 2.2503 - classification_loss: 0.5986 335/500 [===================>..........] - ETA: 41s - loss: 2.8480 - regression_loss: 2.2497 - classification_loss: 0.5983 336/500 [===================>..........] - ETA: 41s - loss: 2.8473 - regression_loss: 2.2495 - classification_loss: 0.5979 337/500 [===================>..........] - ETA: 41s - loss: 2.8471 - regression_loss: 2.2494 - classification_loss: 0.5977 338/500 [===================>..........] - ETA: 40s - loss: 2.8493 - regression_loss: 2.2514 - classification_loss: 0.5979 339/500 [===================>..........] - ETA: 40s - loss: 2.8498 - regression_loss: 2.2520 - classification_loss: 0.5978 340/500 [===================>..........] 
- ETA: 40s - loss: 2.8497 - regression_loss: 2.2521 - classification_loss: 0.5976 341/500 [===================>..........] - ETA: 39s - loss: 2.8557 - regression_loss: 2.2524 - classification_loss: 0.6033 342/500 [===================>..........] - ETA: 39s - loss: 2.8565 - regression_loss: 2.2532 - classification_loss: 0.6033 343/500 [===================>..........] - ETA: 39s - loss: 2.8592 - regression_loss: 2.2534 - classification_loss: 0.6058 344/500 [===================>..........] - ETA: 39s - loss: 2.8614 - regression_loss: 2.2555 - classification_loss: 0.6058 345/500 [===================>..........] - ETA: 38s - loss: 2.8643 - regression_loss: 2.2571 - classification_loss: 0.6071 346/500 [===================>..........] - ETA: 38s - loss: 2.8667 - regression_loss: 2.2593 - classification_loss: 0.6074 347/500 [===================>..........] - ETA: 38s - loss: 2.8666 - regression_loss: 2.2594 - classification_loss: 0.6071 348/500 [===================>..........] - ETA: 38s - loss: 2.8686 - regression_loss: 2.2617 - classification_loss: 0.6069 349/500 [===================>..........] - ETA: 37s - loss: 2.8680 - regression_loss: 2.2613 - classification_loss: 0.6067 350/500 [====================>.........] - ETA: 37s - loss: 2.8678 - regression_loss: 2.2613 - classification_loss: 0.6065 351/500 [====================>.........] - ETA: 37s - loss: 2.8683 - regression_loss: 2.2615 - classification_loss: 0.6068 352/500 [====================>.........] - ETA: 37s - loss: 2.8699 - regression_loss: 2.2623 - classification_loss: 0.6076 353/500 [====================>.........] - ETA: 36s - loss: 2.8689 - regression_loss: 2.2616 - classification_loss: 0.6073 354/500 [====================>.........] - ETA: 36s - loss: 2.8700 - regression_loss: 2.2623 - classification_loss: 0.6078 355/500 [====================>.........] - ETA: 36s - loss: 2.8705 - regression_loss: 2.2627 - classification_loss: 0.6078 356/500 [====================>.........] 
- ETA: 36s - loss: 2.8696 - regression_loss: 2.2621 - classification_loss: 0.6075 357/500 [====================>.........] - ETA: 35s - loss: 2.8693 - regression_loss: 2.2621 - classification_loss: 0.6071 358/500 [====================>.........] - ETA: 35s - loss: 2.8684 - regression_loss: 2.2618 - classification_loss: 0.6066 359/500 [====================>.........] - ETA: 35s - loss: 2.8683 - regression_loss: 2.2618 - classification_loss: 0.6066 360/500 [====================>.........] - ETA: 35s - loss: 2.8682 - regression_loss: 2.2601 - classification_loss: 0.6081 361/500 [====================>.........] - ETA: 34s - loss: 2.8677 - regression_loss: 2.2596 - classification_loss: 0.6081 362/500 [====================>.........] - ETA: 34s - loss: 2.8666 - regression_loss: 2.2590 - classification_loss: 0.6077 363/500 [====================>.........] - ETA: 34s - loss: 2.8654 - regression_loss: 2.2581 - classification_loss: 0.6073 364/500 [====================>.........] - ETA: 34s - loss: 2.8659 - regression_loss: 2.2586 - classification_loss: 0.6072 365/500 [====================>.........] - ETA: 33s - loss: 2.8660 - regression_loss: 2.2586 - classification_loss: 0.6074 366/500 [====================>.........] - ETA: 33s - loss: 2.8649 - regression_loss: 2.2577 - classification_loss: 0.6072 367/500 [=====================>........] - ETA: 33s - loss: 2.8646 - regression_loss: 2.2575 - classification_loss: 0.6071 368/500 [=====================>........] - ETA: 33s - loss: 2.8674 - regression_loss: 2.2595 - classification_loss: 0.6079 369/500 [=====================>........] - ETA: 32s - loss: 2.8670 - regression_loss: 2.2593 - classification_loss: 0.6077 370/500 [=====================>........] - ETA: 32s - loss: 2.8665 - regression_loss: 2.2591 - classification_loss: 0.6074 371/500 [=====================>........] - ETA: 32s - loss: 2.8662 - regression_loss: 2.2591 - classification_loss: 0.6072 372/500 [=====================>........] 
- ETA: 32s - loss: 2.8662 - regression_loss: 2.2592 - classification_loss: 0.6070 373/500 [=====================>........] - ETA: 31s - loss: 2.8652 - regression_loss: 2.2585 - classification_loss: 0.6067 374/500 [=====================>........] - ETA: 31s - loss: 2.8663 - regression_loss: 2.2597 - classification_loss: 0.6066 375/500 [=====================>........] - ETA: 31s - loss: 2.8828 - regression_loss: 2.2608 - classification_loss: 0.6219 376/500 [=====================>........] - ETA: 31s - loss: 2.8830 - regression_loss: 2.2611 - classification_loss: 0.6219 377/500 [=====================>........] - ETA: 30s - loss: 2.8826 - regression_loss: 2.2608 - classification_loss: 0.6218 378/500 [=====================>........] - ETA: 30s - loss: 2.8821 - regression_loss: 2.2604 - classification_loss: 0.6217 379/500 [=====================>........] - ETA: 30s - loss: 2.8821 - regression_loss: 2.2608 - classification_loss: 0.6212 380/500 [=====================>........] - ETA: 30s - loss: 2.8824 - regression_loss: 2.2611 - classification_loss: 0.6212 381/500 [=====================>........] - ETA: 29s - loss: 2.8822 - regression_loss: 2.2610 - classification_loss: 0.6213 382/500 [=====================>........] - ETA: 29s - loss: 2.8808 - regression_loss: 2.2600 - classification_loss: 0.6208 383/500 [=====================>........] - ETA: 29s - loss: 2.8805 - regression_loss: 2.2600 - classification_loss: 0.6205 384/500 [======================>.......] - ETA: 29s - loss: 2.8798 - regression_loss: 2.2598 - classification_loss: 0.6200 385/500 [======================>.......] - ETA: 28s - loss: 2.8800 - regression_loss: 2.2596 - classification_loss: 0.6205 386/500 [======================>.......] - ETA: 28s - loss: 2.8787 - regression_loss: 2.2588 - classification_loss: 0.6199 387/500 [======================>.......] - ETA: 28s - loss: 2.8776 - regression_loss: 2.2584 - classification_loss: 0.6192 388/500 [======================>.......] 
- ETA: 28s - loss: 2.8772 - regression_loss: 2.2582 - classification_loss: 0.6191
[Epoch 6/150: flattened per-batch progress-bar updates for batches 389-498 trimmed; loss edged down from about 2.88 to about 2.86 over this stretch]
499/500 [============================>.]
- ETA: 0s - loss: 2.8577 - regression_loss: 2.2502 - classification_loss: 0.6074
500/500 [==============================] - 126s 251ms/step - loss: 2.8573 - regression_loss: 2.2501 - classification_loss: 0.6072
1172 instances of class plum with average precision: 0.1577
mAP: 0.1577
Epoch 00006: saving model to ./training/snapshots/resnet50_pascal_06.h5
Epoch 7/150
1/500 [..............................] - ETA: 1:53 - loss: 2.8628 - regression_loss: 2.2092 - classification_loss: 0.6536
[per-batch progress updates for batches 2-13 trimmed; loss fluctuated between roughly 2.79 and 3.10]
14/500 [..............................]
- ETA: 2:00 - loss: 2.8781 - regression_loss: 2.2588 - classification_loss: 0.6193
[Epoch 7/150: flattened per-batch progress-bar updates for batches 15-221 trimmed; loss fluctuated between roughly 2.80 and 2.96, reaching 2.8164 (regression_loss 2.2164, classification_loss 0.6000) at batch 221]
222/500 [============>.................]
- ETA: 1:09 - loss: 2.8157 - regression_loss: 2.2162 - classification_loss: 0.5995 223/500 [============>.................] - ETA: 1:09 - loss: 2.8181 - regression_loss: 2.2179 - classification_loss: 0.6002 224/500 [============>.................] - ETA: 1:09 - loss: 2.8174 - regression_loss: 2.2179 - classification_loss: 0.5995 225/500 [============>.................] - ETA: 1:08 - loss: 2.8169 - regression_loss: 2.2176 - classification_loss: 0.5992 226/500 [============>.................] - ETA: 1:08 - loss: 2.8153 - regression_loss: 2.2167 - classification_loss: 0.5986 227/500 [============>.................] - ETA: 1:08 - loss: 2.8162 - regression_loss: 2.2179 - classification_loss: 0.5983 228/500 [============>.................] - ETA: 1:08 - loss: 2.8154 - regression_loss: 2.2172 - classification_loss: 0.5982 229/500 [============>.................] - ETA: 1:07 - loss: 2.8173 - regression_loss: 2.2185 - classification_loss: 0.5988 230/500 [============>.................] - ETA: 1:07 - loss: 2.8157 - regression_loss: 2.2177 - classification_loss: 0.5980 231/500 [============>.................] - ETA: 1:07 - loss: 2.8142 - regression_loss: 2.2168 - classification_loss: 0.5974 232/500 [============>.................] - ETA: 1:07 - loss: 2.8154 - regression_loss: 2.2179 - classification_loss: 0.5975 233/500 [============>.................] - ETA: 1:06 - loss: 2.8152 - regression_loss: 2.2174 - classification_loss: 0.5979 234/500 [=============>................] - ETA: 1:06 - loss: 2.8143 - regression_loss: 2.2171 - classification_loss: 0.5972 235/500 [=============>................] - ETA: 1:06 - loss: 2.8164 - regression_loss: 2.2182 - classification_loss: 0.5983 236/500 [=============>................] - ETA: 1:06 - loss: 2.8137 - regression_loss: 2.2163 - classification_loss: 0.5974 237/500 [=============>................] - ETA: 1:05 - loss: 2.8138 - regression_loss: 2.2141 - classification_loss: 0.5997 238/500 [=============>................] 
- ETA: 1:05 - loss: 2.8123 - regression_loss: 2.2135 - classification_loss: 0.5988 239/500 [=============>................] - ETA: 1:05 - loss: 2.8106 - regression_loss: 2.2128 - classification_loss: 0.5978 240/500 [=============>................] - ETA: 1:05 - loss: 2.8065 - regression_loss: 2.2093 - classification_loss: 0.5972 241/500 [=============>................] - ETA: 1:04 - loss: 2.8057 - regression_loss: 2.2089 - classification_loss: 0.5968 242/500 [=============>................] - ETA: 1:04 - loss: 2.8054 - regression_loss: 2.2088 - classification_loss: 0.5967 243/500 [=============>................] - ETA: 1:04 - loss: 2.8053 - regression_loss: 2.2090 - classification_loss: 0.5962 244/500 [=============>................] - ETA: 1:04 - loss: 2.8063 - regression_loss: 2.2103 - classification_loss: 0.5960 245/500 [=============>................] - ETA: 1:03 - loss: 2.8072 - regression_loss: 2.2111 - classification_loss: 0.5961 246/500 [=============>................] - ETA: 1:03 - loss: 2.8099 - regression_loss: 2.2134 - classification_loss: 0.5965 247/500 [=============>................] - ETA: 1:03 - loss: 2.8094 - regression_loss: 2.2134 - classification_loss: 0.5960 248/500 [=============>................] - ETA: 1:03 - loss: 2.8102 - regression_loss: 2.2140 - classification_loss: 0.5962 249/500 [=============>................] - ETA: 1:02 - loss: 2.8100 - regression_loss: 2.2140 - classification_loss: 0.5961 250/500 [==============>...............] - ETA: 1:02 - loss: 2.8097 - regression_loss: 2.2138 - classification_loss: 0.5959 251/500 [==============>...............] - ETA: 1:02 - loss: 2.8104 - regression_loss: 2.2145 - classification_loss: 0.5958 252/500 [==============>...............] - ETA: 1:02 - loss: 2.8087 - regression_loss: 2.2133 - classification_loss: 0.5954 253/500 [==============>...............] - ETA: 1:01 - loss: 2.8058 - regression_loss: 2.2116 - classification_loss: 0.5942 254/500 [==============>...............] 
- ETA: 1:01 - loss: 2.8054 - regression_loss: 2.2117 - classification_loss: 0.5937 255/500 [==============>...............] - ETA: 1:01 - loss: 2.8070 - regression_loss: 2.2130 - classification_loss: 0.5939 256/500 [==============>...............] - ETA: 1:01 - loss: 2.8060 - regression_loss: 2.2122 - classification_loss: 0.5938 257/500 [==============>...............] - ETA: 1:00 - loss: 2.8066 - regression_loss: 2.2128 - classification_loss: 0.5938 258/500 [==============>...............] - ETA: 1:00 - loss: 2.8065 - regression_loss: 2.2122 - classification_loss: 0.5942 259/500 [==============>...............] - ETA: 1:00 - loss: 2.8045 - regression_loss: 2.2109 - classification_loss: 0.5936 260/500 [==============>...............] - ETA: 1:00 - loss: 2.8059 - regression_loss: 2.2115 - classification_loss: 0.5945 261/500 [==============>...............] - ETA: 59s - loss: 2.8057 - regression_loss: 2.2116 - classification_loss: 0.5941  262/500 [==============>...............] - ETA: 59s - loss: 2.8058 - regression_loss: 2.2118 - classification_loss: 0.5941 263/500 [==============>...............] - ETA: 59s - loss: 2.8057 - regression_loss: 2.2120 - classification_loss: 0.5938 264/500 [==============>...............] - ETA: 59s - loss: 2.8054 - regression_loss: 2.2116 - classification_loss: 0.5938 265/500 [==============>...............] - ETA: 58s - loss: 2.8051 - regression_loss: 2.2112 - classification_loss: 0.5939 266/500 [==============>...............] - ETA: 58s - loss: 2.8054 - regression_loss: 2.2112 - classification_loss: 0.5942 267/500 [===============>..............] - ETA: 58s - loss: 2.8052 - regression_loss: 2.2112 - classification_loss: 0.5940 268/500 [===============>..............] - ETA: 58s - loss: 2.8064 - regression_loss: 2.2131 - classification_loss: 0.5933 269/500 [===============>..............] - ETA: 57s - loss: 2.8058 - regression_loss: 2.2130 - classification_loss: 0.5928 270/500 [===============>..............] 
- ETA: 57s - loss: 2.8074 - regression_loss: 2.2142 - classification_loss: 0.5932 271/500 [===============>..............] - ETA: 57s - loss: 2.8067 - regression_loss: 2.2137 - classification_loss: 0.5930 272/500 [===============>..............] - ETA: 57s - loss: 2.8065 - regression_loss: 2.2136 - classification_loss: 0.5929 273/500 [===============>..............] - ETA: 56s - loss: 2.8051 - regression_loss: 2.2119 - classification_loss: 0.5932 274/500 [===============>..............] - ETA: 56s - loss: 2.8048 - regression_loss: 2.2119 - classification_loss: 0.5929 275/500 [===============>..............] - ETA: 56s - loss: 2.8040 - regression_loss: 2.2119 - classification_loss: 0.5922 276/500 [===============>..............] - ETA: 56s - loss: 2.8041 - regression_loss: 2.2123 - classification_loss: 0.5918 277/500 [===============>..............] - ETA: 55s - loss: 2.8034 - regression_loss: 2.2119 - classification_loss: 0.5914 278/500 [===============>..............] - ETA: 55s - loss: 2.8035 - regression_loss: 2.2123 - classification_loss: 0.5912 279/500 [===============>..............] - ETA: 55s - loss: 2.8057 - regression_loss: 2.2128 - classification_loss: 0.5929 280/500 [===============>..............] - ETA: 55s - loss: 2.8051 - regression_loss: 2.2123 - classification_loss: 0.5929 281/500 [===============>..............] - ETA: 54s - loss: 2.8058 - regression_loss: 2.2129 - classification_loss: 0.5930 282/500 [===============>..............] - ETA: 54s - loss: 2.8061 - regression_loss: 2.2132 - classification_loss: 0.5930 283/500 [===============>..............] - ETA: 54s - loss: 2.8058 - regression_loss: 2.2130 - classification_loss: 0.5928 284/500 [================>.............] - ETA: 54s - loss: 2.8026 - regression_loss: 2.2097 - classification_loss: 0.5929 285/500 [================>.............] - ETA: 54s - loss: 2.8041 - regression_loss: 2.2107 - classification_loss: 0.5934 286/500 [================>.............] 
- ETA: 53s - loss: 2.8031 - regression_loss: 2.2101 - classification_loss: 0.5930 287/500 [================>.............] - ETA: 53s - loss: 2.8022 - regression_loss: 2.2095 - classification_loss: 0.5927 288/500 [================>.............] - ETA: 53s - loss: 2.8015 - regression_loss: 2.2091 - classification_loss: 0.5924 289/500 [================>.............] - ETA: 53s - loss: 2.7986 - regression_loss: 2.2068 - classification_loss: 0.5918 290/500 [================>.............] - ETA: 52s - loss: 2.7991 - regression_loss: 2.2073 - classification_loss: 0.5919 291/500 [================>.............] - ETA: 52s - loss: 2.7991 - regression_loss: 2.2073 - classification_loss: 0.5918 292/500 [================>.............] - ETA: 52s - loss: 2.7989 - regression_loss: 2.2074 - classification_loss: 0.5915 293/500 [================>.............] - ETA: 52s - loss: 2.7998 - regression_loss: 2.2084 - classification_loss: 0.5914 294/500 [================>.............] - ETA: 51s - loss: 2.7981 - regression_loss: 2.2069 - classification_loss: 0.5912 295/500 [================>.............] - ETA: 51s - loss: 2.7987 - regression_loss: 2.2076 - classification_loss: 0.5911 296/500 [================>.............] - ETA: 51s - loss: 2.8008 - regression_loss: 2.2098 - classification_loss: 0.5910 297/500 [================>.............] - ETA: 50s - loss: 2.8007 - regression_loss: 2.2100 - classification_loss: 0.5907 298/500 [================>.............] - ETA: 50s - loss: 2.8024 - regression_loss: 2.2117 - classification_loss: 0.5907 299/500 [================>.............] - ETA: 50s - loss: 2.8020 - regression_loss: 2.2118 - classification_loss: 0.5902 300/500 [=================>............] - ETA: 50s - loss: 2.8035 - regression_loss: 2.2129 - classification_loss: 0.5905 301/500 [=================>............] - ETA: 49s - loss: 2.8041 - regression_loss: 2.2139 - classification_loss: 0.5902 302/500 [=================>............] 
- ETA: 49s - loss: 2.8038 - regression_loss: 2.2140 - classification_loss: 0.5898 303/500 [=================>............] - ETA: 49s - loss: 2.8019 - regression_loss: 2.2122 - classification_loss: 0.5897 304/500 [=================>............] - ETA: 49s - loss: 2.8031 - regression_loss: 2.2137 - classification_loss: 0.5894 305/500 [=================>............] - ETA: 48s - loss: 2.8029 - regression_loss: 2.2138 - classification_loss: 0.5891 306/500 [=================>............] - ETA: 48s - loss: 2.8034 - regression_loss: 2.2145 - classification_loss: 0.5888 307/500 [=================>............] - ETA: 48s - loss: 2.8031 - regression_loss: 2.2144 - classification_loss: 0.5887 308/500 [=================>............] - ETA: 48s - loss: 2.8029 - regression_loss: 2.2147 - classification_loss: 0.5882 309/500 [=================>............] - ETA: 47s - loss: 2.8021 - regression_loss: 2.2141 - classification_loss: 0.5880 310/500 [=================>............] - ETA: 47s - loss: 2.8014 - regression_loss: 2.2138 - classification_loss: 0.5876 311/500 [=================>............] - ETA: 47s - loss: 2.8013 - regression_loss: 2.2135 - classification_loss: 0.5879 312/500 [=================>............] - ETA: 47s - loss: 2.8014 - regression_loss: 2.2135 - classification_loss: 0.5879 313/500 [=================>............] - ETA: 46s - loss: 2.8045 - regression_loss: 2.2149 - classification_loss: 0.5896 314/500 [=================>............] - ETA: 46s - loss: 2.8035 - regression_loss: 2.2137 - classification_loss: 0.5898 315/500 [=================>............] - ETA: 46s - loss: 2.8028 - regression_loss: 2.2133 - classification_loss: 0.5895 316/500 [=================>............] - ETA: 46s - loss: 2.8017 - regression_loss: 2.2122 - classification_loss: 0.5895 317/500 [==================>...........] - ETA: 45s - loss: 2.8017 - regression_loss: 2.2115 - classification_loss: 0.5902 318/500 [==================>...........] 
- ETA: 45s - loss: 2.8008 - regression_loss: 2.2109 - classification_loss: 0.5899 319/500 [==================>...........] - ETA: 45s - loss: 2.8011 - regression_loss: 2.2107 - classification_loss: 0.5904 320/500 [==================>...........] - ETA: 45s - loss: 2.8019 - regression_loss: 2.2118 - classification_loss: 0.5901 321/500 [==================>...........] - ETA: 44s - loss: 2.8021 - regression_loss: 2.2119 - classification_loss: 0.5902 322/500 [==================>...........] - ETA: 44s - loss: 2.8012 - regression_loss: 2.2114 - classification_loss: 0.5898 323/500 [==================>...........] - ETA: 44s - loss: 2.8007 - regression_loss: 2.2110 - classification_loss: 0.5896 324/500 [==================>...........] - ETA: 44s - loss: 2.8003 - regression_loss: 2.2112 - classification_loss: 0.5891 325/500 [==================>...........] - ETA: 43s - loss: 2.8001 - regression_loss: 2.2114 - classification_loss: 0.5887 326/500 [==================>...........] - ETA: 43s - loss: 2.7998 - regression_loss: 2.2111 - classification_loss: 0.5887 327/500 [==================>...........] - ETA: 43s - loss: 2.8002 - regression_loss: 2.2112 - classification_loss: 0.5890 328/500 [==================>...........] - ETA: 43s - loss: 2.8001 - regression_loss: 2.2110 - classification_loss: 0.5891 329/500 [==================>...........] - ETA: 42s - loss: 2.7998 - regression_loss: 2.2110 - classification_loss: 0.5888 330/500 [==================>...........] - ETA: 42s - loss: 2.8007 - regression_loss: 2.2118 - classification_loss: 0.5889 331/500 [==================>...........] - ETA: 42s - loss: 2.8004 - regression_loss: 2.2120 - classification_loss: 0.5884 332/500 [==================>...........] - ETA: 42s - loss: 2.7982 - regression_loss: 2.2105 - classification_loss: 0.5877 333/500 [==================>...........] - ETA: 41s - loss: 2.7974 - regression_loss: 2.2101 - classification_loss: 0.5873 334/500 [===================>..........] 
- ETA: 41s - loss: 2.7982 - regression_loss: 2.2108 - classification_loss: 0.5874 335/500 [===================>..........] - ETA: 41s - loss: 2.7977 - regression_loss: 2.2106 - classification_loss: 0.5871 336/500 [===================>..........] - ETA: 41s - loss: 2.7992 - regression_loss: 2.2116 - classification_loss: 0.5876 337/500 [===================>..........] - ETA: 40s - loss: 2.7980 - regression_loss: 2.2110 - classification_loss: 0.5870 338/500 [===================>..........] - ETA: 40s - loss: 2.7974 - regression_loss: 2.2108 - classification_loss: 0.5866 339/500 [===================>..........] - ETA: 40s - loss: 2.7978 - regression_loss: 2.2115 - classification_loss: 0.5863 340/500 [===================>..........] - ETA: 40s - loss: 2.7968 - regression_loss: 2.2108 - classification_loss: 0.5860 341/500 [===================>..........] - ETA: 39s - loss: 2.7972 - regression_loss: 2.2108 - classification_loss: 0.5864 342/500 [===================>..........] - ETA: 39s - loss: 2.7976 - regression_loss: 2.2105 - classification_loss: 0.5871 343/500 [===================>..........] - ETA: 39s - loss: 2.7978 - regression_loss: 2.2110 - classification_loss: 0.5868 344/500 [===================>..........] - ETA: 39s - loss: 2.7973 - regression_loss: 2.2108 - classification_loss: 0.5865 345/500 [===================>..........] - ETA: 38s - loss: 2.7951 - regression_loss: 2.2096 - classification_loss: 0.5855 346/500 [===================>..........] - ETA: 38s - loss: 2.7955 - regression_loss: 2.2098 - classification_loss: 0.5856 347/500 [===================>..........] - ETA: 38s - loss: 2.7962 - regression_loss: 2.2105 - classification_loss: 0.5857 348/500 [===================>..........] - ETA: 38s - loss: 2.7964 - regression_loss: 2.2109 - classification_loss: 0.5855 349/500 [===================>..........] - ETA: 37s - loss: 2.7961 - regression_loss: 2.2108 - classification_loss: 0.5853 350/500 [====================>.........] 
- ETA: 37s - loss: 2.7959 - regression_loss: 2.2106 - classification_loss: 0.5853 351/500 [====================>.........] - ETA: 37s - loss: 2.7950 - regression_loss: 2.2101 - classification_loss: 0.5849 352/500 [====================>.........] - ETA: 37s - loss: 2.7943 - regression_loss: 2.2097 - classification_loss: 0.5846 353/500 [====================>.........] - ETA: 36s - loss: 2.7932 - regression_loss: 2.2092 - classification_loss: 0.5840 354/500 [====================>.........] - ETA: 36s - loss: 2.7929 - regression_loss: 2.2090 - classification_loss: 0.5838 355/500 [====================>.........] - ETA: 36s - loss: 2.7925 - regression_loss: 2.2085 - classification_loss: 0.5840 356/500 [====================>.........] - ETA: 36s - loss: 2.7932 - regression_loss: 2.2092 - classification_loss: 0.5840 357/500 [====================>.........] - ETA: 35s - loss: 2.7926 - regression_loss: 2.2088 - classification_loss: 0.5837 358/500 [====================>.........] - ETA: 35s - loss: 2.7924 - regression_loss: 2.2087 - classification_loss: 0.5837 359/500 [====================>.........] - ETA: 35s - loss: 2.7921 - regression_loss: 2.2087 - classification_loss: 0.5834 360/500 [====================>.........] - ETA: 35s - loss: 2.7926 - regression_loss: 2.2091 - classification_loss: 0.5835 361/500 [====================>.........] - ETA: 34s - loss: 2.7922 - regression_loss: 2.2089 - classification_loss: 0.5832 362/500 [====================>.........] - ETA: 34s - loss: 2.7912 - regression_loss: 2.2082 - classification_loss: 0.5829 363/500 [====================>.........] - ETA: 34s - loss: 2.7894 - regression_loss: 2.2069 - classification_loss: 0.5825 364/500 [====================>.........] - ETA: 34s - loss: 2.7896 - regression_loss: 2.2072 - classification_loss: 0.5824 365/500 [====================>.........] - ETA: 33s - loss: 2.7890 - regression_loss: 2.2069 - classification_loss: 0.5821 366/500 [====================>.........] 
- ETA: 33s - loss: 2.7890 - regression_loss: 2.2071 - classification_loss: 0.5819 367/500 [=====================>........] - ETA: 33s - loss: 2.7892 - regression_loss: 2.2075 - classification_loss: 0.5818 368/500 [=====================>........] - ETA: 33s - loss: 2.7889 - regression_loss: 2.2074 - classification_loss: 0.5816 369/500 [=====================>........] - ETA: 32s - loss: 2.7886 - regression_loss: 2.2073 - classification_loss: 0.5813 370/500 [=====================>........] - ETA: 32s - loss: 2.7905 - regression_loss: 2.2089 - classification_loss: 0.5816 371/500 [=====================>........] - ETA: 32s - loss: 2.7903 - regression_loss: 2.2090 - classification_loss: 0.5813 372/500 [=====================>........] - ETA: 32s - loss: 2.7898 - regression_loss: 2.2089 - classification_loss: 0.5809 373/500 [=====================>........] - ETA: 31s - loss: 2.7898 - regression_loss: 2.2092 - classification_loss: 0.5806 374/500 [=====================>........] - ETA: 31s - loss: 2.7891 - regression_loss: 2.2088 - classification_loss: 0.5803 375/500 [=====================>........] - ETA: 31s - loss: 2.7889 - regression_loss: 2.2087 - classification_loss: 0.5801 376/500 [=====================>........] - ETA: 31s - loss: 2.7883 - regression_loss: 2.2083 - classification_loss: 0.5800 377/500 [=====================>........] - ETA: 30s - loss: 2.7885 - regression_loss: 2.2086 - classification_loss: 0.5799 378/500 [=====================>........] - ETA: 30s - loss: 2.7874 - regression_loss: 2.2069 - classification_loss: 0.5805 379/500 [=====================>........] - ETA: 30s - loss: 2.7853 - regression_loss: 2.2055 - classification_loss: 0.5798 380/500 [=====================>........] - ETA: 30s - loss: 2.7846 - regression_loss: 2.2052 - classification_loss: 0.5794 381/500 [=====================>........] - ETA: 29s - loss: 2.7840 - regression_loss: 2.2048 - classification_loss: 0.5792 382/500 [=====================>........] 
- ETA: 29s - loss: 2.7839 - regression_loss: 2.2049 - classification_loss: 0.5790 383/500 [=====================>........] - ETA: 29s - loss: 2.7837 - regression_loss: 2.2050 - classification_loss: 0.5787 384/500 [======================>.......] - ETA: 29s - loss: 2.7853 - regression_loss: 2.2062 - classification_loss: 0.5791 385/500 [======================>.......] - ETA: 28s - loss: 2.7845 - regression_loss: 2.2055 - classification_loss: 0.5790 386/500 [======================>.......] - ETA: 28s - loss: 2.7852 - regression_loss: 2.2060 - classification_loss: 0.5791 387/500 [======================>.......] - ETA: 28s - loss: 2.7844 - regression_loss: 2.2056 - classification_loss: 0.5788 388/500 [======================>.......] - ETA: 28s - loss: 2.7840 - regression_loss: 2.2054 - classification_loss: 0.5786 389/500 [======================>.......] - ETA: 27s - loss: 2.7863 - regression_loss: 2.2071 - classification_loss: 0.5793 390/500 [======================>.......] - ETA: 27s - loss: 2.7860 - regression_loss: 2.2070 - classification_loss: 0.5789 391/500 [======================>.......] - ETA: 27s - loss: 2.7871 - regression_loss: 2.2063 - classification_loss: 0.5809 392/500 [======================>.......] - ETA: 27s - loss: 2.7867 - regression_loss: 2.2060 - classification_loss: 0.5807 393/500 [======================>.......] - ETA: 26s - loss: 2.7890 - regression_loss: 2.2068 - classification_loss: 0.5822 394/500 [======================>.......] - ETA: 26s - loss: 2.7897 - regression_loss: 2.2076 - classification_loss: 0.5821 395/500 [======================>.......] - ETA: 26s - loss: 2.7894 - regression_loss: 2.2075 - classification_loss: 0.5819 396/500 [======================>.......] - ETA: 26s - loss: 2.7883 - regression_loss: 2.2068 - classification_loss: 0.5815 397/500 [======================>.......] - ETA: 25s - loss: 2.7877 - regression_loss: 2.2063 - classification_loss: 0.5814 398/500 [======================>.......] 
- ETA: 25s - loss: 2.7879 - regression_loss: 2.2067 - classification_loss: 0.5813 399/500 [======================>.......] - ETA: 25s - loss: 2.7858 - regression_loss: 2.2048 - classification_loss: 0.5810 400/500 [=======================>......] - ETA: 25s - loss: 2.7861 - regression_loss: 2.2053 - classification_loss: 0.5808 401/500 [=======================>......] - ETA: 24s - loss: 2.7855 - regression_loss: 2.2049 - classification_loss: 0.5806 402/500 [=======================>......] - ETA: 24s - loss: 2.7854 - regression_loss: 2.2049 - classification_loss: 0.5804 403/500 [=======================>......] - ETA: 24s - loss: 2.7848 - regression_loss: 2.2044 - classification_loss: 0.5805 404/500 [=======================>......] - ETA: 24s - loss: 2.7836 - regression_loss: 2.2036 - classification_loss: 0.5800 405/500 [=======================>......] - ETA: 23s - loss: 2.7832 - regression_loss: 2.2034 - classification_loss: 0.5798 406/500 [=======================>......] - ETA: 23s - loss: 2.7842 - regression_loss: 2.2038 - classification_loss: 0.5804 407/500 [=======================>......] - ETA: 23s - loss: 2.7832 - regression_loss: 2.2033 - classification_loss: 0.5799 408/500 [=======================>......] - ETA: 23s - loss: 2.7824 - regression_loss: 2.2027 - classification_loss: 0.5797 409/500 [=======================>......] - ETA: 22s - loss: 2.7833 - regression_loss: 2.2035 - classification_loss: 0.5798 410/500 [=======================>......] - ETA: 22s - loss: 2.7831 - regression_loss: 2.2035 - classification_loss: 0.5796 411/500 [=======================>......] - ETA: 22s - loss: 2.7836 - regression_loss: 2.2041 - classification_loss: 0.5795 412/500 [=======================>......] - ETA: 22s - loss: 2.7838 - regression_loss: 2.2042 - classification_loss: 0.5795 413/500 [=======================>......] - ETA: 21s - loss: 2.7804 - regression_loss: 2.2016 - classification_loss: 0.5787 414/500 [=======================>......] 
- ETA: 21s - loss: 2.7800 - regression_loss: 2.2015 - classification_loss: 0.5785 415/500 [=======================>......] - ETA: 21s - loss: 2.7801 - regression_loss: 2.2018 - classification_loss: 0.5783 416/500 [=======================>......] - ETA: 21s - loss: 2.7792 - regression_loss: 2.2011 - classification_loss: 0.5782 417/500 [========================>.....] - ETA: 20s - loss: 2.7792 - regression_loss: 2.2012 - classification_loss: 0.5780 418/500 [========================>.....] - ETA: 20s - loss: 2.7789 - regression_loss: 2.2011 - classification_loss: 0.5778 419/500 [========================>.....] - ETA: 20s - loss: 2.7782 - regression_loss: 2.2006 - classification_loss: 0.5776 420/500 [========================>.....] - ETA: 20s - loss: 2.7783 - regression_loss: 2.2008 - classification_loss: 0.5775 421/500 [========================>.....] - ETA: 19s - loss: 2.7779 - regression_loss: 2.2005 - classification_loss: 0.5774 422/500 [========================>.....] - ETA: 19s - loss: 2.7775 - regression_loss: 2.2004 - classification_loss: 0.5771 423/500 [========================>.....] - ETA: 19s - loss: 2.7780 - regression_loss: 2.2006 - classification_loss: 0.5774 424/500 [========================>.....] - ETA: 19s - loss: 2.7780 - regression_loss: 2.2006 - classification_loss: 0.5774 425/500 [========================>.....] - ETA: 18s - loss: 2.7773 - regression_loss: 2.2000 - classification_loss: 0.5773 426/500 [========================>.....] - ETA: 18s - loss: 2.7769 - regression_loss: 2.1997 - classification_loss: 0.5772 427/500 [========================>.....] - ETA: 18s - loss: 2.7763 - regression_loss: 2.1992 - classification_loss: 0.5771 428/500 [========================>.....] - ETA: 18s - loss: 2.7761 - regression_loss: 2.1990 - classification_loss: 0.5771 429/500 [========================>.....] - ETA: 17s - loss: 2.7758 - regression_loss: 2.1987 - classification_loss: 0.5770 430/500 [========================>.....] 
500/500 [==============================] - 126s 251ms/step - loss: 2.7825 - regression_loss: 2.2061 - classification_loss: 0.5764
1172 instances of class plum with average precision: 0.2120
mAP: 0.2120
Epoch 00007: saving model to ./training/snapshots/resnet50_pascal_07.h5
Epoch 8/150
  1/500 [..............................] - ETA: 1:55 - loss: 2.8182 - regression_loss: 2.2841 - classification_loss: 0.5341
...
264/500 [==============>...............] - ETA: 59s - loss: 2.7754 - regression_loss: 2.2033 - classification_loss: 0.5721
- ETA: 58s - loss: 2.7747 - regression_loss: 2.2029 - classification_loss: 0.5718 266/500 [==============>...............] - ETA: 58s - loss: 2.7745 - regression_loss: 2.2024 - classification_loss: 0.5721 267/500 [===============>..............] - ETA: 58s - loss: 2.7747 - regression_loss: 2.2029 - classification_loss: 0.5718 268/500 [===============>..............] - ETA: 58s - loss: 2.7744 - regression_loss: 2.2028 - classification_loss: 0.5716 269/500 [===============>..............] - ETA: 57s - loss: 2.7730 - regression_loss: 2.2018 - classification_loss: 0.5713 270/500 [===============>..............] - ETA: 57s - loss: 2.7734 - regression_loss: 2.2027 - classification_loss: 0.5707 271/500 [===============>..............] - ETA: 57s - loss: 2.7723 - regression_loss: 2.2017 - classification_loss: 0.5706 272/500 [===============>..............] - ETA: 57s - loss: 2.7716 - regression_loss: 2.2012 - classification_loss: 0.5704 273/500 [===============>..............] - ETA: 56s - loss: 2.7717 - regression_loss: 2.2012 - classification_loss: 0.5705 274/500 [===============>..............] - ETA: 56s - loss: 2.7716 - regression_loss: 2.2014 - classification_loss: 0.5703 275/500 [===============>..............] - ETA: 56s - loss: 2.7728 - regression_loss: 2.2025 - classification_loss: 0.5703 276/500 [===============>..............] - ETA: 56s - loss: 2.7725 - regression_loss: 2.2027 - classification_loss: 0.5699 277/500 [===============>..............] - ETA: 55s - loss: 2.7712 - regression_loss: 2.2015 - classification_loss: 0.5697 278/500 [===============>..............] - ETA: 55s - loss: 2.7709 - regression_loss: 2.2018 - classification_loss: 0.5692 279/500 [===============>..............] - ETA: 55s - loss: 2.7715 - regression_loss: 2.2023 - classification_loss: 0.5692 280/500 [===============>..............] - ETA: 55s - loss: 2.7704 - regression_loss: 2.2017 - classification_loss: 0.5688 281/500 [===============>..............] 
- ETA: 54s - loss: 2.7727 - regression_loss: 2.2034 - classification_loss: 0.5693 282/500 [===============>..............] - ETA: 54s - loss: 2.7714 - regression_loss: 2.2024 - classification_loss: 0.5690 283/500 [===============>..............] - ETA: 54s - loss: 2.7693 - regression_loss: 2.2009 - classification_loss: 0.5683 284/500 [================>.............] - ETA: 54s - loss: 2.7684 - regression_loss: 2.2010 - classification_loss: 0.5674 285/500 [================>.............] - ETA: 53s - loss: 2.7680 - regression_loss: 2.2009 - classification_loss: 0.5671 286/500 [================>.............] - ETA: 53s - loss: 2.7680 - regression_loss: 2.2012 - classification_loss: 0.5668 287/500 [================>.............] - ETA: 53s - loss: 2.7685 - regression_loss: 2.2014 - classification_loss: 0.5671 288/500 [================>.............] - ETA: 53s - loss: 2.7681 - regression_loss: 2.2012 - classification_loss: 0.5670 289/500 [================>.............] - ETA: 52s - loss: 2.7666 - regression_loss: 2.1998 - classification_loss: 0.5668 290/500 [================>.............] - ETA: 52s - loss: 2.7647 - regression_loss: 2.1978 - classification_loss: 0.5669 291/500 [================>.............] - ETA: 52s - loss: 2.7648 - regression_loss: 2.1978 - classification_loss: 0.5669 292/500 [================>.............] - ETA: 52s - loss: 2.7646 - regression_loss: 2.1979 - classification_loss: 0.5667 293/500 [================>.............] - ETA: 51s - loss: 2.7639 - regression_loss: 2.1974 - classification_loss: 0.5665 294/500 [================>.............] - ETA: 51s - loss: 2.7660 - regression_loss: 2.1984 - classification_loss: 0.5676 295/500 [================>.............] - ETA: 51s - loss: 2.7653 - regression_loss: 2.1977 - classification_loss: 0.5676 296/500 [================>.............] - ETA: 51s - loss: 2.7634 - regression_loss: 2.1966 - classification_loss: 0.5668 297/500 [================>.............] 
- ETA: 50s - loss: 2.7631 - regression_loss: 2.1966 - classification_loss: 0.5666 298/500 [================>.............] - ETA: 50s - loss: 2.7636 - regression_loss: 2.1962 - classification_loss: 0.5674 299/500 [================>.............] - ETA: 50s - loss: 2.7622 - regression_loss: 2.1953 - classification_loss: 0.5668 300/500 [=================>............] - ETA: 50s - loss: 2.7600 - regression_loss: 2.1935 - classification_loss: 0.5664 301/500 [=================>............] - ETA: 49s - loss: 2.7581 - regression_loss: 2.1926 - classification_loss: 0.5655 302/500 [=================>............] - ETA: 49s - loss: 2.7586 - regression_loss: 2.1927 - classification_loss: 0.5659 303/500 [=================>............] - ETA: 49s - loss: 2.7571 - regression_loss: 2.1918 - classification_loss: 0.5653 304/500 [=================>............] - ETA: 49s - loss: 2.7583 - regression_loss: 2.1926 - classification_loss: 0.5656 305/500 [=================>............] - ETA: 48s - loss: 2.7579 - regression_loss: 2.1926 - classification_loss: 0.5653 306/500 [=================>............] - ETA: 48s - loss: 2.7568 - regression_loss: 2.1918 - classification_loss: 0.5650 307/500 [=================>............] - ETA: 48s - loss: 2.7561 - regression_loss: 2.1911 - classification_loss: 0.5650 308/500 [=================>............] - ETA: 48s - loss: 2.7564 - regression_loss: 2.1913 - classification_loss: 0.5652 309/500 [=================>............] - ETA: 47s - loss: 2.7544 - regression_loss: 2.1898 - classification_loss: 0.5646 310/500 [=================>............] - ETA: 47s - loss: 2.7582 - regression_loss: 2.1932 - classification_loss: 0.5650 311/500 [=================>............] - ETA: 47s - loss: 2.7574 - regression_loss: 2.1927 - classification_loss: 0.5647 312/500 [=================>............] - ETA: 47s - loss: 2.7580 - regression_loss: 2.1930 - classification_loss: 0.5650 313/500 [=================>............] 
- ETA: 46s - loss: 2.7580 - regression_loss: 2.1926 - classification_loss: 0.5654 314/500 [=================>............] - ETA: 46s - loss: 2.7587 - regression_loss: 2.1939 - classification_loss: 0.5648 315/500 [=================>............] - ETA: 46s - loss: 2.7565 - regression_loss: 2.1924 - classification_loss: 0.5641 316/500 [=================>............] - ETA: 46s - loss: 2.7579 - regression_loss: 2.1933 - classification_loss: 0.5646 317/500 [==================>...........] - ETA: 45s - loss: 2.7578 - regression_loss: 2.1933 - classification_loss: 0.5645 318/500 [==================>...........] - ETA: 45s - loss: 2.7587 - regression_loss: 2.1942 - classification_loss: 0.5645 319/500 [==================>...........] - ETA: 45s - loss: 2.7578 - regression_loss: 2.1936 - classification_loss: 0.5643 320/500 [==================>...........] - ETA: 45s - loss: 2.7584 - regression_loss: 2.1934 - classification_loss: 0.5650 321/500 [==================>...........] - ETA: 44s - loss: 2.7584 - regression_loss: 2.1936 - classification_loss: 0.5649 322/500 [==================>...........] - ETA: 44s - loss: 2.7586 - regression_loss: 2.1938 - classification_loss: 0.5649 323/500 [==================>...........] - ETA: 44s - loss: 2.7590 - regression_loss: 2.1946 - classification_loss: 0.5644 324/500 [==================>...........] - ETA: 44s - loss: 2.7596 - regression_loss: 2.1950 - classification_loss: 0.5646 325/500 [==================>...........] - ETA: 43s - loss: 2.7571 - regression_loss: 2.1930 - classification_loss: 0.5641 326/500 [==================>...........] - ETA: 43s - loss: 2.7577 - regression_loss: 2.1939 - classification_loss: 0.5638 327/500 [==================>...........] - ETA: 43s - loss: 2.7579 - regression_loss: 2.1941 - classification_loss: 0.5638 328/500 [==================>...........] - ETA: 43s - loss: 2.7583 - regression_loss: 2.1944 - classification_loss: 0.5639 329/500 [==================>...........] 
- ETA: 42s - loss: 2.7572 - regression_loss: 2.1938 - classification_loss: 0.5634 330/500 [==================>...........] - ETA: 42s - loss: 2.7568 - regression_loss: 2.1935 - classification_loss: 0.5633 331/500 [==================>...........] - ETA: 42s - loss: 2.7559 - regression_loss: 2.1928 - classification_loss: 0.5631 332/500 [==================>...........] - ETA: 42s - loss: 2.7540 - regression_loss: 2.1914 - classification_loss: 0.5626 333/500 [==================>...........] - ETA: 41s - loss: 2.7520 - regression_loss: 2.1897 - classification_loss: 0.5623 334/500 [===================>..........] - ETA: 41s - loss: 2.7515 - regression_loss: 2.1896 - classification_loss: 0.5619 335/500 [===================>..........] - ETA: 41s - loss: 2.7507 - regression_loss: 2.1892 - classification_loss: 0.5615 336/500 [===================>..........] - ETA: 41s - loss: 2.7503 - regression_loss: 2.1889 - classification_loss: 0.5613 337/500 [===================>..........] - ETA: 40s - loss: 2.7496 - regression_loss: 2.1885 - classification_loss: 0.5611 338/500 [===================>..........] - ETA: 40s - loss: 2.7504 - regression_loss: 2.1889 - classification_loss: 0.5615 339/500 [===================>..........] - ETA: 40s - loss: 2.7506 - regression_loss: 2.1887 - classification_loss: 0.5619 340/500 [===================>..........] - ETA: 40s - loss: 2.7503 - regression_loss: 2.1886 - classification_loss: 0.5617 341/500 [===================>..........] - ETA: 39s - loss: 2.7499 - regression_loss: 2.1885 - classification_loss: 0.5614 342/500 [===================>..........] - ETA: 39s - loss: 2.7502 - regression_loss: 2.1887 - classification_loss: 0.5615 343/500 [===================>..........] - ETA: 39s - loss: 2.7496 - regression_loss: 2.1884 - classification_loss: 0.5612 344/500 [===================>..........] - ETA: 39s - loss: 2.7498 - regression_loss: 2.1884 - classification_loss: 0.5614 345/500 [===================>..........] 
- ETA: 38s - loss: 2.7501 - regression_loss: 2.1887 - classification_loss: 0.5614 346/500 [===================>..........] - ETA: 38s - loss: 2.7502 - regression_loss: 2.1892 - classification_loss: 0.5610 347/500 [===================>..........] - ETA: 38s - loss: 2.7479 - regression_loss: 2.1874 - classification_loss: 0.5605 348/500 [===================>..........] - ETA: 38s - loss: 2.7475 - regression_loss: 2.1873 - classification_loss: 0.5602 349/500 [===================>..........] - ETA: 37s - loss: 2.7477 - regression_loss: 2.1875 - classification_loss: 0.5601 350/500 [====================>.........] - ETA: 37s - loss: 2.7478 - regression_loss: 2.1874 - classification_loss: 0.5604 351/500 [====================>.........] - ETA: 37s - loss: 2.7471 - regression_loss: 2.1869 - classification_loss: 0.5602 352/500 [====================>.........] - ETA: 37s - loss: 2.7488 - regression_loss: 2.1883 - classification_loss: 0.5605 353/500 [====================>.........] - ETA: 36s - loss: 2.7491 - regression_loss: 2.1886 - classification_loss: 0.5605 354/500 [====================>.........] - ETA: 36s - loss: 2.7486 - regression_loss: 2.1885 - classification_loss: 0.5602 355/500 [====================>.........] - ETA: 36s - loss: 2.7487 - regression_loss: 2.1888 - classification_loss: 0.5600 356/500 [====================>.........] - ETA: 36s - loss: 2.7486 - regression_loss: 2.1878 - classification_loss: 0.5608 357/500 [====================>.........] - ETA: 35s - loss: 2.7488 - regression_loss: 2.1881 - classification_loss: 0.5608 358/500 [====================>.........] - ETA: 35s - loss: 2.7488 - regression_loss: 2.1882 - classification_loss: 0.5606 359/500 [====================>.........] - ETA: 35s - loss: 2.7481 - regression_loss: 2.1878 - classification_loss: 0.5603 360/500 [====================>.........] - ETA: 35s - loss: 2.7476 - regression_loss: 2.1876 - classification_loss: 0.5600 361/500 [====================>.........] 
- ETA: 34s - loss: 2.7473 - regression_loss: 2.1871 - classification_loss: 0.5602 362/500 [====================>.........] - ETA: 34s - loss: 2.7463 - regression_loss: 2.1862 - classification_loss: 0.5600 363/500 [====================>.........] - ETA: 34s - loss: 2.7471 - regression_loss: 2.1868 - classification_loss: 0.5602 364/500 [====================>.........] - ETA: 34s - loss: 2.7464 - regression_loss: 2.1864 - classification_loss: 0.5600 365/500 [====================>.........] - ETA: 33s - loss: 2.7462 - regression_loss: 2.1864 - classification_loss: 0.5598 366/500 [====================>.........] - ETA: 33s - loss: 2.7461 - regression_loss: 2.1864 - classification_loss: 0.5598 367/500 [=====================>........] - ETA: 33s - loss: 2.7453 - regression_loss: 2.1859 - classification_loss: 0.5594 368/500 [=====================>........] - ETA: 33s - loss: 2.7468 - regression_loss: 2.1869 - classification_loss: 0.5599 369/500 [=====================>........] - ETA: 32s - loss: 2.7482 - regression_loss: 2.1874 - classification_loss: 0.5608 370/500 [=====================>........] - ETA: 32s - loss: 2.7482 - regression_loss: 2.1874 - classification_loss: 0.5608 371/500 [=====================>........] - ETA: 32s - loss: 2.7480 - regression_loss: 2.1875 - classification_loss: 0.5605 372/500 [=====================>........] - ETA: 32s - loss: 2.7477 - regression_loss: 2.1873 - classification_loss: 0.5604 373/500 [=====================>........] - ETA: 31s - loss: 2.7469 - regression_loss: 2.1868 - classification_loss: 0.5601 374/500 [=====================>........] - ETA: 31s - loss: 2.7486 - regression_loss: 2.1883 - classification_loss: 0.5603 375/500 [=====================>........] - ETA: 31s - loss: 2.7476 - regression_loss: 2.1881 - classification_loss: 0.5595 376/500 [=====================>........] - ETA: 31s - loss: 2.7471 - regression_loss: 2.1878 - classification_loss: 0.5594 377/500 [=====================>........] 
- ETA: 30s - loss: 2.7469 - regression_loss: 2.1876 - classification_loss: 0.5594 378/500 [=====================>........] - ETA: 30s - loss: 2.7475 - regression_loss: 2.1877 - classification_loss: 0.5598 379/500 [=====================>........] - ETA: 30s - loss: 2.7470 - regression_loss: 2.1876 - classification_loss: 0.5594 380/500 [=====================>........] - ETA: 30s - loss: 2.7463 - regression_loss: 2.1871 - classification_loss: 0.5592 381/500 [=====================>........] - ETA: 29s - loss: 2.7450 - regression_loss: 2.1862 - classification_loss: 0.5588 382/500 [=====================>........] - ETA: 29s - loss: 2.7442 - regression_loss: 2.1854 - classification_loss: 0.5588 383/500 [=====================>........] - ETA: 29s - loss: 2.7438 - regression_loss: 2.1852 - classification_loss: 0.5586 384/500 [======================>.......] - ETA: 29s - loss: 2.7440 - regression_loss: 2.1857 - classification_loss: 0.5584 385/500 [======================>.......] - ETA: 28s - loss: 2.7438 - regression_loss: 2.1857 - classification_loss: 0.5581 386/500 [======================>.......] - ETA: 28s - loss: 2.7437 - regression_loss: 2.1855 - classification_loss: 0.5582 387/500 [======================>.......] - ETA: 28s - loss: 2.7434 - regression_loss: 2.1855 - classification_loss: 0.5579 388/500 [======================>.......] - ETA: 28s - loss: 2.7436 - regression_loss: 2.1859 - classification_loss: 0.5577 389/500 [======================>.......] - ETA: 27s - loss: 2.7444 - regression_loss: 2.1864 - classification_loss: 0.5579 390/500 [======================>.......] - ETA: 27s - loss: 2.7445 - regression_loss: 2.1862 - classification_loss: 0.5583 391/500 [======================>.......] - ETA: 27s - loss: 2.7436 - regression_loss: 2.1854 - classification_loss: 0.5582 392/500 [======================>.......] - ETA: 27s - loss: 2.7453 - regression_loss: 2.1858 - classification_loss: 0.5596 393/500 [======================>.......] 
- ETA: 26s - loss: 2.7447 - regression_loss: 2.1853 - classification_loss: 0.5594 394/500 [======================>.......] - ETA: 26s - loss: 2.7442 - regression_loss: 2.1847 - classification_loss: 0.5595 395/500 [======================>.......] - ETA: 26s - loss: 2.7454 - regression_loss: 2.1857 - classification_loss: 0.5597 396/500 [======================>.......] - ETA: 26s - loss: 2.7462 - regression_loss: 2.1863 - classification_loss: 0.5598 397/500 [======================>.......] - ETA: 25s - loss: 2.7460 - regression_loss: 2.1863 - classification_loss: 0.5597 398/500 [======================>.......] - ETA: 25s - loss: 2.7453 - regression_loss: 2.1858 - classification_loss: 0.5595 399/500 [======================>.......] - ETA: 25s - loss: 2.7451 - regression_loss: 2.1856 - classification_loss: 0.5595 400/500 [=======================>......] - ETA: 25s - loss: 2.7460 - regression_loss: 2.1863 - classification_loss: 0.5597 401/500 [=======================>......] - ETA: 24s - loss: 2.7441 - regression_loss: 2.1848 - classification_loss: 0.5594 402/500 [=======================>......] - ETA: 24s - loss: 2.7448 - regression_loss: 2.1855 - classification_loss: 0.5593 403/500 [=======================>......] - ETA: 24s - loss: 2.7454 - regression_loss: 2.1860 - classification_loss: 0.5593 404/500 [=======================>......] - ETA: 24s - loss: 2.7452 - regression_loss: 2.1859 - classification_loss: 0.5593 405/500 [=======================>......] - ETA: 23s - loss: 2.7450 - regression_loss: 2.1860 - classification_loss: 0.5590 406/500 [=======================>......] - ETA: 23s - loss: 2.7420 - regression_loss: 2.1834 - classification_loss: 0.5586 407/500 [=======================>......] - ETA: 23s - loss: 2.7429 - regression_loss: 2.1843 - classification_loss: 0.5586 408/500 [=======================>......] - ETA: 23s - loss: 2.7425 - regression_loss: 2.1841 - classification_loss: 0.5584 409/500 [=======================>......] 
- ETA: 22s - loss: 2.7424 - regression_loss: 2.1839 - classification_loss: 0.5585 410/500 [=======================>......] - ETA: 22s - loss: 2.7441 - regression_loss: 2.1845 - classification_loss: 0.5596 411/500 [=======================>......] - ETA: 22s - loss: 2.7451 - regression_loss: 2.1857 - classification_loss: 0.5594 412/500 [=======================>......] - ETA: 22s - loss: 2.7454 - regression_loss: 2.1860 - classification_loss: 0.5593 413/500 [=======================>......] - ETA: 21s - loss: 2.7448 - regression_loss: 2.1857 - classification_loss: 0.5591 414/500 [=======================>......] - ETA: 21s - loss: 2.7456 - regression_loss: 2.1868 - classification_loss: 0.5589 415/500 [=======================>......] - ETA: 21s - loss: 2.7451 - regression_loss: 2.1864 - classification_loss: 0.5587 416/500 [=======================>......] - ETA: 21s - loss: 2.7445 - regression_loss: 2.1858 - classification_loss: 0.5587 417/500 [========================>.....] - ETA: 20s - loss: 2.7444 - regression_loss: 2.1854 - classification_loss: 0.5590 418/500 [========================>.....] - ETA: 20s - loss: 2.7440 - regression_loss: 2.1853 - classification_loss: 0.5588 419/500 [========================>.....] - ETA: 20s - loss: 2.7438 - regression_loss: 2.1854 - classification_loss: 0.5584 420/500 [========================>.....] - ETA: 20s - loss: 2.7445 - regression_loss: 2.1860 - classification_loss: 0.5585 421/500 [========================>.....] - ETA: 19s - loss: 2.7450 - regression_loss: 2.1864 - classification_loss: 0.5586 422/500 [========================>.....] - ETA: 19s - loss: 2.7454 - regression_loss: 2.1867 - classification_loss: 0.5587 423/500 [========================>.....] - ETA: 19s - loss: 2.7439 - regression_loss: 2.1854 - classification_loss: 0.5585 424/500 [========================>.....] - ETA: 19s - loss: 2.7450 - regression_loss: 2.1865 - classification_loss: 0.5585 425/500 [========================>.....] 
- ETA: 18s - loss: 2.7435 - regression_loss: 2.1855 - classification_loss: 0.5580 426/500 [========================>.....] - ETA: 18s - loss: 2.7442 - regression_loss: 2.1863 - classification_loss: 0.5580 427/500 [========================>.....] - ETA: 18s - loss: 2.7440 - regression_loss: 2.1861 - classification_loss: 0.5578 428/500 [========================>.....] - ETA: 18s - loss: 2.7430 - regression_loss: 2.1855 - classification_loss: 0.5575 429/500 [========================>.....] - ETA: 17s - loss: 2.7439 - regression_loss: 2.1860 - classification_loss: 0.5579 430/500 [========================>.....] - ETA: 17s - loss: 2.7428 - regression_loss: 2.1851 - classification_loss: 0.5577 431/500 [========================>.....] - ETA: 17s - loss: 2.7431 - regression_loss: 2.1855 - classification_loss: 0.5576 432/500 [========================>.....] - ETA: 17s - loss: 2.7425 - regression_loss: 2.1851 - classification_loss: 0.5574 433/500 [========================>.....] - ETA: 16s - loss: 2.7418 - regression_loss: 2.1846 - classification_loss: 0.5572 434/500 [=========================>....] - ETA: 16s - loss: 2.7408 - regression_loss: 2.1840 - classification_loss: 0.5568 435/500 [=========================>....] - ETA: 16s - loss: 2.7405 - regression_loss: 2.1838 - classification_loss: 0.5567 436/500 [=========================>....] - ETA: 16s - loss: 2.7393 - regression_loss: 2.1830 - classification_loss: 0.5563 437/500 [=========================>....] - ETA: 15s - loss: 2.7417 - regression_loss: 2.1825 - classification_loss: 0.5592 438/500 [=========================>....] - ETA: 15s - loss: 2.7409 - regression_loss: 2.1820 - classification_loss: 0.5589 439/500 [=========================>....] - ETA: 15s - loss: 2.7413 - regression_loss: 2.1821 - classification_loss: 0.5591 440/500 [=========================>....] - ETA: 15s - loss: 2.7421 - regression_loss: 2.1828 - classification_loss: 0.5593 441/500 [=========================>....] 
- ETA: 14s - loss: 2.7436 - regression_loss: 2.1842 - classification_loss: 0.5594 442/500 [=========================>....] - ETA: 14s - loss: 2.7431 - regression_loss: 2.1840 - classification_loss: 0.5591 443/500 [=========================>....] - ETA: 14s - loss: 2.7426 - regression_loss: 2.1837 - classification_loss: 0.5589 444/500 [=========================>....] - ETA: 14s - loss: 2.7421 - regression_loss: 2.1834 - classification_loss: 0.5587 445/500 [=========================>....] - ETA: 13s - loss: 2.7414 - regression_loss: 2.1830 - classification_loss: 0.5584 446/500 [=========================>....] - ETA: 13s - loss: 2.7410 - regression_loss: 2.1828 - classification_loss: 0.5582 447/500 [=========================>....] - ETA: 13s - loss: 2.7404 - regression_loss: 2.1823 - classification_loss: 0.5581 448/500 [=========================>....] - ETA: 13s - loss: 2.7399 - regression_loss: 2.1821 - classification_loss: 0.5579 449/500 [=========================>....] - ETA: 12s - loss: 2.7392 - regression_loss: 2.1816 - classification_loss: 0.5576 450/500 [==========================>...] - ETA: 12s - loss: 2.7391 - regression_loss: 2.1816 - classification_loss: 0.5575 451/500 [==========================>...] - ETA: 12s - loss: 2.7384 - regression_loss: 2.1812 - classification_loss: 0.5572 452/500 [==========================>...] - ETA: 12s - loss: 2.7394 - regression_loss: 2.1823 - classification_loss: 0.5571 453/500 [==========================>...] - ETA: 11s - loss: 2.7383 - regression_loss: 2.1813 - classification_loss: 0.5571 454/500 [==========================>...] - ETA: 11s - loss: 2.7380 - regression_loss: 2.1812 - classification_loss: 0.5568 455/500 [==========================>...] - ETA: 11s - loss: 2.7385 - regression_loss: 2.1814 - classification_loss: 0.5571 456/500 [==========================>...] - ETA: 11s - loss: 2.7384 - regression_loss: 2.1813 - classification_loss: 0.5571 457/500 [==========================>...] 
- ETA: 10s - loss: 2.7382 - regression_loss: 2.1812 - classification_loss: 0.5570 458/500 [==========================>...] - ETA: 10s - loss: 2.7380 - regression_loss: 2.1813 - classification_loss: 0.5567 459/500 [==========================>...] - ETA: 10s - loss: 2.7382 - regression_loss: 2.1815 - classification_loss: 0.5568 460/500 [==========================>...] - ETA: 10s - loss: 2.7381 - regression_loss: 2.1814 - classification_loss: 0.5567 461/500 [==========================>...] - ETA: 9s - loss: 2.7390 - regression_loss: 2.1821 - classification_loss: 0.5569  462/500 [==========================>...] - ETA: 9s - loss: 2.7384 - regression_loss: 2.1818 - classification_loss: 0.5566 463/500 [==========================>...] - ETA: 9s - loss: 2.7381 - regression_loss: 2.1817 - classification_loss: 0.5564 464/500 [==========================>...] - ETA: 9s - loss: 2.7378 - regression_loss: 2.1816 - classification_loss: 0.5562 465/500 [==========================>...] - ETA: 8s - loss: 2.7382 - regression_loss: 2.1820 - classification_loss: 0.5562 466/500 [==========================>...] - ETA: 8s - loss: 2.7394 - regression_loss: 2.1831 - classification_loss: 0.5563 467/500 [===========================>..] - ETA: 8s - loss: 2.7384 - regression_loss: 2.1824 - classification_loss: 0.5560 468/500 [===========================>..] - ETA: 8s - loss: 2.7376 - regression_loss: 2.1817 - classification_loss: 0.5559 469/500 [===========================>..] - ETA: 7s - loss: 2.7378 - regression_loss: 2.1822 - classification_loss: 0.5556 470/500 [===========================>..] - ETA: 7s - loss: 2.7382 - regression_loss: 2.1828 - classification_loss: 0.5554 471/500 [===========================>..] - ETA: 7s - loss: 2.7421 - regression_loss: 2.1848 - classification_loss: 0.5573 472/500 [===========================>..] - ETA: 7s - loss: 2.7426 - regression_loss: 2.1852 - classification_loss: 0.5575 473/500 [===========================>..] 
[epoch 8 steps 474-499: per-step progress updates omitted; loss steady around 2.74]
500/500 [==============================] - 125s 250ms/step - loss: 2.7385 - regression_loss: 2.1807 - classification_loss: 0.5579
1172 instances of class plum with average precision: 0.2447
mAP: 0.2447
Epoch 00008: saving model to ./training/snapshots/resnet50_pascal_08.h5
Epoch 9/150
[epoch 9 steps 1-308: per-step progress updates omitted; loss fluctuating around 2.65-2.67]
- ETA: 47s - loss: 2.6582 - regression_loss: 2.1268 - classification_loss: 0.5314 309/500 [=================>............] - ETA: 47s - loss: 2.6554 - regression_loss: 2.1248 - classification_loss: 0.5307 310/500 [=================>............] - ETA: 47s - loss: 2.6553 - regression_loss: 2.1247 - classification_loss: 0.5305 311/500 [=================>............] - ETA: 47s - loss: 2.6548 - regression_loss: 2.1244 - classification_loss: 0.5304 312/500 [=================>............] - ETA: 46s - loss: 2.6540 - regression_loss: 2.1240 - classification_loss: 0.5301 313/500 [=================>............] - ETA: 46s - loss: 2.6538 - regression_loss: 2.1235 - classification_loss: 0.5303 314/500 [=================>............] - ETA: 46s - loss: 2.6547 - regression_loss: 2.1243 - classification_loss: 0.5305 315/500 [=================>............] - ETA: 46s - loss: 2.6531 - regression_loss: 2.1227 - classification_loss: 0.5304 316/500 [=================>............] - ETA: 45s - loss: 2.6554 - regression_loss: 2.1248 - classification_loss: 0.5307 317/500 [==================>...........] - ETA: 45s - loss: 2.6554 - regression_loss: 2.1249 - classification_loss: 0.5305 318/500 [==================>...........] - ETA: 45s - loss: 2.6546 - regression_loss: 2.1239 - classification_loss: 0.5306 319/500 [==================>...........] - ETA: 45s - loss: 2.6549 - regression_loss: 2.1241 - classification_loss: 0.5308 320/500 [==================>...........] - ETA: 44s - loss: 2.6554 - regression_loss: 2.1246 - classification_loss: 0.5308 321/500 [==================>...........] - ETA: 44s - loss: 2.6527 - regression_loss: 2.1229 - classification_loss: 0.5298 322/500 [==================>...........] - ETA: 44s - loss: 2.6523 - regression_loss: 2.1225 - classification_loss: 0.5299 323/500 [==================>...........] - ETA: 44s - loss: 2.6525 - regression_loss: 2.1228 - classification_loss: 0.5297 324/500 [==================>...........] 
- ETA: 43s - loss: 2.6537 - regression_loss: 2.1243 - classification_loss: 0.5294 325/500 [==================>...........] - ETA: 43s - loss: 2.6540 - regression_loss: 2.1244 - classification_loss: 0.5296 326/500 [==================>...........] - ETA: 43s - loss: 2.6532 - regression_loss: 2.1239 - classification_loss: 0.5293 327/500 [==================>...........] - ETA: 43s - loss: 2.6521 - regression_loss: 2.1230 - classification_loss: 0.5291 328/500 [==================>...........] - ETA: 42s - loss: 2.6524 - regression_loss: 2.1229 - classification_loss: 0.5294 329/500 [==================>...........] - ETA: 42s - loss: 2.6508 - regression_loss: 2.1217 - classification_loss: 0.5291 330/500 [==================>...........] - ETA: 42s - loss: 2.6504 - regression_loss: 2.1216 - classification_loss: 0.5288 331/500 [==================>...........] - ETA: 42s - loss: 2.6504 - regression_loss: 2.1213 - classification_loss: 0.5291 332/500 [==================>...........] - ETA: 41s - loss: 2.6512 - regression_loss: 2.1216 - classification_loss: 0.5296 333/500 [==================>...........] - ETA: 41s - loss: 2.6505 - regression_loss: 2.1212 - classification_loss: 0.5293 334/500 [===================>..........] - ETA: 41s - loss: 2.6515 - regression_loss: 2.1218 - classification_loss: 0.5297 335/500 [===================>..........] - ETA: 41s - loss: 2.6537 - regression_loss: 2.1239 - classification_loss: 0.5298 336/500 [===================>..........] - ETA: 40s - loss: 2.6520 - regression_loss: 2.1227 - classification_loss: 0.5293 337/500 [===================>..........] - ETA: 40s - loss: 2.6510 - regression_loss: 2.1218 - classification_loss: 0.5292 338/500 [===================>..........] - ETA: 40s - loss: 2.6507 - regression_loss: 2.1216 - classification_loss: 0.5291 339/500 [===================>..........] - ETA: 40s - loss: 2.6500 - regression_loss: 2.1211 - classification_loss: 0.5289 340/500 [===================>..........] 
- ETA: 39s - loss: 2.6508 - regression_loss: 2.1209 - classification_loss: 0.5299 341/500 [===================>..........] - ETA: 39s - loss: 2.6507 - regression_loss: 2.1208 - classification_loss: 0.5298 342/500 [===================>..........] - ETA: 39s - loss: 2.6498 - regression_loss: 2.1201 - classification_loss: 0.5296 343/500 [===================>..........] - ETA: 39s - loss: 2.6494 - regression_loss: 2.1201 - classification_loss: 0.5294 344/500 [===================>..........] - ETA: 38s - loss: 2.6506 - regression_loss: 2.1212 - classification_loss: 0.5295 345/500 [===================>..........] - ETA: 38s - loss: 2.6513 - regression_loss: 2.1219 - classification_loss: 0.5294 346/500 [===================>..........] - ETA: 38s - loss: 2.6504 - regression_loss: 2.1213 - classification_loss: 0.5292 347/500 [===================>..........] - ETA: 38s - loss: 2.6511 - regression_loss: 2.1216 - classification_loss: 0.5295 348/500 [===================>..........] - ETA: 37s - loss: 2.6497 - regression_loss: 2.1204 - classification_loss: 0.5292 349/500 [===================>..........] - ETA: 37s - loss: 2.6489 - regression_loss: 2.1194 - classification_loss: 0.5295 350/500 [====================>.........] - ETA: 37s - loss: 2.6485 - regression_loss: 2.1192 - classification_loss: 0.5293 351/500 [====================>.........] - ETA: 37s - loss: 2.6507 - regression_loss: 2.1206 - classification_loss: 0.5301 352/500 [====================>.........] - ETA: 36s - loss: 2.6502 - regression_loss: 2.1202 - classification_loss: 0.5300 353/500 [====================>.........] - ETA: 36s - loss: 2.6513 - regression_loss: 2.1209 - classification_loss: 0.5304 354/500 [====================>.........] - ETA: 36s - loss: 2.6512 - regression_loss: 2.1209 - classification_loss: 0.5303 355/500 [====================>.........] - ETA: 36s - loss: 2.6520 - regression_loss: 2.1216 - classification_loss: 0.5304 356/500 [====================>.........] 
- ETA: 35s - loss: 2.6519 - regression_loss: 2.1214 - classification_loss: 0.5305 357/500 [====================>.........] - ETA: 35s - loss: 2.6508 - regression_loss: 2.1204 - classification_loss: 0.5304 358/500 [====================>.........] - ETA: 35s - loss: 2.6512 - regression_loss: 2.1210 - classification_loss: 0.5302 359/500 [====================>.........] - ETA: 35s - loss: 2.6507 - regression_loss: 2.1206 - classification_loss: 0.5300 360/500 [====================>.........] - ETA: 34s - loss: 2.6508 - regression_loss: 2.1208 - classification_loss: 0.5301 361/500 [====================>.........] - ETA: 34s - loss: 2.6515 - regression_loss: 2.1215 - classification_loss: 0.5300 362/500 [====================>.........] - ETA: 34s - loss: 2.6516 - regression_loss: 2.1216 - classification_loss: 0.5300 363/500 [====================>.........] - ETA: 34s - loss: 2.6519 - regression_loss: 2.1220 - classification_loss: 0.5300 364/500 [====================>.........] - ETA: 33s - loss: 2.6516 - regression_loss: 2.1219 - classification_loss: 0.5297 365/500 [====================>.........] - ETA: 33s - loss: 2.6512 - regression_loss: 2.1217 - classification_loss: 0.5295 366/500 [====================>.........] - ETA: 33s - loss: 2.6526 - regression_loss: 2.1227 - classification_loss: 0.5298 367/500 [=====================>........] - ETA: 33s - loss: 2.6534 - regression_loss: 2.1233 - classification_loss: 0.5302 368/500 [=====================>........] - ETA: 32s - loss: 2.6507 - regression_loss: 2.1208 - classification_loss: 0.5299 369/500 [=====================>........] - ETA: 32s - loss: 2.6503 - regression_loss: 2.1207 - classification_loss: 0.5296 370/500 [=====================>........] - ETA: 32s - loss: 2.6499 - regression_loss: 2.1204 - classification_loss: 0.5295 371/500 [=====================>........] - ETA: 32s - loss: 2.6495 - regression_loss: 2.1203 - classification_loss: 0.5292 372/500 [=====================>........] 
- ETA: 31s - loss: 2.6478 - regression_loss: 2.1190 - classification_loss: 0.5288 373/500 [=====================>........] - ETA: 31s - loss: 2.6471 - regression_loss: 2.1185 - classification_loss: 0.5286 374/500 [=====================>........] - ETA: 31s - loss: 2.6481 - regression_loss: 2.1193 - classification_loss: 0.5287 375/500 [=====================>........] - ETA: 31s - loss: 2.6476 - regression_loss: 2.1192 - classification_loss: 0.5284 376/500 [=====================>........] - ETA: 30s - loss: 2.6482 - regression_loss: 2.1193 - classification_loss: 0.5289 377/500 [=====================>........] - ETA: 30s - loss: 2.6478 - regression_loss: 2.1192 - classification_loss: 0.5286 378/500 [=====================>........] - ETA: 30s - loss: 2.6505 - regression_loss: 2.1203 - classification_loss: 0.5302 379/500 [=====================>........] - ETA: 30s - loss: 2.6500 - regression_loss: 2.1199 - classification_loss: 0.5301 380/500 [=====================>........] - ETA: 29s - loss: 2.6496 - regression_loss: 2.1195 - classification_loss: 0.5301 381/500 [=====================>........] - ETA: 29s - loss: 2.6496 - regression_loss: 2.1196 - classification_loss: 0.5301 382/500 [=====================>........] - ETA: 29s - loss: 2.6470 - regression_loss: 2.1173 - classification_loss: 0.5297 383/500 [=====================>........] - ETA: 29s - loss: 2.6464 - regression_loss: 2.1170 - classification_loss: 0.5294 384/500 [======================>.......] - ETA: 28s - loss: 2.6474 - regression_loss: 2.1176 - classification_loss: 0.5298 385/500 [======================>.......] - ETA: 28s - loss: 2.6487 - regression_loss: 2.1186 - classification_loss: 0.5302 386/500 [======================>.......] - ETA: 28s - loss: 2.6495 - regression_loss: 2.1194 - classification_loss: 0.5301 387/500 [======================>.......] - ETA: 28s - loss: 2.6490 - regression_loss: 2.1192 - classification_loss: 0.5298 388/500 [======================>.......] 
- ETA: 27s - loss: 2.6492 - regression_loss: 2.1195 - classification_loss: 0.5297 389/500 [======================>.......] - ETA: 27s - loss: 2.6479 - regression_loss: 2.1186 - classification_loss: 0.5292 390/500 [======================>.......] - ETA: 27s - loss: 2.6474 - regression_loss: 2.1185 - classification_loss: 0.5289 391/500 [======================>.......] - ETA: 27s - loss: 2.6473 - regression_loss: 2.1185 - classification_loss: 0.5288 392/500 [======================>.......] - ETA: 26s - loss: 2.6480 - regression_loss: 2.1185 - classification_loss: 0.5295 393/500 [======================>.......] - ETA: 26s - loss: 2.6486 - regression_loss: 2.1190 - classification_loss: 0.5296 394/500 [======================>.......] - ETA: 26s - loss: 2.6494 - regression_loss: 2.1198 - classification_loss: 0.5296 395/500 [======================>.......] - ETA: 26s - loss: 2.6500 - regression_loss: 2.1203 - classification_loss: 0.5297 396/500 [======================>.......] - ETA: 25s - loss: 2.6499 - regression_loss: 2.1202 - classification_loss: 0.5298 397/500 [======================>.......] - ETA: 25s - loss: 2.6504 - regression_loss: 2.1205 - classification_loss: 0.5299 398/500 [======================>.......] - ETA: 25s - loss: 2.6529 - regression_loss: 2.1224 - classification_loss: 0.5304 399/500 [======================>.......] - ETA: 25s - loss: 2.6526 - regression_loss: 2.1225 - classification_loss: 0.5301 400/500 [=======================>......] - ETA: 24s - loss: 2.6519 - regression_loss: 2.1217 - classification_loss: 0.5302 401/500 [=======================>......] - ETA: 24s - loss: 2.6519 - regression_loss: 2.1217 - classification_loss: 0.5302 402/500 [=======================>......] - ETA: 24s - loss: 2.6516 - regression_loss: 2.1216 - classification_loss: 0.5300 403/500 [=======================>......] - ETA: 24s - loss: 2.6511 - regression_loss: 2.1214 - classification_loss: 0.5297 404/500 [=======================>......] 
- ETA: 23s - loss: 2.6506 - regression_loss: 2.1211 - classification_loss: 0.5295 405/500 [=======================>......] - ETA: 23s - loss: 2.6501 - regression_loss: 2.1207 - classification_loss: 0.5294 406/500 [=======================>......] - ETA: 23s - loss: 2.6494 - regression_loss: 2.1203 - classification_loss: 0.5291 407/500 [=======================>......] - ETA: 23s - loss: 2.6493 - regression_loss: 2.1203 - classification_loss: 0.5290 408/500 [=======================>......] - ETA: 22s - loss: 2.6492 - regression_loss: 2.1204 - classification_loss: 0.5288 409/500 [=======================>......] - ETA: 22s - loss: 2.6505 - regression_loss: 2.1217 - classification_loss: 0.5289 410/500 [=======================>......] - ETA: 22s - loss: 2.6499 - regression_loss: 2.1213 - classification_loss: 0.5286 411/500 [=======================>......] - ETA: 22s - loss: 2.6495 - regression_loss: 2.1212 - classification_loss: 0.5283 412/500 [=======================>......] - ETA: 21s - loss: 2.6509 - regression_loss: 2.1227 - classification_loss: 0.5282 413/500 [=======================>......] - ETA: 21s - loss: 2.6502 - regression_loss: 2.1221 - classification_loss: 0.5281 414/500 [=======================>......] - ETA: 21s - loss: 2.6504 - regression_loss: 2.1224 - classification_loss: 0.5280 415/500 [=======================>......] - ETA: 21s - loss: 2.6524 - regression_loss: 2.1238 - classification_loss: 0.5285 416/500 [=======================>......] - ETA: 20s - loss: 2.6520 - regression_loss: 2.1238 - classification_loss: 0.5282 417/500 [========================>.....] - ETA: 20s - loss: 2.6515 - regression_loss: 2.1234 - classification_loss: 0.5281 418/500 [========================>.....] - ETA: 20s - loss: 2.6517 - regression_loss: 2.1235 - classification_loss: 0.5282 419/500 [========================>.....] - ETA: 20s - loss: 2.6520 - regression_loss: 2.1238 - classification_loss: 0.5282 420/500 [========================>.....] 
- ETA: 19s - loss: 2.6524 - regression_loss: 2.1242 - classification_loss: 0.5282 421/500 [========================>.....] - ETA: 19s - loss: 2.6508 - regression_loss: 2.1230 - classification_loss: 0.5277 422/500 [========================>.....] - ETA: 19s - loss: 2.6517 - regression_loss: 2.1236 - classification_loss: 0.5282 423/500 [========================>.....] - ETA: 19s - loss: 2.6511 - regression_loss: 2.1230 - classification_loss: 0.5281 424/500 [========================>.....] - ETA: 18s - loss: 2.6611 - regression_loss: 2.1259 - classification_loss: 0.5351 425/500 [========================>.....] - ETA: 18s - loss: 2.6608 - regression_loss: 2.1259 - classification_loss: 0.5349 426/500 [========================>.....] - ETA: 18s - loss: 2.6649 - regression_loss: 2.1286 - classification_loss: 0.5364 427/500 [========================>.....] - ETA: 18s - loss: 2.6649 - regression_loss: 2.1286 - classification_loss: 0.5364 428/500 [========================>.....] - ETA: 17s - loss: 2.6652 - regression_loss: 2.1287 - classification_loss: 0.5365 429/500 [========================>.....] - ETA: 17s - loss: 2.6661 - regression_loss: 2.1294 - classification_loss: 0.5367 430/500 [========================>.....] - ETA: 17s - loss: 2.6657 - regression_loss: 2.1290 - classification_loss: 0.5367 431/500 [========================>.....] - ETA: 17s - loss: 2.6655 - regression_loss: 2.1289 - classification_loss: 0.5365 432/500 [========================>.....] - ETA: 16s - loss: 2.6641 - regression_loss: 2.1281 - classification_loss: 0.5361 433/500 [========================>.....] - ETA: 16s - loss: 2.6651 - regression_loss: 2.1284 - classification_loss: 0.5367 434/500 [=========================>....] - ETA: 16s - loss: 2.6654 - regression_loss: 2.1286 - classification_loss: 0.5368 435/500 [=========================>....] - ETA: 16s - loss: 2.6642 - regression_loss: 2.1278 - classification_loss: 0.5364 436/500 [=========================>....] 
- ETA: 15s - loss: 2.6661 - regression_loss: 2.1297 - classification_loss: 0.5363 437/500 [=========================>....] - ETA: 15s - loss: 2.6656 - regression_loss: 2.1295 - classification_loss: 0.5361 438/500 [=========================>....] - ETA: 15s - loss: 2.6661 - regression_loss: 2.1301 - classification_loss: 0.5360 439/500 [=========================>....] - ETA: 15s - loss: 2.6647 - regression_loss: 2.1290 - classification_loss: 0.5357 440/500 [=========================>....] - ETA: 14s - loss: 2.6637 - regression_loss: 2.1284 - classification_loss: 0.5354 441/500 [=========================>....] - ETA: 14s - loss: 2.6638 - regression_loss: 2.1287 - classification_loss: 0.5352 442/500 [=========================>....] - ETA: 14s - loss: 2.6643 - regression_loss: 2.1291 - classification_loss: 0.5352 443/500 [=========================>....] - ETA: 14s - loss: 2.6641 - regression_loss: 2.1291 - classification_loss: 0.5350 444/500 [=========================>....] - ETA: 13s - loss: 2.6652 - regression_loss: 2.1300 - classification_loss: 0.5352 445/500 [=========================>....] - ETA: 13s - loss: 2.6653 - regression_loss: 2.1301 - classification_loss: 0.5353 446/500 [=========================>....] - ETA: 13s - loss: 2.6655 - regression_loss: 2.1302 - classification_loss: 0.5353 447/500 [=========================>....] - ETA: 13s - loss: 2.6633 - regression_loss: 2.1287 - classification_loss: 0.5346 448/500 [=========================>....] - ETA: 12s - loss: 2.6632 - regression_loss: 2.1285 - classification_loss: 0.5346 449/500 [=========================>....] - ETA: 12s - loss: 2.6633 - regression_loss: 2.1288 - classification_loss: 0.5345 450/500 [==========================>...] - ETA: 12s - loss: 2.6629 - regression_loss: 2.1286 - classification_loss: 0.5343 451/500 [==========================>...] - ETA: 12s - loss: 2.6631 - regression_loss: 2.1287 - classification_loss: 0.5344 452/500 [==========================>...] 
- ETA: 11s - loss: 2.6623 - regression_loss: 2.1282 - classification_loss: 0.5341 453/500 [==========================>...] - ETA: 11s - loss: 2.6628 - regression_loss: 2.1286 - classification_loss: 0.5342 454/500 [==========================>...] - ETA: 11s - loss: 2.6629 - regression_loss: 2.1286 - classification_loss: 0.5343 455/500 [==========================>...] - ETA: 11s - loss: 2.6623 - regression_loss: 2.1283 - classification_loss: 0.5340 456/500 [==========================>...] - ETA: 10s - loss: 2.6610 - regression_loss: 2.1273 - classification_loss: 0.5337 457/500 [==========================>...] - ETA: 10s - loss: 2.6583 - regression_loss: 2.1252 - classification_loss: 0.5331 458/500 [==========================>...] - ETA: 10s - loss: 2.6586 - regression_loss: 2.1256 - classification_loss: 0.5329 459/500 [==========================>...] - ETA: 10s - loss: 2.6584 - regression_loss: 2.1257 - classification_loss: 0.5327 460/500 [==========================>...] - ETA: 9s - loss: 2.6607 - regression_loss: 2.1268 - classification_loss: 0.5339  461/500 [==========================>...] - ETA: 9s - loss: 2.6605 - regression_loss: 2.1267 - classification_loss: 0.5338 462/500 [==========================>...] - ETA: 9s - loss: 2.6605 - regression_loss: 2.1268 - classification_loss: 0.5336 463/500 [==========================>...] - ETA: 9s - loss: 2.6594 - regression_loss: 2.1260 - classification_loss: 0.5333 464/500 [==========================>...] - ETA: 8s - loss: 2.6580 - regression_loss: 2.1248 - classification_loss: 0.5331 465/500 [==========================>...] - ETA: 8s - loss: 2.6577 - regression_loss: 2.1249 - classification_loss: 0.5328 466/500 [==========================>...] - ETA: 8s - loss: 2.6565 - regression_loss: 2.1242 - classification_loss: 0.5323 467/500 [===========================>..] - ETA: 8s - loss: 2.6565 - regression_loss: 2.1243 - classification_loss: 0.5322 468/500 [===========================>..] 
- ETA: 7s - loss: 2.6564 - regression_loss: 2.1237 - classification_loss: 0.5326 469/500 [===========================>..] - ETA: 7s - loss: 2.6559 - regression_loss: 2.1235 - classification_loss: 0.5324 470/500 [===========================>..] - ETA: 7s - loss: 2.6562 - regression_loss: 2.1237 - classification_loss: 0.5326 471/500 [===========================>..] - ETA: 7s - loss: 2.6573 - regression_loss: 2.1244 - classification_loss: 0.5328 472/500 [===========================>..] - ETA: 6s - loss: 2.6568 - regression_loss: 2.1240 - classification_loss: 0.5328 473/500 [===========================>..] - ETA: 6s - loss: 2.6568 - regression_loss: 2.1239 - classification_loss: 0.5329 474/500 [===========================>..] - ETA: 6s - loss: 2.6568 - regression_loss: 2.1240 - classification_loss: 0.5328 475/500 [===========================>..] - ETA: 6s - loss: 2.6566 - regression_loss: 2.1237 - classification_loss: 0.5329 476/500 [===========================>..] - ETA: 5s - loss: 2.6562 - regression_loss: 2.1234 - classification_loss: 0.5327 477/500 [===========================>..] - ETA: 5s - loss: 2.6565 - regression_loss: 2.1236 - classification_loss: 0.5329 478/500 [===========================>..] - ETA: 5s - loss: 2.6565 - regression_loss: 2.1236 - classification_loss: 0.5329 479/500 [===========================>..] - ETA: 5s - loss: 2.6569 - regression_loss: 2.1240 - classification_loss: 0.5329 480/500 [===========================>..] - ETA: 4s - loss: 2.6564 - regression_loss: 2.1237 - classification_loss: 0.5327 481/500 [===========================>..] - ETA: 4s - loss: 2.6567 - regression_loss: 2.1234 - classification_loss: 0.5333 482/500 [===========================>..] - ETA: 4s - loss: 2.6575 - regression_loss: 2.1228 - classification_loss: 0.5348 483/500 [===========================>..] - ETA: 4s - loss: 2.6590 - regression_loss: 2.1241 - classification_loss: 0.5349 484/500 [============================>.] 
- ETA: 3s - loss: 2.6584 - regression_loss: 2.1227 - classification_loss: 0.5358 485/500 [============================>.] - ETA: 3s - loss: 2.6587 - regression_loss: 2.1228 - classification_loss: 0.5359 486/500 [============================>.] - ETA: 3s - loss: 2.6595 - regression_loss: 2.1237 - classification_loss: 0.5358 487/500 [============================>.] - ETA: 3s - loss: 2.6582 - regression_loss: 2.1228 - classification_loss: 0.5354 488/500 [============================>.] - ETA: 2s - loss: 2.6578 - regression_loss: 2.1226 - classification_loss: 0.5351 489/500 [============================>.] - ETA: 2s - loss: 2.6571 - regression_loss: 2.1222 - classification_loss: 0.5349 490/500 [============================>.] - ETA: 2s - loss: 2.6576 - regression_loss: 2.1227 - classification_loss: 0.5348 491/500 [============================>.] - ETA: 2s - loss: 2.6570 - regression_loss: 2.1224 - classification_loss: 0.5346 492/500 [============================>.] - ETA: 1s - loss: 2.6564 - regression_loss: 2.1223 - classification_loss: 0.5341 493/500 [============================>.] - ETA: 1s - loss: 2.6581 - regression_loss: 2.1234 - classification_loss: 0.5347 494/500 [============================>.] - ETA: 1s - loss: 2.6573 - regression_loss: 2.1227 - classification_loss: 0.5346 495/500 [============================>.] - ETA: 1s - loss: 2.6586 - regression_loss: 2.1237 - classification_loss: 0.5349 496/500 [============================>.] - ETA: 0s - loss: 2.6580 - regression_loss: 2.1233 - classification_loss: 0.5347 497/500 [============================>.] - ETA: 0s - loss: 2.6577 - regression_loss: 2.1231 - classification_loss: 0.5346 498/500 [============================>.] - ETA: 0s - loss: 2.6578 - regression_loss: 2.1230 - classification_loss: 0.5348 499/500 [============================>.] 
- ETA: 0s - loss: 2.6573 - regression_loss: 2.1228 - classification_loss: 0.5346 500/500 [==============================] - 125s 249ms/step - loss: 2.6580 - regression_loss: 2.1234 - classification_loss: 0.5347 1172 instances of class plum with average precision: 0.2720 mAP: 0.2720 Epoch 00009: saving model to ./training/snapshots/resnet50_pascal_09.h5 Epoch 10/150 1/500 [..............................] - ETA: 2:05 - loss: 2.4813 - regression_loss: 2.0037 - classification_loss: 0.4775 2/500 [..............................] - ETA: 2:04 - loss: 2.3973 - regression_loss: 1.8722 - classification_loss: 0.5251 3/500 [..............................] - ETA: 2:04 - loss: 2.3718 - regression_loss: 1.8895 - classification_loss: 0.4823 4/500 [..............................] - ETA: 2:04 - loss: 2.4723 - regression_loss: 1.9836 - classification_loss: 0.4886 5/500 [..............................] - ETA: 2:04 - loss: 2.4554 - regression_loss: 1.9747 - classification_loss: 0.4808 6/500 [..............................] - ETA: 2:04 - loss: 2.4527 - regression_loss: 1.9816 - classification_loss: 0.4710 7/500 [..............................] - ETA: 2:04 - loss: 2.4833 - regression_loss: 2.0051 - classification_loss: 0.4782 8/500 [..............................] - ETA: 2:04 - loss: 2.5090 - regression_loss: 2.0380 - classification_loss: 0.4710 9/500 [..............................] - ETA: 2:03 - loss: 2.4948 - regression_loss: 2.0338 - classification_loss: 0.4610 10/500 [..............................] - ETA: 2:03 - loss: 2.5562 - regression_loss: 2.0348 - classification_loss: 0.5214 11/500 [..............................] - ETA: 2:03 - loss: 2.5518 - regression_loss: 2.0368 - classification_loss: 0.5150 12/500 [..............................] - ETA: 2:03 - loss: 2.5153 - regression_loss: 2.0131 - classification_loss: 0.5023 13/500 [..............................] - ETA: 2:03 - loss: 2.5369 - regression_loss: 2.0201 - classification_loss: 0.5168 14/500 [..............................] 
 15/500 [..............................] - ETA: 2:02 - loss: 2.5184 - regression_loss: 1.9997 - classification_loss: 0.5186
[per-step progress lines for steps 16–348 omitted; loss fluctuated between ~2.48 and ~2.66, with regression_loss near 2.10 and classification_loss near 0.53, as the ETA fell from 2:02 to 37s]
349/500 [===================>..........] - ETA: 37s - loss: 2.6233 - regression_loss: 2.0990 - classification_loss: 0.5243
- ETA: 37s - loss: 2.6226 - regression_loss: 2.0985 - classification_loss: 0.5241 351/500 [====================>.........] - ETA: 37s - loss: 2.6227 - regression_loss: 2.0986 - classification_loss: 0.5241 352/500 [====================>.........] - ETA: 36s - loss: 2.6264 - regression_loss: 2.1004 - classification_loss: 0.5260 353/500 [====================>.........] - ETA: 36s - loss: 2.6253 - regression_loss: 2.0998 - classification_loss: 0.5256 354/500 [====================>.........] - ETA: 36s - loss: 2.6247 - regression_loss: 2.0992 - classification_loss: 0.5255 355/500 [====================>.........] - ETA: 36s - loss: 2.6272 - regression_loss: 2.1011 - classification_loss: 0.5261 356/500 [====================>.........] - ETA: 35s - loss: 2.6275 - regression_loss: 2.1015 - classification_loss: 0.5260 357/500 [====================>.........] - ETA: 35s - loss: 2.6271 - regression_loss: 2.1011 - classification_loss: 0.5260 358/500 [====================>.........] - ETA: 35s - loss: 2.6274 - regression_loss: 2.1012 - classification_loss: 0.5262 359/500 [====================>.........] - ETA: 35s - loss: 2.6271 - regression_loss: 2.1010 - classification_loss: 0.5261 360/500 [====================>.........] - ETA: 34s - loss: 2.6266 - regression_loss: 2.1004 - classification_loss: 0.5262 361/500 [====================>.........] - ETA: 34s - loss: 2.6264 - regression_loss: 2.1003 - classification_loss: 0.5261 362/500 [====================>.........] - ETA: 34s - loss: 2.6256 - regression_loss: 2.0997 - classification_loss: 0.5259 363/500 [====================>.........] - ETA: 34s - loss: 2.6259 - regression_loss: 2.0999 - classification_loss: 0.5260 364/500 [====================>.........] - ETA: 33s - loss: 2.6272 - regression_loss: 2.1005 - classification_loss: 0.5267 365/500 [====================>.........] - ETA: 33s - loss: 2.6285 - regression_loss: 2.1018 - classification_loss: 0.5267 366/500 [====================>.........] 
- ETA: 33s - loss: 2.6289 - regression_loss: 2.1022 - classification_loss: 0.5268 367/500 [=====================>........] - ETA: 33s - loss: 2.6284 - regression_loss: 2.1020 - classification_loss: 0.5265 368/500 [=====================>........] - ETA: 32s - loss: 2.6271 - regression_loss: 2.1004 - classification_loss: 0.5268 369/500 [=====================>........] - ETA: 32s - loss: 2.6283 - regression_loss: 2.1015 - classification_loss: 0.5269 370/500 [=====================>........] - ETA: 32s - loss: 2.6280 - regression_loss: 2.1010 - classification_loss: 0.5270 371/500 [=====================>........] - ETA: 32s - loss: 2.6267 - regression_loss: 2.1002 - classification_loss: 0.5266 372/500 [=====================>........] - ETA: 31s - loss: 2.6257 - regression_loss: 2.0996 - classification_loss: 0.5261 373/500 [=====================>........] - ETA: 31s - loss: 2.6275 - regression_loss: 2.1008 - classification_loss: 0.5268 374/500 [=====================>........] - ETA: 31s - loss: 2.6273 - regression_loss: 2.1008 - classification_loss: 0.5265 375/500 [=====================>........] - ETA: 31s - loss: 2.6253 - regression_loss: 2.0986 - classification_loss: 0.5266 376/500 [=====================>........] - ETA: 30s - loss: 2.6242 - regression_loss: 2.0980 - classification_loss: 0.5261 377/500 [=====================>........] - ETA: 30s - loss: 2.6236 - regression_loss: 2.0976 - classification_loss: 0.5259 378/500 [=====================>........] - ETA: 30s - loss: 2.6241 - regression_loss: 2.0982 - classification_loss: 0.5259 379/500 [=====================>........] - ETA: 30s - loss: 2.6237 - regression_loss: 2.0981 - classification_loss: 0.5256 380/500 [=====================>........] - ETA: 29s - loss: 2.6243 - regression_loss: 2.0986 - classification_loss: 0.5257 381/500 [=====================>........] - ETA: 29s - loss: 2.6242 - regression_loss: 2.0985 - classification_loss: 0.5257 382/500 [=====================>........] 
- ETA: 29s - loss: 2.6237 - regression_loss: 2.0983 - classification_loss: 0.5254 383/500 [=====================>........] - ETA: 29s - loss: 2.6233 - regression_loss: 2.0979 - classification_loss: 0.5253 384/500 [======================>.......] - ETA: 28s - loss: 2.6247 - regression_loss: 2.0988 - classification_loss: 0.5259 385/500 [======================>.......] - ETA: 28s - loss: 2.6246 - regression_loss: 2.0981 - classification_loss: 0.5265 386/500 [======================>.......] - ETA: 28s - loss: 2.6243 - regression_loss: 2.0978 - classification_loss: 0.5265 387/500 [======================>.......] - ETA: 28s - loss: 2.6248 - regression_loss: 2.0982 - classification_loss: 0.5266 388/500 [======================>.......] - ETA: 27s - loss: 2.6259 - regression_loss: 2.0990 - classification_loss: 0.5270 389/500 [======================>.......] - ETA: 27s - loss: 2.6272 - regression_loss: 2.1002 - classification_loss: 0.5270 390/500 [======================>.......] - ETA: 27s - loss: 2.6265 - regression_loss: 2.0996 - classification_loss: 0.5269 391/500 [======================>.......] - ETA: 27s - loss: 2.6267 - regression_loss: 2.0997 - classification_loss: 0.5270 392/500 [======================>.......] - ETA: 26s - loss: 2.6272 - regression_loss: 2.1000 - classification_loss: 0.5272 393/500 [======================>.......] - ETA: 26s - loss: 2.6273 - regression_loss: 2.0996 - classification_loss: 0.5277 394/500 [======================>.......] - ETA: 26s - loss: 2.6283 - regression_loss: 2.1010 - classification_loss: 0.5273 395/500 [======================>.......] - ETA: 26s - loss: 2.6286 - regression_loss: 2.1012 - classification_loss: 0.5273 396/500 [======================>.......] - ETA: 25s - loss: 2.6293 - regression_loss: 2.1019 - classification_loss: 0.5274 397/500 [======================>.......] - ETA: 25s - loss: 2.6289 - regression_loss: 2.1016 - classification_loss: 0.5272 398/500 [======================>.......] 
- ETA: 25s - loss: 2.6301 - regression_loss: 2.1027 - classification_loss: 0.5274 399/500 [======================>.......] - ETA: 25s - loss: 2.6306 - regression_loss: 2.1031 - classification_loss: 0.5274 400/500 [=======================>......] - ETA: 24s - loss: 2.6319 - regression_loss: 2.1037 - classification_loss: 0.5283 401/500 [=======================>......] - ETA: 24s - loss: 2.6314 - regression_loss: 2.1035 - classification_loss: 0.5279 402/500 [=======================>......] - ETA: 24s - loss: 2.6310 - regression_loss: 2.1033 - classification_loss: 0.5277 403/500 [=======================>......] - ETA: 24s - loss: 2.6318 - regression_loss: 2.1042 - classification_loss: 0.5276 404/500 [=======================>......] - ETA: 23s - loss: 2.6297 - regression_loss: 2.1028 - classification_loss: 0.5269 405/500 [=======================>......] - ETA: 23s - loss: 2.6297 - regression_loss: 2.1029 - classification_loss: 0.5268 406/500 [=======================>......] - ETA: 23s - loss: 2.6306 - regression_loss: 2.1035 - classification_loss: 0.5270 407/500 [=======================>......] - ETA: 23s - loss: 2.6315 - regression_loss: 2.1046 - classification_loss: 0.5269 408/500 [=======================>......] - ETA: 22s - loss: 2.6311 - regression_loss: 2.1042 - classification_loss: 0.5269 409/500 [=======================>......] - ETA: 22s - loss: 2.6303 - regression_loss: 2.1038 - classification_loss: 0.5265 410/500 [=======================>......] - ETA: 22s - loss: 2.6298 - regression_loss: 2.1034 - classification_loss: 0.5264 411/500 [=======================>......] - ETA: 22s - loss: 2.6299 - regression_loss: 2.1034 - classification_loss: 0.5265 412/500 [=======================>......] - ETA: 21s - loss: 2.6290 - regression_loss: 2.1027 - classification_loss: 0.5263 413/500 [=======================>......] - ETA: 21s - loss: 2.6291 - regression_loss: 2.1028 - classification_loss: 0.5263 414/500 [=======================>......] 
- ETA: 21s - loss: 2.6283 - regression_loss: 2.1022 - classification_loss: 0.5261 415/500 [=======================>......] - ETA: 21s - loss: 2.6274 - regression_loss: 2.1016 - classification_loss: 0.5258 416/500 [=======================>......] - ETA: 20s - loss: 2.6286 - regression_loss: 2.1020 - classification_loss: 0.5266 417/500 [========================>.....] - ETA: 20s - loss: 2.6291 - regression_loss: 2.1027 - classification_loss: 0.5265 418/500 [========================>.....] - ETA: 20s - loss: 2.6285 - regression_loss: 2.1023 - classification_loss: 0.5263 419/500 [========================>.....] - ETA: 20s - loss: 2.6304 - regression_loss: 2.1037 - classification_loss: 0.5268 420/500 [========================>.....] - ETA: 19s - loss: 2.6301 - regression_loss: 2.1032 - classification_loss: 0.5269 421/500 [========================>.....] - ETA: 19s - loss: 2.6294 - regression_loss: 2.1027 - classification_loss: 0.5267 422/500 [========================>.....] - ETA: 19s - loss: 2.6288 - regression_loss: 2.1025 - classification_loss: 0.5263 423/500 [========================>.....] - ETA: 19s - loss: 2.6279 - regression_loss: 2.1017 - classification_loss: 0.5262 424/500 [========================>.....] - ETA: 18s - loss: 2.6281 - regression_loss: 2.1018 - classification_loss: 0.5262 425/500 [========================>.....] - ETA: 18s - loss: 2.6281 - regression_loss: 2.1021 - classification_loss: 0.5260 426/500 [========================>.....] - ETA: 18s - loss: 2.6278 - regression_loss: 2.1020 - classification_loss: 0.5258 427/500 [========================>.....] - ETA: 18s - loss: 2.6268 - regression_loss: 2.1013 - classification_loss: 0.5255 428/500 [========================>.....] - ETA: 17s - loss: 2.6267 - regression_loss: 2.1014 - classification_loss: 0.5253 429/500 [========================>.....] - ETA: 17s - loss: 2.6265 - regression_loss: 2.1011 - classification_loss: 0.5254 430/500 [========================>.....] 
- ETA: 17s - loss: 2.6268 - regression_loss: 2.1018 - classification_loss: 0.5250 431/500 [========================>.....] - ETA: 17s - loss: 2.6275 - regression_loss: 2.1021 - classification_loss: 0.5254 432/500 [========================>.....] - ETA: 16s - loss: 2.6274 - regression_loss: 2.1020 - classification_loss: 0.5254 433/500 [========================>.....] - ETA: 16s - loss: 2.6274 - regression_loss: 2.1019 - classification_loss: 0.5255 434/500 [=========================>....] - ETA: 16s - loss: 2.6280 - regression_loss: 2.1026 - classification_loss: 0.5254 435/500 [=========================>....] - ETA: 16s - loss: 2.6289 - regression_loss: 2.1031 - classification_loss: 0.5257 436/500 [=========================>....] - ETA: 15s - loss: 2.6266 - regression_loss: 2.1012 - classification_loss: 0.5253 437/500 [=========================>....] - ETA: 15s - loss: 2.6268 - regression_loss: 2.1014 - classification_loss: 0.5254 438/500 [=========================>....] - ETA: 15s - loss: 2.6265 - regression_loss: 2.1013 - classification_loss: 0.5252 439/500 [=========================>....] - ETA: 15s - loss: 2.6249 - regression_loss: 2.1002 - classification_loss: 0.5246 440/500 [=========================>....] - ETA: 14s - loss: 2.6234 - regression_loss: 2.0991 - classification_loss: 0.5244 441/500 [=========================>....] - ETA: 14s - loss: 2.6214 - regression_loss: 2.0974 - classification_loss: 0.5240 442/500 [=========================>....] - ETA: 14s - loss: 2.6208 - regression_loss: 2.0969 - classification_loss: 0.5239 443/500 [=========================>....] - ETA: 14s - loss: 2.6185 - regression_loss: 2.0951 - classification_loss: 0.5234 444/500 [=========================>....] - ETA: 13s - loss: 2.6176 - regression_loss: 2.0944 - classification_loss: 0.5232 445/500 [=========================>....] - ETA: 13s - loss: 2.6165 - regression_loss: 2.0935 - classification_loss: 0.5230 446/500 [=========================>....] 
- ETA: 13s - loss: 2.6187 - regression_loss: 2.0954 - classification_loss: 0.5232 447/500 [=========================>....] - ETA: 13s - loss: 2.6175 - regression_loss: 2.0946 - classification_loss: 0.5229 448/500 [=========================>....] - ETA: 12s - loss: 2.6166 - regression_loss: 2.0939 - classification_loss: 0.5227 449/500 [=========================>....] - ETA: 12s - loss: 2.6175 - regression_loss: 2.0947 - classification_loss: 0.5228 450/500 [==========================>...] - ETA: 12s - loss: 2.6172 - regression_loss: 2.0946 - classification_loss: 0.5227 451/500 [==========================>...] - ETA: 12s - loss: 2.6171 - regression_loss: 2.0945 - classification_loss: 0.5226 452/500 [==========================>...] - ETA: 11s - loss: 2.6185 - regression_loss: 2.0958 - classification_loss: 0.5227 453/500 [==========================>...] - ETA: 11s - loss: 2.6175 - regression_loss: 2.0952 - classification_loss: 0.5223 454/500 [==========================>...] - ETA: 11s - loss: 2.6163 - regression_loss: 2.0943 - classification_loss: 0.5220 455/500 [==========================>...] - ETA: 11s - loss: 2.6162 - regression_loss: 2.0943 - classification_loss: 0.5219 456/500 [==========================>...] - ETA: 10s - loss: 2.6173 - regression_loss: 2.0953 - classification_loss: 0.5221 457/500 [==========================>...] - ETA: 10s - loss: 2.6162 - regression_loss: 2.0944 - classification_loss: 0.5218 458/500 [==========================>...] - ETA: 10s - loss: 2.6163 - regression_loss: 2.0947 - classification_loss: 0.5216 459/500 [==========================>...] - ETA: 10s - loss: 2.6165 - regression_loss: 2.0948 - classification_loss: 0.5217 460/500 [==========================>...] - ETA: 9s - loss: 2.6153 - regression_loss: 2.0939 - classification_loss: 0.5214  461/500 [==========================>...] - ETA: 9s - loss: 2.6156 - regression_loss: 2.0943 - classification_loss: 0.5214 462/500 [==========================>...] 
- ETA: 9s - loss: 2.6152 - regression_loss: 2.0939 - classification_loss: 0.5212 463/500 [==========================>...] - ETA: 9s - loss: 2.6151 - regression_loss: 2.0940 - classification_loss: 0.5212 464/500 [==========================>...] - ETA: 8s - loss: 2.6148 - regression_loss: 2.0939 - classification_loss: 0.5209 465/500 [==========================>...] - ETA: 8s - loss: 2.6149 - regression_loss: 2.0941 - classification_loss: 0.5208 466/500 [==========================>...] - ETA: 8s - loss: 2.6148 - regression_loss: 2.0941 - classification_loss: 0.5206 467/500 [===========================>..] - ETA: 8s - loss: 2.6143 - regression_loss: 2.0940 - classification_loss: 0.5202 468/500 [===========================>..] - ETA: 7s - loss: 2.6127 - regression_loss: 2.0927 - classification_loss: 0.5200 469/500 [===========================>..] - ETA: 7s - loss: 2.6123 - regression_loss: 2.0924 - classification_loss: 0.5199 470/500 [===========================>..] - ETA: 7s - loss: 2.6117 - regression_loss: 2.0920 - classification_loss: 0.5197 471/500 [===========================>..] - ETA: 7s - loss: 2.6117 - regression_loss: 2.0919 - classification_loss: 0.5198 472/500 [===========================>..] - ETA: 6s - loss: 2.6124 - regression_loss: 2.0924 - classification_loss: 0.5200 473/500 [===========================>..] - ETA: 6s - loss: 2.6128 - regression_loss: 2.0927 - classification_loss: 0.5201 474/500 [===========================>..] - ETA: 6s - loss: 2.6136 - regression_loss: 2.0933 - classification_loss: 0.5203 475/500 [===========================>..] - ETA: 6s - loss: 2.6137 - regression_loss: 2.0933 - classification_loss: 0.5203 476/500 [===========================>..] - ETA: 5s - loss: 2.6134 - regression_loss: 2.0933 - classification_loss: 0.5201 477/500 [===========================>..] - ETA: 5s - loss: 2.6135 - regression_loss: 2.0920 - classification_loss: 0.5215 478/500 [===========================>..] 
- ETA: 5s - loss: 2.6135 - regression_loss: 2.0920 - classification_loss: 0.5215 479/500 [===========================>..] - ETA: 5s - loss: 2.6138 - regression_loss: 2.0923 - classification_loss: 0.5215 480/500 [===========================>..] - ETA: 4s - loss: 2.6163 - regression_loss: 2.0940 - classification_loss: 0.5223 481/500 [===========================>..] - ETA: 4s - loss: 2.6156 - regression_loss: 2.0935 - classification_loss: 0.5221 482/500 [===========================>..] - ETA: 4s - loss: 2.6157 - regression_loss: 2.0938 - classification_loss: 0.5219 483/500 [===========================>..] - ETA: 4s - loss: 2.6153 - regression_loss: 2.0936 - classification_loss: 0.5217 484/500 [============================>.] - ETA: 3s - loss: 2.6166 - regression_loss: 2.0941 - classification_loss: 0.5226 485/500 [============================>.] - ETA: 3s - loss: 2.6171 - regression_loss: 2.0942 - classification_loss: 0.5229 486/500 [============================>.] - ETA: 3s - loss: 2.6155 - regression_loss: 2.0934 - classification_loss: 0.5222 487/500 [============================>.] - ETA: 3s - loss: 2.6144 - regression_loss: 2.0925 - classification_loss: 0.5219 488/500 [============================>.] - ETA: 2s - loss: 2.6148 - regression_loss: 2.0926 - classification_loss: 0.5222 489/500 [============================>.] - ETA: 2s - loss: 2.6145 - regression_loss: 2.0923 - classification_loss: 0.5222 490/500 [============================>.] - ETA: 2s - loss: 2.6157 - regression_loss: 2.0932 - classification_loss: 0.5225 491/500 [============================>.] - ETA: 2s - loss: 2.6161 - regression_loss: 2.0937 - classification_loss: 0.5223 492/500 [============================>.] - ETA: 1s - loss: 2.6162 - regression_loss: 2.0938 - classification_loss: 0.5224 493/500 [============================>.] - ETA: 1s - loss: 2.6159 - regression_loss: 2.0936 - classification_loss: 0.5223 494/500 [============================>.] 
- ETA: 1s - loss: 2.6157 - regression_loss: 2.0933 - classification_loss: 0.5224 495/500 [============================>.] - ETA: 1s - loss: 2.6160 - regression_loss: 2.0936 - classification_loss: 0.5225 496/500 [============================>.] - ETA: 0s - loss: 2.6161 - regression_loss: 2.0936 - classification_loss: 0.5225 497/500 [============================>.] - ETA: 0s - loss: 2.6169 - regression_loss: 2.0943 - classification_loss: 0.5226 498/500 [============================>.] - ETA: 0s - loss: 2.6180 - regression_loss: 2.0955 - classification_loss: 0.5225 499/500 [============================>.] - ETA: 0s - loss: 2.6174 - regression_loss: 2.0951 - classification_loss: 0.5223 500/500 [==============================] - 125s 250ms/step - loss: 2.6177 - regression_loss: 2.0954 - classification_loss: 0.5223 1172 instances of class plum with average precision: 0.2755 mAP: 0.2755 Epoch 00010: saving model to ./training/snapshots/resnet50_pascal_10.h5 Epoch 11/150 1/500 [..............................] - ETA: 1:59 - loss: 1.9119 - regression_loss: 1.2924 - classification_loss: 0.6194 2/500 [..............................] - ETA: 2:03 - loss: 2.3960 - regression_loss: 1.6240 - classification_loss: 0.7719 3/500 [..............................] - ETA: 2:03 - loss: 2.4288 - regression_loss: 1.8036 - classification_loss: 0.6252 4/500 [..............................] - ETA: 2:04 - loss: 2.4938 - regression_loss: 1.8797 - classification_loss: 0.6140 5/500 [..............................] - ETA: 2:03 - loss: 2.4967 - regression_loss: 1.9146 - classification_loss: 0.5821 6/500 [..............................] - ETA: 2:03 - loss: 2.5164 - regression_loss: 1.9553 - classification_loss: 0.5611 7/500 [..............................] - ETA: 2:03 - loss: 2.4497 - regression_loss: 1.9196 - classification_loss: 0.5301 8/500 [..............................] - ETA: 2:03 - loss: 2.5313 - regression_loss: 1.9580 - classification_loss: 0.5733 9/500 [..............................] 
- ETA: 2:03 - loss: 2.5394 - regression_loss: 1.9748 - classification_loss: 0.5646 10/500 [..............................] - ETA: 2:03 - loss: 2.5340 - regression_loss: 1.9848 - classification_loss: 0.5491 11/500 [..............................] - ETA: 2:02 - loss: 2.5486 - regression_loss: 2.0051 - classification_loss: 0.5435 12/500 [..............................] - ETA: 2:02 - loss: 2.5628 - regression_loss: 2.0229 - classification_loss: 0.5399 13/500 [..............................] - ETA: 2:02 - loss: 2.5485 - regression_loss: 2.0110 - classification_loss: 0.5376 14/500 [..............................] - ETA: 2:01 - loss: 2.5476 - regression_loss: 2.0165 - classification_loss: 0.5310 15/500 [..............................] - ETA: 2:01 - loss: 2.5626 - regression_loss: 2.0312 - classification_loss: 0.5314 16/500 [..............................] - ETA: 2:01 - loss: 2.5686 - regression_loss: 2.0418 - classification_loss: 0.5268 17/500 [>.............................] - ETA: 2:00 - loss: 2.5641 - regression_loss: 2.0426 - classification_loss: 0.5215 18/500 [>.............................] - ETA: 2:00 - loss: 2.5616 - regression_loss: 2.0378 - classification_loss: 0.5238 19/500 [>.............................] - ETA: 2:00 - loss: 2.5546 - regression_loss: 2.0324 - classification_loss: 0.5222 20/500 [>.............................] - ETA: 2:00 - loss: 2.5564 - regression_loss: 2.0344 - classification_loss: 0.5220 21/500 [>.............................] - ETA: 1:59 - loss: 2.5655 - regression_loss: 2.0422 - classification_loss: 0.5232 22/500 [>.............................] - ETA: 1:59 - loss: 2.5726 - regression_loss: 2.0491 - classification_loss: 0.5236 23/500 [>.............................] - ETA: 1:59 - loss: 2.5569 - regression_loss: 2.0392 - classification_loss: 0.5177 24/500 [>.............................] - ETA: 1:58 - loss: 2.5807 - regression_loss: 2.0582 - classification_loss: 0.5225 25/500 [>.............................] 
- ETA: 1:58 - loss: 2.5666 - regression_loss: 2.0504 - classification_loss: 0.5161 26/500 [>.............................] - ETA: 1:58 - loss: 2.5872 - regression_loss: 2.0482 - classification_loss: 0.5390 27/500 [>.............................] - ETA: 1:58 - loss: 2.5921 - regression_loss: 2.0537 - classification_loss: 0.5384 28/500 [>.............................] - ETA: 1:57 - loss: 2.5943 - regression_loss: 2.0559 - classification_loss: 0.5384 29/500 [>.............................] - ETA: 1:57 - loss: 2.6272 - regression_loss: 2.0602 - classification_loss: 0.5669 30/500 [>.............................] - ETA: 1:57 - loss: 2.6179 - regression_loss: 2.0568 - classification_loss: 0.5612 31/500 [>.............................] - ETA: 1:57 - loss: 2.6144 - regression_loss: 2.0571 - classification_loss: 0.5572 32/500 [>.............................] - ETA: 1:57 - loss: 2.6118 - regression_loss: 2.0549 - classification_loss: 0.5569 33/500 [>.............................] - ETA: 1:56 - loss: 2.6075 - regression_loss: 2.0539 - classification_loss: 0.5535 34/500 [=>............................] - ETA: 1:56 - loss: 2.5925 - regression_loss: 2.0462 - classification_loss: 0.5463 35/500 [=>............................] - ETA: 1:56 - loss: 2.5874 - regression_loss: 2.0449 - classification_loss: 0.5425 36/500 [=>............................] - ETA: 1:56 - loss: 2.5712 - regression_loss: 2.0321 - classification_loss: 0.5392 37/500 [=>............................] - ETA: 1:56 - loss: 2.5723 - regression_loss: 2.0345 - classification_loss: 0.5378 38/500 [=>............................] - ETA: 1:56 - loss: 2.5788 - regression_loss: 2.0408 - classification_loss: 0.5380 39/500 [=>............................] - ETA: 1:55 - loss: 2.5799 - regression_loss: 2.0441 - classification_loss: 0.5358 40/500 [=>............................] - ETA: 1:55 - loss: 2.5814 - regression_loss: 2.0455 - classification_loss: 0.5359 41/500 [=>............................] 
- ETA: 1:55 - loss: 2.5781 - regression_loss: 2.0446 - classification_loss: 0.5335 42/500 [=>............................] - ETA: 1:55 - loss: 2.5842 - regression_loss: 2.0506 - classification_loss: 0.5336 43/500 [=>............................] - ETA: 1:54 - loss: 2.5833 - regression_loss: 2.0500 - classification_loss: 0.5333 44/500 [=>............................] - ETA: 1:54 - loss: 2.5987 - regression_loss: 2.0664 - classification_loss: 0.5323 45/500 [=>............................] - ETA: 1:54 - loss: 2.6019 - regression_loss: 2.0665 - classification_loss: 0.5354 46/500 [=>............................] - ETA: 1:54 - loss: 2.6014 - regression_loss: 2.0676 - classification_loss: 0.5338 47/500 [=>............................] - ETA: 1:53 - loss: 2.5946 - regression_loss: 2.0630 - classification_loss: 0.5316 48/500 [=>............................] - ETA: 1:53 - loss: 2.5938 - regression_loss: 2.0645 - classification_loss: 0.5293 49/500 [=>............................] - ETA: 1:53 - loss: 2.5685 - regression_loss: 2.0448 - classification_loss: 0.5238 50/500 [==>...........................] - ETA: 1:53 - loss: 2.5646 - regression_loss: 2.0451 - classification_loss: 0.5194 51/500 [==>...........................] - ETA: 1:52 - loss: 2.5635 - regression_loss: 2.0451 - classification_loss: 0.5184 52/500 [==>...........................] - ETA: 1:52 - loss: 2.5601 - regression_loss: 2.0442 - classification_loss: 0.5159 53/500 [==>...........................] - ETA: 1:52 - loss: 2.5659 - regression_loss: 2.0496 - classification_loss: 0.5163 54/500 [==>...........................] - ETA: 1:52 - loss: 2.5821 - regression_loss: 2.0614 - classification_loss: 0.5207 55/500 [==>...........................] - ETA: 1:51 - loss: 2.5884 - regression_loss: 2.0664 - classification_loss: 0.5221 56/500 [==>...........................] - ETA: 1:51 - loss: 2.5913 - regression_loss: 2.0684 - classification_loss: 0.5229 57/500 [==>...........................] 
[Keras per-step progress-bar updates condensed; steps 58–393 of 500 shown at representative checkpoints. Each printed loss is the running mean over the epoch so far.]
 58/500 [==>...........................] - ETA: 1:51 - loss: 2.6001 - regression_loss: 2.0740 - classification_loss: 0.5261
100/500 [=====>........................] - ETA: 1:40 - loss: 2.5921 - regression_loss: 2.0740 - classification_loss: 0.5181
200/500 [===========>..................] - ETA: 1:15 - loss: 2.6101 - regression_loss: 2.0888 - classification_loss: 0.5214
300/500 [=================>............] - ETA: 50s - loss: 2.5809 - regression_loss: 2.0674 - classification_loss: 0.5134
392/500 [======================>.......] - ETA: 27s - loss: 2.5879 - regression_loss: 2.0751 - classification_loss: 0.5128
- ETA: 26s - loss: 2.5882 - regression_loss: 2.0752 - classification_loss: 0.5130 394/500 [======================>.......] - ETA: 26s - loss: 2.5883 - regression_loss: 2.0754 - classification_loss: 0.5128 395/500 [======================>.......] - ETA: 26s - loss: 2.5886 - regression_loss: 2.0758 - classification_loss: 0.5128 396/500 [======================>.......] - ETA: 26s - loss: 2.5891 - regression_loss: 2.0761 - classification_loss: 0.5130 397/500 [======================>.......] - ETA: 25s - loss: 2.5887 - regression_loss: 2.0759 - classification_loss: 0.5128 398/500 [======================>.......] - ETA: 25s - loss: 2.5888 - regression_loss: 2.0750 - classification_loss: 0.5138 399/500 [======================>.......] - ETA: 25s - loss: 2.5887 - regression_loss: 2.0752 - classification_loss: 0.5135 400/500 [=======================>......] - ETA: 25s - loss: 2.5893 - regression_loss: 2.0756 - classification_loss: 0.5137 401/500 [=======================>......] - ETA: 24s - loss: 2.5903 - regression_loss: 2.0762 - classification_loss: 0.5141 402/500 [=======================>......] - ETA: 24s - loss: 2.5904 - regression_loss: 2.0757 - classification_loss: 0.5146 403/500 [=======================>......] - ETA: 24s - loss: 2.5916 - regression_loss: 2.0762 - classification_loss: 0.5153 404/500 [=======================>......] - ETA: 24s - loss: 2.5913 - regression_loss: 2.0761 - classification_loss: 0.5152 405/500 [=======================>......] - ETA: 23s - loss: 2.5901 - regression_loss: 2.0752 - classification_loss: 0.5148 406/500 [=======================>......] - ETA: 23s - loss: 2.5910 - regression_loss: 2.0761 - classification_loss: 0.5149 407/500 [=======================>......] - ETA: 23s - loss: 2.5904 - regression_loss: 2.0758 - classification_loss: 0.5146 408/500 [=======================>......] - ETA: 23s - loss: 2.5905 - regression_loss: 2.0759 - classification_loss: 0.5145 409/500 [=======================>......] 
- ETA: 22s - loss: 2.5898 - regression_loss: 2.0752 - classification_loss: 0.5146 410/500 [=======================>......] - ETA: 22s - loss: 2.5890 - regression_loss: 2.0744 - classification_loss: 0.5146 411/500 [=======================>......] - ETA: 22s - loss: 2.5880 - regression_loss: 2.0737 - classification_loss: 0.5144 412/500 [=======================>......] - ETA: 22s - loss: 2.5895 - regression_loss: 2.0748 - classification_loss: 0.5147 413/500 [=======================>......] - ETA: 21s - loss: 2.5880 - regression_loss: 2.0737 - classification_loss: 0.5143 414/500 [=======================>......] - ETA: 21s - loss: 2.5876 - regression_loss: 2.0733 - classification_loss: 0.5143 415/500 [=======================>......] - ETA: 21s - loss: 2.5880 - regression_loss: 2.0739 - classification_loss: 0.5141 416/500 [=======================>......] - ETA: 21s - loss: 2.5872 - regression_loss: 2.0733 - classification_loss: 0.5139 417/500 [========================>.....] - ETA: 20s - loss: 2.5871 - regression_loss: 2.0733 - classification_loss: 0.5137 418/500 [========================>.....] - ETA: 20s - loss: 2.5876 - regression_loss: 2.0738 - classification_loss: 0.5138 419/500 [========================>.....] - ETA: 20s - loss: 2.5877 - regression_loss: 2.0739 - classification_loss: 0.5139 420/500 [========================>.....] - ETA: 20s - loss: 2.5876 - regression_loss: 2.0739 - classification_loss: 0.5137 421/500 [========================>.....] - ETA: 19s - loss: 2.5877 - regression_loss: 2.0739 - classification_loss: 0.5138 422/500 [========================>.....] - ETA: 19s - loss: 2.5874 - regression_loss: 2.0738 - classification_loss: 0.5135 423/500 [========================>.....] - ETA: 19s - loss: 2.5870 - regression_loss: 2.0735 - classification_loss: 0.5134 424/500 [========================>.....] - ETA: 19s - loss: 2.5858 - regression_loss: 2.0727 - classification_loss: 0.5132 425/500 [========================>.....] 
- ETA: 18s - loss: 2.5866 - regression_loss: 2.0731 - classification_loss: 0.5135 426/500 [========================>.....] - ETA: 18s - loss: 2.5869 - regression_loss: 2.0736 - classification_loss: 0.5133 427/500 [========================>.....] - ETA: 18s - loss: 2.5867 - regression_loss: 2.0735 - classification_loss: 0.5132 428/500 [========================>.....] - ETA: 18s - loss: 2.5859 - regression_loss: 2.0730 - classification_loss: 0.5129 429/500 [========================>.....] - ETA: 17s - loss: 2.5856 - regression_loss: 2.0725 - classification_loss: 0.5130 430/500 [========================>.....] - ETA: 17s - loss: 2.5829 - regression_loss: 2.0707 - classification_loss: 0.5122 431/500 [========================>.....] - ETA: 17s - loss: 2.5829 - regression_loss: 2.0706 - classification_loss: 0.5123 432/500 [========================>.....] - ETA: 17s - loss: 2.5813 - regression_loss: 2.0693 - classification_loss: 0.5120 433/500 [========================>.....] - ETA: 16s - loss: 2.5835 - regression_loss: 2.0714 - classification_loss: 0.5121 434/500 [=========================>....] - ETA: 16s - loss: 2.5835 - regression_loss: 2.0717 - classification_loss: 0.5118 435/500 [=========================>....] - ETA: 16s - loss: 2.5830 - regression_loss: 2.0713 - classification_loss: 0.5117 436/500 [=========================>....] - ETA: 16s - loss: 2.5813 - regression_loss: 2.0700 - classification_loss: 0.5113 437/500 [=========================>....] - ETA: 15s - loss: 2.5808 - regression_loss: 2.0694 - classification_loss: 0.5114 438/500 [=========================>....] - ETA: 15s - loss: 2.5833 - regression_loss: 2.0712 - classification_loss: 0.5121 439/500 [=========================>....] - ETA: 15s - loss: 2.5839 - regression_loss: 2.0719 - classification_loss: 0.5120 440/500 [=========================>....] - ETA: 15s - loss: 2.5827 - regression_loss: 2.0710 - classification_loss: 0.5117 441/500 [=========================>....] 
- ETA: 14s - loss: 2.5825 - regression_loss: 2.0709 - classification_loss: 0.5116 442/500 [=========================>....] - ETA: 14s - loss: 2.5820 - regression_loss: 2.0704 - classification_loss: 0.5116 443/500 [=========================>....] - ETA: 14s - loss: 2.5795 - regression_loss: 2.0685 - classification_loss: 0.5111 444/500 [=========================>....] - ETA: 14s - loss: 2.5784 - regression_loss: 2.0676 - classification_loss: 0.5108 445/500 [=========================>....] - ETA: 13s - loss: 2.5786 - regression_loss: 2.0679 - classification_loss: 0.5107 446/500 [=========================>....] - ETA: 13s - loss: 2.5782 - regression_loss: 2.0676 - classification_loss: 0.5107 447/500 [=========================>....] - ETA: 13s - loss: 2.5785 - regression_loss: 2.0680 - classification_loss: 0.5105 448/500 [=========================>....] - ETA: 13s - loss: 2.5791 - regression_loss: 2.0686 - classification_loss: 0.5105 449/500 [=========================>....] - ETA: 12s - loss: 2.5796 - regression_loss: 2.0692 - classification_loss: 0.5104 450/500 [==========================>...] - ETA: 12s - loss: 2.5803 - regression_loss: 2.0696 - classification_loss: 0.5107 451/500 [==========================>...] - ETA: 12s - loss: 2.5807 - regression_loss: 2.0694 - classification_loss: 0.5113 452/500 [==========================>...] - ETA: 12s - loss: 2.5802 - regression_loss: 2.0687 - classification_loss: 0.5115 453/500 [==========================>...] - ETA: 11s - loss: 2.5797 - regression_loss: 2.0684 - classification_loss: 0.5113 454/500 [==========================>...] - ETA: 11s - loss: 2.5792 - regression_loss: 2.0681 - classification_loss: 0.5111 455/500 [==========================>...] - ETA: 11s - loss: 2.5788 - regression_loss: 2.0679 - classification_loss: 0.5109 456/500 [==========================>...] - ETA: 11s - loss: 2.5773 - regression_loss: 2.0668 - classification_loss: 0.5105 457/500 [==========================>...] 
- ETA: 10s - loss: 2.5770 - regression_loss: 2.0667 - classification_loss: 0.5103 458/500 [==========================>...] - ETA: 10s - loss: 2.5766 - regression_loss: 2.0665 - classification_loss: 0.5101 459/500 [==========================>...] - ETA: 10s - loss: 2.5757 - regression_loss: 2.0657 - classification_loss: 0.5100 460/500 [==========================>...] - ETA: 10s - loss: 2.5756 - regression_loss: 2.0656 - classification_loss: 0.5100 461/500 [==========================>...] - ETA: 9s - loss: 2.5748 - regression_loss: 2.0649 - classification_loss: 0.5099  462/500 [==========================>...] - ETA: 9s - loss: 2.5752 - regression_loss: 2.0652 - classification_loss: 0.5099 463/500 [==========================>...] - ETA: 9s - loss: 2.5758 - regression_loss: 2.0657 - classification_loss: 0.5100 464/500 [==========================>...] - ETA: 9s - loss: 2.5758 - regression_loss: 2.0660 - classification_loss: 0.5099 465/500 [==========================>...] - ETA: 8s - loss: 2.5751 - regression_loss: 2.0653 - classification_loss: 0.5097 466/500 [==========================>...] - ETA: 8s - loss: 2.5763 - regression_loss: 2.0663 - classification_loss: 0.5101 467/500 [===========================>..] - ETA: 8s - loss: 2.5741 - regression_loss: 2.0642 - classification_loss: 0.5098 468/500 [===========================>..] - ETA: 7s - loss: 2.5742 - regression_loss: 2.0644 - classification_loss: 0.5099 469/500 [===========================>..] - ETA: 7s - loss: 2.5736 - regression_loss: 2.0638 - classification_loss: 0.5098 470/500 [===========================>..] - ETA: 7s - loss: 2.5711 - regression_loss: 2.0618 - classification_loss: 0.5093 471/500 [===========================>..] - ETA: 7s - loss: 2.5708 - regression_loss: 2.0613 - classification_loss: 0.5095 472/500 [===========================>..] - ETA: 6s - loss: 2.5715 - regression_loss: 2.0617 - classification_loss: 0.5098 473/500 [===========================>..] 
- ETA: 6s - loss: 2.5702 - regression_loss: 2.0608 - classification_loss: 0.5094 474/500 [===========================>..] - ETA: 6s - loss: 2.5705 - regression_loss: 2.0610 - classification_loss: 0.5095 475/500 [===========================>..] - ETA: 6s - loss: 2.5707 - regression_loss: 2.0612 - classification_loss: 0.5094 476/500 [===========================>..] - ETA: 6s - loss: 2.5699 - regression_loss: 2.0607 - classification_loss: 0.5092 477/500 [===========================>..] - ETA: 5s - loss: 2.5696 - regression_loss: 2.0604 - classification_loss: 0.5091 478/500 [===========================>..] - ETA: 5s - loss: 2.5682 - regression_loss: 2.0593 - classification_loss: 0.5088 479/500 [===========================>..] - ETA: 5s - loss: 2.5686 - regression_loss: 2.0598 - classification_loss: 0.5088 480/500 [===========================>..] - ETA: 5s - loss: 2.5685 - regression_loss: 2.0599 - classification_loss: 0.5087 481/500 [===========================>..] - ETA: 4s - loss: 2.5688 - regression_loss: 2.0603 - classification_loss: 0.5085 482/500 [===========================>..] - ETA: 4s - loss: 2.5685 - regression_loss: 2.0600 - classification_loss: 0.5085 483/500 [===========================>..] - ETA: 4s - loss: 2.5690 - regression_loss: 2.0603 - classification_loss: 0.5087 484/500 [============================>.] - ETA: 4s - loss: 2.5688 - regression_loss: 2.0601 - classification_loss: 0.5087 485/500 [============================>.] - ETA: 3s - loss: 2.5689 - regression_loss: 2.0603 - classification_loss: 0.5086 486/500 [============================>.] - ETA: 3s - loss: 2.5685 - regression_loss: 2.0601 - classification_loss: 0.5084 487/500 [============================>.] - ETA: 3s - loss: 2.5684 - regression_loss: 2.0601 - classification_loss: 0.5083 488/500 [============================>.] - ETA: 3s - loss: 2.5686 - regression_loss: 2.0604 - classification_loss: 0.5082 489/500 [============================>.] 
- ETA: 2s - loss: 2.5693 - regression_loss: 2.0610 - classification_loss: 0.5083 490/500 [============================>.] - ETA: 2s - loss: 2.5691 - regression_loss: 2.0610 - classification_loss: 0.5081 491/500 [============================>.] - ETA: 2s - loss: 2.5697 - regression_loss: 2.0615 - classification_loss: 0.5082 492/500 [============================>.] - ETA: 2s - loss: 2.5694 - regression_loss: 2.0613 - classification_loss: 0.5081 493/500 [============================>.] - ETA: 1s - loss: 2.5703 - regression_loss: 2.0621 - classification_loss: 0.5082 494/500 [============================>.] - ETA: 1s - loss: 2.5702 - regression_loss: 2.0622 - classification_loss: 0.5080 495/500 [============================>.] - ETA: 1s - loss: 2.5701 - regression_loss: 2.0622 - classification_loss: 0.5079 496/500 [============================>.] - ETA: 1s - loss: 2.5697 - regression_loss: 2.0619 - classification_loss: 0.5077 497/500 [============================>.] - ETA: 0s - loss: 2.5695 - regression_loss: 2.0617 - classification_loss: 0.5078 498/500 [============================>.] - ETA: 0s - loss: 2.5702 - regression_loss: 2.0623 - classification_loss: 0.5079 499/500 [============================>.] - ETA: 0s - loss: 2.5686 - regression_loss: 2.0609 - classification_loss: 0.5076 500/500 [==============================] - 125s 250ms/step - loss: 2.5696 - regression_loss: 2.0615 - classification_loss: 0.5081 1172 instances of class plum with average precision: 0.2770 mAP: 0.2770 Epoch 00011: saving model to ./training/snapshots/resnet50_pascal_11.h5 Epoch 12/150 1/500 [..............................] - ETA: 2:08 - loss: 2.7721 - regression_loss: 2.3236 - classification_loss: 0.4485 2/500 [..............................] - ETA: 2:07 - loss: 2.5797 - regression_loss: 2.1394 - classification_loss: 0.4403 3/500 [..............................] - ETA: 2:05 - loss: 2.5168 - regression_loss: 2.1052 - classification_loss: 0.4116 4/500 [..............................] 
- ETA: 2:05 - loss: 2.4654 - regression_loss: 2.0432 - classification_loss: 0.4222 5/500 [..............................] - ETA: 2:06 - loss: 2.4782 - regression_loss: 2.0372 - classification_loss: 0.4411 6/500 [..............................] - ETA: 2:05 - loss: 2.4689 - regression_loss: 2.0344 - classification_loss: 0.4345 7/500 [..............................] - ETA: 2:04 - loss: 2.4678 - regression_loss: 2.0334 - classification_loss: 0.4344 8/500 [..............................] - ETA: 2:03 - loss: 2.4561 - regression_loss: 2.0193 - classification_loss: 0.4368 9/500 [..............................] - ETA: 2:03 - loss: 2.4665 - regression_loss: 2.0330 - classification_loss: 0.4335 10/500 [..............................] - ETA: 2:03 - loss: 2.4523 - regression_loss: 2.0133 - classification_loss: 0.4390 11/500 [..............................] - ETA: 2:03 - loss: 2.4640 - regression_loss: 2.0082 - classification_loss: 0.4559 12/500 [..............................] - ETA: 2:03 - loss: 2.4709 - regression_loss: 2.0131 - classification_loss: 0.4578 13/500 [..............................] - ETA: 2:03 - loss: 2.4774 - regression_loss: 2.0180 - classification_loss: 0.4594 14/500 [..............................] - ETA: 2:03 - loss: 2.4442 - regression_loss: 1.9937 - classification_loss: 0.4504 15/500 [..............................] - ETA: 2:02 - loss: 2.4416 - regression_loss: 1.9973 - classification_loss: 0.4443 16/500 [..............................] - ETA: 2:02 - loss: 2.4409 - regression_loss: 1.9995 - classification_loss: 0.4413 17/500 [>.............................] - ETA: 2:02 - loss: 2.4500 - regression_loss: 2.0056 - classification_loss: 0.4444 18/500 [>.............................] - ETA: 2:02 - loss: 2.4362 - regression_loss: 1.9938 - classification_loss: 0.4424 19/500 [>.............................] - ETA: 2:01 - loss: 2.4685 - regression_loss: 2.0243 - classification_loss: 0.4442 20/500 [>.............................] 
- ETA: 2:01 - loss: 2.4602 - regression_loss: 2.0148 - classification_loss: 0.4455 21/500 [>.............................] - ETA: 2:01 - loss: 2.4437 - regression_loss: 2.0028 - classification_loss: 0.4409 22/500 [>.............................] - ETA: 2:00 - loss: 2.4754 - regression_loss: 2.0262 - classification_loss: 0.4492 23/500 [>.............................] - ETA: 2:00 - loss: 2.5061 - regression_loss: 2.0349 - classification_loss: 0.4711 24/500 [>.............................] - ETA: 2:00 - loss: 2.5340 - regression_loss: 2.0524 - classification_loss: 0.4816 25/500 [>.............................] - ETA: 1:59 - loss: 2.5390 - regression_loss: 2.0545 - classification_loss: 0.4845 26/500 [>.............................] - ETA: 1:59 - loss: 2.5018 - regression_loss: 2.0231 - classification_loss: 0.4787 27/500 [>.............................] - ETA: 1:59 - loss: 2.5147 - regression_loss: 2.0278 - classification_loss: 0.4869 28/500 [>.............................] - ETA: 1:58 - loss: 2.4987 - regression_loss: 2.0155 - classification_loss: 0.4832 29/500 [>.............................] - ETA: 1:58 - loss: 2.5391 - regression_loss: 2.0509 - classification_loss: 0.4883 30/500 [>.............................] - ETA: 1:58 - loss: 2.5553 - regression_loss: 2.0677 - classification_loss: 0.4876 31/500 [>.............................] - ETA: 1:58 - loss: 2.5563 - regression_loss: 2.0687 - classification_loss: 0.4876 32/500 [>.............................] - ETA: 1:57 - loss: 2.5576 - regression_loss: 2.0731 - classification_loss: 0.4845 33/500 [>.............................] - ETA: 1:57 - loss: 2.5601 - regression_loss: 2.0740 - classification_loss: 0.4861 34/500 [=>............................] - ETA: 1:57 - loss: 2.5743 - regression_loss: 2.0856 - classification_loss: 0.4888 35/500 [=>............................] - ETA: 1:57 - loss: 2.5962 - regression_loss: 2.0963 - classification_loss: 0.4999 36/500 [=>............................] 
- ETA: 1:56 - loss: 2.6017 - regression_loss: 2.1001 - classification_loss: 0.5016 37/500 [=>............................] - ETA: 1:56 - loss: 2.6140 - regression_loss: 2.1108 - classification_loss: 0.5032 38/500 [=>............................] - ETA: 1:56 - loss: 2.7216 - regression_loss: 2.0956 - classification_loss: 0.6260 39/500 [=>............................] - ETA: 1:56 - loss: 2.7224 - regression_loss: 2.0974 - classification_loss: 0.6250 40/500 [=>............................] - ETA: 1:56 - loss: 2.7173 - regression_loss: 2.0897 - classification_loss: 0.6276 41/500 [=>............................] - ETA: 1:55 - loss: 2.6956 - regression_loss: 2.0758 - classification_loss: 0.6198 42/500 [=>............................] - ETA: 1:55 - loss: 2.6935 - regression_loss: 2.0752 - classification_loss: 0.6182 43/500 [=>............................] - ETA: 1:55 - loss: 2.6898 - regression_loss: 2.0777 - classification_loss: 0.6121 44/500 [=>............................] - ETA: 1:55 - loss: 2.6982 - regression_loss: 2.0865 - classification_loss: 0.6116 45/500 [=>............................] - ETA: 1:54 - loss: 2.7118 - regression_loss: 2.1012 - classification_loss: 0.6106 46/500 [=>............................] - ETA: 1:54 - loss: 2.7045 - regression_loss: 2.0977 - classification_loss: 0.6069 47/500 [=>............................] - ETA: 1:54 - loss: 2.7095 - regression_loss: 2.1013 - classification_loss: 0.6082 48/500 [=>............................] - ETA: 1:54 - loss: 2.7082 - regression_loss: 2.1028 - classification_loss: 0.6054 49/500 [=>............................] - ETA: 1:53 - loss: 2.7213 - regression_loss: 2.1170 - classification_loss: 0.6044 50/500 [==>...........................] - ETA: 1:53 - loss: 2.7080 - regression_loss: 2.1084 - classification_loss: 0.5996 51/500 [==>...........................] - ETA: 1:53 - loss: 2.7144 - regression_loss: 2.1166 - classification_loss: 0.5979 52/500 [==>...........................] 
- ETA: 1:53 - loss: 2.7040 - regression_loss: 2.1104 - classification_loss: 0.5936 53/500 [==>...........................] - ETA: 1:52 - loss: 2.6995 - regression_loss: 2.1084 - classification_loss: 0.5911 54/500 [==>...........................] - ETA: 1:52 - loss: 2.7162 - regression_loss: 2.1249 - classification_loss: 0.5913 55/500 [==>...........................] - ETA: 1:52 - loss: 2.7094 - regression_loss: 2.1209 - classification_loss: 0.5885 56/500 [==>...........................] - ETA: 1:52 - loss: 2.7071 - regression_loss: 2.1204 - classification_loss: 0.5867 57/500 [==>...........................] - ETA: 1:51 - loss: 2.7465 - regression_loss: 2.1224 - classification_loss: 0.6241 58/500 [==>...........................] - ETA: 1:51 - loss: 2.7423 - regression_loss: 2.1190 - classification_loss: 0.6233 59/500 [==>...........................] - ETA: 1:51 - loss: 2.7354 - regression_loss: 2.1157 - classification_loss: 0.6197 60/500 [==>...........................] - ETA: 1:51 - loss: 2.7247 - regression_loss: 2.1089 - classification_loss: 0.6159 61/500 [==>...........................] - ETA: 1:50 - loss: 2.7174 - regression_loss: 2.1045 - classification_loss: 0.6129 62/500 [==>...........................] - ETA: 1:50 - loss: 2.7109 - regression_loss: 2.1010 - classification_loss: 0.6099 63/500 [==>...........................] - ETA: 1:50 - loss: 2.7095 - regression_loss: 2.1013 - classification_loss: 0.6082 64/500 [==>...........................] - ETA: 1:50 - loss: 2.7139 - regression_loss: 2.1062 - classification_loss: 0.6078 65/500 [==>...........................] - ETA: 1:49 - loss: 2.7058 - regression_loss: 2.0999 - classification_loss: 0.6059 66/500 [==>...........................] - ETA: 1:49 - loss: 2.7087 - regression_loss: 2.1039 - classification_loss: 0.6048 67/500 [===>..........................] - ETA: 1:49 - loss: 2.7093 - regression_loss: 2.1047 - classification_loss: 0.6045 68/500 [===>..........................] 
- ETA: 1:48 - loss: 2.7056 - regression_loss: 2.1030 - classification_loss: 0.6026 69/500 [===>..........................] - ETA: 1:48 - loss: 2.6989 - regression_loss: 2.0979 - classification_loss: 0.6010 70/500 [===>..........................] - ETA: 1:48 - loss: 2.6952 - regression_loss: 2.0956 - classification_loss: 0.5996 71/500 [===>..........................] - ETA: 1:47 - loss: 2.6892 - regression_loss: 2.0920 - classification_loss: 0.5973 72/500 [===>..........................] - ETA: 1:47 - loss: 2.6920 - regression_loss: 2.0951 - classification_loss: 0.5969 73/500 [===>..........................] - ETA: 1:47 - loss: 2.6889 - regression_loss: 2.0944 - classification_loss: 0.5945 74/500 [===>..........................] - ETA: 1:47 - loss: 2.6795 - regression_loss: 2.0887 - classification_loss: 0.5908 75/500 [===>..........................] - ETA: 1:46 - loss: 2.6823 - regression_loss: 2.0931 - classification_loss: 0.5892 76/500 [===>..........................] - ETA: 1:46 - loss: 2.6807 - regression_loss: 2.0927 - classification_loss: 0.5880 77/500 [===>..........................] - ETA: 1:46 - loss: 2.7028 - regression_loss: 2.1021 - classification_loss: 0.6006 78/500 [===>..........................] - ETA: 1:46 - loss: 2.6988 - regression_loss: 2.1004 - classification_loss: 0.5985 79/500 [===>..........................] - ETA: 1:45 - loss: 2.6986 - regression_loss: 2.1016 - classification_loss: 0.5970 80/500 [===>..........................] - ETA: 1:45 - loss: 2.7021 - regression_loss: 2.1059 - classification_loss: 0.5962 81/500 [===>..........................] - ETA: 1:45 - loss: 2.7019 - regression_loss: 2.1065 - classification_loss: 0.5955 82/500 [===>..........................] - ETA: 1:45 - loss: 2.7159 - regression_loss: 2.1100 - classification_loss: 0.6059 83/500 [===>..........................] - ETA: 1:44 - loss: 2.7112 - regression_loss: 2.1069 - classification_loss: 0.6043 84/500 [====>.........................] 
- ETA: 1:44 - loss: 2.7145 - regression_loss: 2.1114 - classification_loss: 0.6031 85/500 [====>.........................] - ETA: 1:44 - loss: 2.7060 - regression_loss: 2.1063 - classification_loss: 0.5996 86/500 [====>.........................] - ETA: 1:44 - loss: 2.7050 - regression_loss: 2.1061 - classification_loss: 0.5989 87/500 [====>.........................] - ETA: 1:43 - loss: 2.7020 - regression_loss: 2.1029 - classification_loss: 0.5991 88/500 [====>.........................] - ETA: 1:43 - loss: 2.6978 - regression_loss: 2.1015 - classification_loss: 0.5962 89/500 [====>.........................] - ETA: 1:43 - loss: 2.6984 - regression_loss: 2.1018 - classification_loss: 0.5966 90/500 [====>.........................] - ETA: 1:43 - loss: 2.6957 - regression_loss: 2.1004 - classification_loss: 0.5953 91/500 [====>.........................] - ETA: 1:42 - loss: 2.6933 - regression_loss: 2.0997 - classification_loss: 0.5936 92/500 [====>.........................] - ETA: 1:42 - loss: 2.6933 - regression_loss: 2.1001 - classification_loss: 0.5933 93/500 [====>.........................] - ETA: 1:42 - loss: 2.6983 - regression_loss: 2.1037 - classification_loss: 0.5945 94/500 [====>.........................] - ETA: 1:42 - loss: 2.6949 - regression_loss: 2.1019 - classification_loss: 0.5930 95/500 [====>.........................] - ETA: 1:41 - loss: 2.6922 - regression_loss: 2.0993 - classification_loss: 0.5929 96/500 [====>.........................] - ETA: 1:41 - loss: 2.6910 - regression_loss: 2.1005 - classification_loss: 0.5905 97/500 [====>.........................] - ETA: 1:41 - loss: 2.6829 - regression_loss: 2.0945 - classification_loss: 0.5884 98/500 [====>.........................] - ETA: 1:41 - loss: 2.6800 - regression_loss: 2.0931 - classification_loss: 0.5868 99/500 [====>.........................] - ETA: 1:40 - loss: 2.6781 - regression_loss: 2.0927 - classification_loss: 0.5854 100/500 [=====>........................] 
[... training progress, steps 101/500 → 436/500: total loss 2.6730 → 2.5618, regression_loss 2.0912 → 2.0428, classification_loss 0.5818 → 0.5191; per-step ETA fell from 1:40 to 16s ...]
- ETA: 15s - loss: 2.5620 - regression_loss: 2.0430 - classification_loss: 0.5190 437/500 [=========================>....] - ETA: 15s - loss: 2.5627 - regression_loss: 2.0434 - classification_loss: 0.5194 438/500 [=========================>....] - ETA: 15s - loss: 2.5622 - regression_loss: 2.0428 - classification_loss: 0.5194 439/500 [=========================>....] - ETA: 15s - loss: 2.5617 - regression_loss: 2.0426 - classification_loss: 0.5191 440/500 [=========================>....] - ETA: 14s - loss: 2.5627 - regression_loss: 2.0430 - classification_loss: 0.5197 441/500 [=========================>....] - ETA: 14s - loss: 2.5622 - regression_loss: 2.0428 - classification_loss: 0.5195 442/500 [=========================>....] - ETA: 14s - loss: 2.5618 - regression_loss: 2.0424 - classification_loss: 0.5193 443/500 [=========================>....] - ETA: 14s - loss: 2.5613 - regression_loss: 2.0421 - classification_loss: 0.5192 444/500 [=========================>....] - ETA: 13s - loss: 2.5616 - regression_loss: 2.0426 - classification_loss: 0.5190 445/500 [=========================>....] - ETA: 13s - loss: 2.5599 - regression_loss: 2.0416 - classification_loss: 0.5183 446/500 [=========================>....] - ETA: 13s - loss: 2.5601 - regression_loss: 2.0419 - classification_loss: 0.5183 447/500 [=========================>....] - ETA: 13s - loss: 2.5598 - regression_loss: 2.0417 - classification_loss: 0.5182 448/500 [=========================>....] - ETA: 12s - loss: 2.5602 - regression_loss: 2.0420 - classification_loss: 0.5182 449/500 [=========================>....] - ETA: 12s - loss: 2.5601 - regression_loss: 2.0421 - classification_loss: 0.5180 450/500 [==========================>...] - ETA: 12s - loss: 2.5620 - regression_loss: 2.0436 - classification_loss: 0.5184 451/500 [==========================>...] - ETA: 12s - loss: 2.5617 - regression_loss: 2.0434 - classification_loss: 0.5182 452/500 [==========================>...] 
- ETA: 11s - loss: 2.5609 - regression_loss: 2.0427 - classification_loss: 0.5182 453/500 [==========================>...] - ETA: 11s - loss: 2.5601 - regression_loss: 2.0422 - classification_loss: 0.5179 454/500 [==========================>...] - ETA: 11s - loss: 2.5594 - regression_loss: 2.0418 - classification_loss: 0.5176 455/500 [==========================>...] - ETA: 11s - loss: 2.5586 - regression_loss: 2.0413 - classification_loss: 0.5173 456/500 [==========================>...] - ETA: 10s - loss: 2.5598 - regression_loss: 2.0422 - classification_loss: 0.5176 457/500 [==========================>...] - ETA: 10s - loss: 2.5604 - regression_loss: 2.0426 - classification_loss: 0.5178 458/500 [==========================>...] - ETA: 10s - loss: 2.5611 - regression_loss: 2.0432 - classification_loss: 0.5179 459/500 [==========================>...] - ETA: 10s - loss: 2.5615 - regression_loss: 2.0432 - classification_loss: 0.5183 460/500 [==========================>...] - ETA: 10s - loss: 2.5614 - regression_loss: 2.0433 - classification_loss: 0.5181 461/500 [==========================>...] - ETA: 9s - loss: 2.5610 - regression_loss: 2.0434 - classification_loss: 0.5176  462/500 [==========================>...] - ETA: 9s - loss: 2.5600 - regression_loss: 2.0427 - classification_loss: 0.5173 463/500 [==========================>...] - ETA: 9s - loss: 2.5591 - regression_loss: 2.0422 - classification_loss: 0.5170 464/500 [==========================>...] - ETA: 8s - loss: 2.5586 - regression_loss: 2.0411 - classification_loss: 0.5175 465/500 [==========================>...] - ETA: 8s - loss: 2.5587 - regression_loss: 2.0412 - classification_loss: 0.5175 466/500 [==========================>...] - ETA: 8s - loss: 2.5607 - regression_loss: 2.0428 - classification_loss: 0.5179 467/500 [===========================>..] - ETA: 8s - loss: 2.5603 - regression_loss: 2.0426 - classification_loss: 0.5178 468/500 [===========================>..] 
- ETA: 8s - loss: 2.5610 - regression_loss: 2.0433 - classification_loss: 0.5177 469/500 [===========================>..] - ETA: 7s - loss: 2.5594 - regression_loss: 2.0421 - classification_loss: 0.5173 470/500 [===========================>..] - ETA: 7s - loss: 2.5592 - regression_loss: 2.0420 - classification_loss: 0.5172 471/500 [===========================>..] - ETA: 7s - loss: 2.5590 - regression_loss: 2.0418 - classification_loss: 0.5171 472/500 [===========================>..] - ETA: 7s - loss: 2.5577 - regression_loss: 2.0408 - classification_loss: 0.5169 473/500 [===========================>..] - ETA: 6s - loss: 2.5577 - regression_loss: 2.0406 - classification_loss: 0.5171 474/500 [===========================>..] - ETA: 6s - loss: 2.5571 - regression_loss: 2.0403 - classification_loss: 0.5169 475/500 [===========================>..] - ETA: 6s - loss: 2.5563 - regression_loss: 2.0396 - classification_loss: 0.5167 476/500 [===========================>..] - ETA: 6s - loss: 2.5559 - regression_loss: 2.0393 - classification_loss: 0.5165 477/500 [===========================>..] - ETA: 5s - loss: 2.5547 - regression_loss: 2.0386 - classification_loss: 0.5161 478/500 [===========================>..] - ETA: 5s - loss: 2.5547 - regression_loss: 2.0388 - classification_loss: 0.5160 479/500 [===========================>..] - ETA: 5s - loss: 2.5552 - regression_loss: 2.0395 - classification_loss: 0.5158 480/500 [===========================>..] - ETA: 5s - loss: 2.5557 - regression_loss: 2.0398 - classification_loss: 0.5159 481/500 [===========================>..] - ETA: 4s - loss: 2.5554 - regression_loss: 2.0396 - classification_loss: 0.5157 482/500 [===========================>..] - ETA: 4s - loss: 2.5548 - regression_loss: 2.0393 - classification_loss: 0.5155 483/500 [===========================>..] - ETA: 4s - loss: 2.5561 - regression_loss: 2.0397 - classification_loss: 0.5164 484/500 [============================>.] 
- ETA: 4s - loss: 2.5556 - regression_loss: 2.0395 - classification_loss: 0.5160 485/500 [============================>.] - ETA: 3s - loss: 2.5560 - regression_loss: 2.0397 - classification_loss: 0.5164 486/500 [============================>.] - ETA: 3s - loss: 2.5572 - regression_loss: 2.0407 - classification_loss: 0.5165 487/500 [============================>.] - ETA: 3s - loss: 2.5573 - regression_loss: 2.0410 - classification_loss: 0.5163 488/500 [============================>.] - ETA: 2s - loss: 2.5585 - regression_loss: 2.0421 - classification_loss: 0.5164 489/500 [============================>.] - ETA: 2s - loss: 2.5570 - regression_loss: 2.0409 - classification_loss: 0.5161 490/500 [============================>.] - ETA: 2s - loss: 2.5573 - regression_loss: 2.0412 - classification_loss: 0.5161 491/500 [============================>.] - ETA: 2s - loss: 2.5568 - regression_loss: 2.0409 - classification_loss: 0.5159 492/500 [============================>.] - ETA: 1s - loss: 2.5571 - regression_loss: 2.0411 - classification_loss: 0.5160 493/500 [============================>.] - ETA: 1s - loss: 2.5573 - regression_loss: 2.0412 - classification_loss: 0.5161 494/500 [============================>.] - ETA: 1s - loss: 2.5566 - regression_loss: 2.0407 - classification_loss: 0.5159 495/500 [============================>.] - ETA: 1s - loss: 2.5567 - regression_loss: 2.0408 - classification_loss: 0.5159 496/500 [============================>.] - ETA: 0s - loss: 2.5577 - regression_loss: 2.0415 - classification_loss: 0.5162 497/500 [============================>.] - ETA: 0s - loss: 2.5569 - regression_loss: 2.0409 - classification_loss: 0.5160 498/500 [============================>.] - ETA: 0s - loss: 2.5576 - regression_loss: 2.0413 - classification_loss: 0.5163 499/500 [============================>.] 
- ETA: 0s - loss: 2.5562 - regression_loss: 2.0399 - classification_loss: 0.5163 500/500 [==============================] - 125s 250ms/step - loss: 2.5572 - regression_loss: 2.0408 - classification_loss: 0.5164 1172 instances of class plum with average precision: 0.2681 mAP: 0.2681 Epoch 00012: saving model to ./training/snapshots/resnet50_pascal_12.h5 Epoch 13/150 1/500 [..............................] - ETA: 1:59 - loss: 2.2256 - regression_loss: 1.7815 - classification_loss: 0.4441 2/500 [..............................] - ETA: 2:03 - loss: 2.5372 - regression_loss: 2.0583 - classification_loss: 0.4789 3/500 [..............................] - ETA: 2:02 - loss: 2.4401 - regression_loss: 1.9606 - classification_loss: 0.4795 4/500 [..............................] - ETA: 2:02 - loss: 2.4679 - regression_loss: 2.0040 - classification_loss: 0.4639 5/500 [..............................] - ETA: 2:03 - loss: 2.7158 - regression_loss: 2.1458 - classification_loss: 0.5700 6/500 [..............................] - ETA: 2:03 - loss: 2.7914 - regression_loss: 2.2110 - classification_loss: 0.5804 7/500 [..............................] - ETA: 2:02 - loss: 2.7144 - regression_loss: 2.1566 - classification_loss: 0.5578 8/500 [..............................] - ETA: 2:01 - loss: 2.6908 - regression_loss: 2.1380 - classification_loss: 0.5528 9/500 [..............................] - ETA: 2:01 - loss: 2.7256 - regression_loss: 2.1738 - classification_loss: 0.5517 10/500 [..............................] - ETA: 2:01 - loss: 2.6798 - regression_loss: 2.1415 - classification_loss: 0.5383 11/500 [..............................] - ETA: 2:00 - loss: 2.6877 - regression_loss: 2.1023 - classification_loss: 0.5853 12/500 [..............................] - ETA: 2:00 - loss: 2.7050 - regression_loss: 2.1237 - classification_loss: 0.5813 13/500 [..............................] - ETA: 2:00 - loss: 2.6794 - regression_loss: 2.1041 - classification_loss: 0.5753 14/500 [..............................] 
- ETA: 2:01 - loss: 2.6885 - regression_loss: 2.1150 - classification_loss: 0.5735 15/500 [..............................] - ETA: 2:00 - loss: 2.6126 - regression_loss: 2.0603 - classification_loss: 0.5523 16/500 [..............................] - ETA: 2:00 - loss: 2.6470 - regression_loss: 2.0907 - classification_loss: 0.5563 17/500 [>.............................] - ETA: 1:59 - loss: 2.6223 - regression_loss: 2.0752 - classification_loss: 0.5470 18/500 [>.............................] - ETA: 1:59 - loss: 2.6151 - regression_loss: 2.0694 - classification_loss: 0.5457 19/500 [>.............................] - ETA: 1:59 - loss: 2.6768 - regression_loss: 2.1047 - classification_loss: 0.5722 20/500 [>.............................] - ETA: 1:59 - loss: 2.6638 - regression_loss: 2.0975 - classification_loss: 0.5663 21/500 [>.............................] - ETA: 1:59 - loss: 2.6433 - regression_loss: 2.0833 - classification_loss: 0.5599 22/500 [>.............................] - ETA: 1:59 - loss: 2.6393 - regression_loss: 2.0847 - classification_loss: 0.5545 23/500 [>.............................] - ETA: 1:59 - loss: 2.6171 - regression_loss: 2.0693 - classification_loss: 0.5478 24/500 [>.............................] - ETA: 1:58 - loss: 2.6019 - regression_loss: 2.0596 - classification_loss: 0.5423 25/500 [>.............................] - ETA: 1:58 - loss: 2.6024 - regression_loss: 2.0646 - classification_loss: 0.5379 26/500 [>.............................] - ETA: 1:58 - loss: 2.6072 - regression_loss: 2.0724 - classification_loss: 0.5347 27/500 [>.............................] - ETA: 1:58 - loss: 2.5535 - regression_loss: 2.0298 - classification_loss: 0.5237 28/500 [>.............................] - ETA: 1:57 - loss: 2.5542 - regression_loss: 2.0339 - classification_loss: 0.5203 29/500 [>.............................] - ETA: 1:57 - loss: 2.5551 - regression_loss: 2.0370 - classification_loss: 0.5181 30/500 [>.............................] 
- ETA: 1:57 - loss: 2.5696 - regression_loss: 2.0495 - classification_loss: 0.5201 31/500 [>.............................] - ETA: 1:56 - loss: 2.5599 - regression_loss: 2.0421 - classification_loss: 0.5178 32/500 [>.............................] - ETA: 1:56 - loss: 2.5444 - regression_loss: 2.0309 - classification_loss: 0.5135 33/500 [>.............................] - ETA: 1:56 - loss: 2.5582 - regression_loss: 2.0470 - classification_loss: 0.5112 34/500 [=>............................] - ETA: 1:56 - loss: 2.5588 - regression_loss: 2.0485 - classification_loss: 0.5104 35/500 [=>............................] - ETA: 1:56 - loss: 2.5582 - regression_loss: 2.0446 - classification_loss: 0.5137 36/500 [=>............................] - ETA: 1:55 - loss: 2.5607 - regression_loss: 2.0497 - classification_loss: 0.5110 37/500 [=>............................] - ETA: 1:55 - loss: 2.5672 - regression_loss: 2.0582 - classification_loss: 0.5091 38/500 [=>............................] - ETA: 1:55 - loss: 2.5724 - regression_loss: 2.0632 - classification_loss: 0.5092 39/500 [=>............................] - ETA: 1:55 - loss: 2.5478 - regression_loss: 2.0368 - classification_loss: 0.5110 40/500 [=>............................] - ETA: 1:55 - loss: 2.5424 - regression_loss: 2.0334 - classification_loss: 0.5091 41/500 [=>............................] - ETA: 1:54 - loss: 2.5370 - regression_loss: 2.0289 - classification_loss: 0.5081 42/500 [=>............................] - ETA: 1:54 - loss: 2.5382 - regression_loss: 2.0294 - classification_loss: 0.5088 43/500 [=>............................] - ETA: 1:54 - loss: 2.5346 - regression_loss: 2.0272 - classification_loss: 0.5074 44/500 [=>............................] - ETA: 1:54 - loss: 2.5308 - regression_loss: 2.0268 - classification_loss: 0.5040 45/500 [=>............................] - ETA: 1:53 - loss: 2.5351 - regression_loss: 2.0284 - classification_loss: 0.5068 46/500 [=>............................] 
- ETA: 1:53 - loss: 2.5359 - regression_loss: 2.0324 - classification_loss: 0.5035 47/500 [=>............................] - ETA: 1:53 - loss: 2.5338 - regression_loss: 2.0332 - classification_loss: 0.5006 48/500 [=>............................] - ETA: 1:52 - loss: 2.5591 - regression_loss: 2.0573 - classification_loss: 0.5019 49/500 [=>............................] - ETA: 1:52 - loss: 2.5626 - regression_loss: 2.0611 - classification_loss: 0.5015 50/500 [==>...........................] - ETA: 1:52 - loss: 2.5652 - regression_loss: 2.0648 - classification_loss: 0.5005 51/500 [==>...........................] - ETA: 1:52 - loss: 2.5374 - regression_loss: 2.0420 - classification_loss: 0.4954 52/500 [==>...........................] - ETA: 1:51 - loss: 2.5285 - regression_loss: 2.0345 - classification_loss: 0.4940 53/500 [==>...........................] - ETA: 1:51 - loss: 2.5374 - regression_loss: 2.0346 - classification_loss: 0.5028 54/500 [==>...........................] - ETA: 1:51 - loss: 2.5375 - regression_loss: 2.0352 - classification_loss: 0.5023 55/500 [==>...........................] - ETA: 1:51 - loss: 2.5366 - regression_loss: 2.0356 - classification_loss: 0.5010 56/500 [==>...........................] - ETA: 1:50 - loss: 2.5391 - regression_loss: 2.0378 - classification_loss: 0.5014 57/500 [==>...........................] - ETA: 1:50 - loss: 2.5233 - regression_loss: 2.0257 - classification_loss: 0.4976 58/500 [==>...........................] - ETA: 1:50 - loss: 2.5240 - regression_loss: 2.0269 - classification_loss: 0.4972 59/500 [==>...........................] - ETA: 1:50 - loss: 2.5345 - regression_loss: 2.0309 - classification_loss: 0.5035 60/500 [==>...........................] - ETA: 1:50 - loss: 2.5404 - regression_loss: 2.0360 - classification_loss: 0.5044 61/500 [==>...........................] - ETA: 1:49 - loss: 2.5382 - regression_loss: 2.0341 - classification_loss: 0.5041 62/500 [==>...........................] 
- ETA: 1:49 - loss: 2.5360 - regression_loss: 2.0334 - classification_loss: 0.5025 63/500 [==>...........................] - ETA: 1:49 - loss: 2.5287 - regression_loss: 2.0281 - classification_loss: 0.5005 64/500 [==>...........................] - ETA: 1:49 - loss: 2.5340 - regression_loss: 2.0340 - classification_loss: 0.5000 65/500 [==>...........................] - ETA: 1:48 - loss: 2.5390 - regression_loss: 2.0380 - classification_loss: 0.5011 66/500 [==>...........................] - ETA: 1:48 - loss: 2.5412 - regression_loss: 2.0405 - classification_loss: 0.5007 67/500 [===>..........................] - ETA: 1:48 - loss: 2.5390 - regression_loss: 2.0394 - classification_loss: 0.4996 68/500 [===>..........................] - ETA: 1:48 - loss: 2.5271 - regression_loss: 2.0304 - classification_loss: 0.4966 69/500 [===>..........................] - ETA: 1:48 - loss: 2.5271 - regression_loss: 2.0305 - classification_loss: 0.4966 70/500 [===>..........................] - ETA: 1:47 - loss: 2.5294 - regression_loss: 2.0323 - classification_loss: 0.4971 71/500 [===>..........................] - ETA: 1:47 - loss: 2.5302 - regression_loss: 2.0340 - classification_loss: 0.4962 72/500 [===>..........................] - ETA: 1:47 - loss: 2.5363 - regression_loss: 2.0378 - classification_loss: 0.4985 73/500 [===>..........................] - ETA: 1:47 - loss: 2.5453 - regression_loss: 2.0441 - classification_loss: 0.5012 74/500 [===>..........................] - ETA: 1:46 - loss: 2.5523 - regression_loss: 2.0496 - classification_loss: 0.5027 75/500 [===>..........................] - ETA: 1:46 - loss: 2.5518 - regression_loss: 2.0506 - classification_loss: 0.5012 76/500 [===>..........................] - ETA: 1:46 - loss: 2.5507 - regression_loss: 2.0501 - classification_loss: 0.5006 77/500 [===>..........................] - ETA: 1:46 - loss: 2.5572 - regression_loss: 2.0546 - classification_loss: 0.5026 78/500 [===>..........................] 
- ETA: 1:45 - loss: 2.5576 - regression_loss: 2.0548 - classification_loss: 0.5028 79/500 [===>..........................] - ETA: 1:45 - loss: 2.5579 - regression_loss: 2.0555 - classification_loss: 0.5024 80/500 [===>..........................] - ETA: 1:45 - loss: 2.5583 - regression_loss: 2.0563 - classification_loss: 0.5020 81/500 [===>..........................] - ETA: 1:45 - loss: 2.5639 - regression_loss: 2.0592 - classification_loss: 0.5047 82/500 [===>..........................] - ETA: 1:44 - loss: 2.5574 - regression_loss: 2.0548 - classification_loss: 0.5026 83/500 [===>..........................] - ETA: 1:44 - loss: 2.5572 - regression_loss: 2.0553 - classification_loss: 0.5020 84/500 [====>.........................] - ETA: 1:44 - loss: 2.5576 - regression_loss: 2.0557 - classification_loss: 0.5019 85/500 [====>.........................] - ETA: 1:44 - loss: 2.5586 - regression_loss: 2.0574 - classification_loss: 0.5012 86/500 [====>.........................] - ETA: 1:43 - loss: 2.5629 - regression_loss: 2.0604 - classification_loss: 0.5025 87/500 [====>.........................] - ETA: 1:43 - loss: 2.5618 - regression_loss: 2.0598 - classification_loss: 0.5020 88/500 [====>.........................] - ETA: 1:43 - loss: 2.5479 - regression_loss: 2.0484 - classification_loss: 0.4995 89/500 [====>.........................] - ETA: 1:43 - loss: 2.5511 - regression_loss: 2.0520 - classification_loss: 0.4991 90/500 [====>.........................] - ETA: 1:42 - loss: 2.5506 - regression_loss: 2.0518 - classification_loss: 0.4988 91/500 [====>.........................] - ETA: 1:42 - loss: 2.5605 - regression_loss: 2.0613 - classification_loss: 0.4993 92/500 [====>.........................] - ETA: 1:42 - loss: 2.5672 - regression_loss: 2.0672 - classification_loss: 0.5000 93/500 [====>.........................] - ETA: 1:42 - loss: 2.5682 - regression_loss: 2.0667 - classification_loss: 0.5015 94/500 [====>.........................] 
- ETA: 1:42 - loss: 2.5663 - regression_loss: 2.0642 - classification_loss: 0.5021 95/500 [====>.........................] - ETA: 1:41 - loss: 2.5629 - regression_loss: 2.0614 - classification_loss: 0.5015 96/500 [====>.........................] - ETA: 1:41 - loss: 2.5585 - regression_loss: 2.0583 - classification_loss: 0.5002 97/500 [====>.........................] - ETA: 1:41 - loss: 2.5614 - regression_loss: 2.0596 - classification_loss: 0.5018 98/500 [====>.........................] - ETA: 1:40 - loss: 2.5646 - regression_loss: 2.0628 - classification_loss: 0.5019 99/500 [====>.........................] - ETA: 1:40 - loss: 2.5625 - regression_loss: 2.0620 - classification_loss: 0.5005 100/500 [=====>........................] - ETA: 1:40 - loss: 2.5643 - regression_loss: 2.0632 - classification_loss: 0.5010 101/500 [=====>........................] - ETA: 1:40 - loss: 2.5641 - regression_loss: 2.0597 - classification_loss: 0.5044 102/500 [=====>........................] - ETA: 1:39 - loss: 2.5646 - regression_loss: 2.0606 - classification_loss: 0.5040 103/500 [=====>........................] - ETA: 1:39 - loss: 2.5690 - regression_loss: 2.0645 - classification_loss: 0.5045 104/500 [=====>........................] - ETA: 1:39 - loss: 2.5651 - regression_loss: 2.0619 - classification_loss: 0.5032 105/500 [=====>........................] - ETA: 1:39 - loss: 2.5673 - regression_loss: 2.0635 - classification_loss: 0.5038 106/500 [=====>........................] - ETA: 1:38 - loss: 2.5675 - regression_loss: 2.0640 - classification_loss: 0.5035 107/500 [=====>........................] - ETA: 1:38 - loss: 2.5686 - regression_loss: 2.0633 - classification_loss: 0.5053 108/500 [=====>........................] - ETA: 1:38 - loss: 2.5666 - regression_loss: 2.0621 - classification_loss: 0.5044 109/500 [=====>........................] - ETA: 1:38 - loss: 2.5624 - regression_loss: 2.0588 - classification_loss: 0.5037 110/500 [=====>........................] 
- ETA: 1:37 - loss: 2.5647 - regression_loss: 2.0599 - classification_loss: 0.5048 111/500 [=====>........................] - ETA: 1:37 - loss: 2.5624 - regression_loss: 2.0579 - classification_loss: 0.5045 112/500 [=====>........................] - ETA: 1:37 - loss: 2.5612 - regression_loss: 2.0571 - classification_loss: 0.5041 113/500 [=====>........................] - ETA: 1:37 - loss: 2.5599 - regression_loss: 2.0556 - classification_loss: 0.5042 114/500 [=====>........................] - ETA: 1:36 - loss: 2.5664 - regression_loss: 2.0634 - classification_loss: 0.5030 115/500 [=====>........................] - ETA: 1:36 - loss: 2.5670 - regression_loss: 2.0635 - classification_loss: 0.5035 116/500 [=====>........................] - ETA: 1:36 - loss: 2.5582 - regression_loss: 2.0559 - classification_loss: 0.5022 117/500 [======>.......................] - ETA: 1:36 - loss: 2.5561 - regression_loss: 2.0545 - classification_loss: 0.5016 118/500 [======>.......................] - ETA: 1:35 - loss: 2.5610 - regression_loss: 2.0583 - classification_loss: 0.5027 119/500 [======>.......................] - ETA: 1:35 - loss: 2.5592 - regression_loss: 2.0571 - classification_loss: 0.5022 120/500 [======>.......................] - ETA: 1:35 - loss: 2.5622 - regression_loss: 2.0590 - classification_loss: 0.5031 121/500 [======>.......................] - ETA: 1:35 - loss: 2.5630 - regression_loss: 2.0591 - classification_loss: 0.5039 122/500 [======>.......................] - ETA: 1:34 - loss: 2.5651 - regression_loss: 2.0611 - classification_loss: 0.5040 123/500 [======>.......................] - ETA: 1:34 - loss: 2.5698 - regression_loss: 2.0631 - classification_loss: 0.5066 124/500 [======>.......................] - ETA: 1:34 - loss: 2.5676 - regression_loss: 2.0618 - classification_loss: 0.5058 125/500 [======>.......................] - ETA: 1:34 - loss: 2.5650 - regression_loss: 2.0600 - classification_loss: 0.5050 126/500 [======>.......................] 
- ETA: 1:33 - loss: 2.5634 - regression_loss: 2.0583 - classification_loss: 0.5052 127/500 [======>.......................] - ETA: 1:33 - loss: 2.5617 - regression_loss: 2.0569 - classification_loss: 0.5048 128/500 [======>.......................] - ETA: 1:33 - loss: 2.5602 - regression_loss: 2.0562 - classification_loss: 0.5040 129/500 [======>.......................] - ETA: 1:33 - loss: 2.5590 - regression_loss: 2.0555 - classification_loss: 0.5035 130/500 [======>.......................] - ETA: 1:32 - loss: 2.5579 - regression_loss: 2.0545 - classification_loss: 0.5034 131/500 [======>.......................] - ETA: 1:32 - loss: 2.5552 - regression_loss: 2.0520 - classification_loss: 0.5032 132/500 [======>.......................] - ETA: 1:32 - loss: 2.5542 - regression_loss: 2.0518 - classification_loss: 0.5024 133/500 [======>.......................] - ETA: 1:32 - loss: 2.5524 - regression_loss: 2.0515 - classification_loss: 0.5009 134/500 [=======>......................] - ETA: 1:31 - loss: 2.5542 - regression_loss: 2.0530 - classification_loss: 0.5012 135/500 [=======>......................] - ETA: 1:31 - loss: 2.5498 - regression_loss: 2.0489 - classification_loss: 0.5009 136/500 [=======>......................] - ETA: 1:31 - loss: 2.5486 - regression_loss: 2.0484 - classification_loss: 0.5002 137/500 [=======>......................] - ETA: 1:31 - loss: 2.5459 - regression_loss: 2.0464 - classification_loss: 0.4994 138/500 [=======>......................] - ETA: 1:30 - loss: 2.5398 - regression_loss: 2.0421 - classification_loss: 0.4976 139/500 [=======>......................] - ETA: 1:30 - loss: 2.5400 - regression_loss: 2.0431 - classification_loss: 0.4969 140/500 [=======>......................] - ETA: 1:30 - loss: 2.5495 - regression_loss: 2.0514 - classification_loss: 0.4980 141/500 [=======>......................] - ETA: 1:30 - loss: 2.5462 - regression_loss: 2.0488 - classification_loss: 0.4974 142/500 [=======>......................] 
- ETA: 1:29 - loss: 2.5456 - regression_loss: 2.0482 - classification_loss: 0.4974
150/500 [========>.....................] - ETA: 1:27 - loss: 2.5331 - regression_loss: 2.0379 - classification_loss: 0.4952
200/500 [===========>..................] - ETA: 1:15 - loss: 2.5086 - regression_loss: 2.0120 - classification_loss: 0.4967
250/500 [==============>...............] - ETA: 1:02 - loss: 2.5385 - regression_loss: 2.0346 - classification_loss: 0.5039
300/500 [=================>............] - ETA: 50s - loss: 2.5378 - regression_loss: 2.0271 - classification_loss: 0.5107
350/500 [====================>.........] - ETA: 37s - loss: 2.5405 - regression_loss: 2.0303 - classification_loss: 0.5102
400/500 [=======================>......] - ETA: 25s - loss: 2.5374 - regression_loss: 2.0278 - classification_loss: 0.5097
450/500 [==========================>...] - ETA: 12s - loss: 2.5247 - regression_loss: 2.0187 - classification_loss: 0.5060
477/500 [===========================>..] - ETA: 5s - loss: 2.5271 - regression_loss: 2.0210 - classification_loss: 0.5061
478/500 [===========================>..] 
- ETA: 5s - loss: 2.5269 - regression_loss: 2.0209 - classification_loss: 0.5060 479/500 [===========================>..] - ETA: 5s - loss: 2.5267 - regression_loss: 2.0207 - classification_loss: 0.5060 480/500 [===========================>..] - ETA: 5s - loss: 2.5262 - regression_loss: 2.0205 - classification_loss: 0.5057 481/500 [===========================>..] - ETA: 4s - loss: 2.5254 - regression_loss: 2.0200 - classification_loss: 0.5055 482/500 [===========================>..] - ETA: 4s - loss: 2.5251 - regression_loss: 2.0197 - classification_loss: 0.5054 483/500 [===========================>..] - ETA: 4s - loss: 2.5250 - regression_loss: 2.0197 - classification_loss: 0.5053 484/500 [============================>.] - ETA: 4s - loss: 2.5251 - regression_loss: 2.0200 - classification_loss: 0.5051 485/500 [============================>.] - ETA: 3s - loss: 2.5268 - regression_loss: 2.0215 - classification_loss: 0.5053 486/500 [============================>.] - ETA: 3s - loss: 2.5272 - regression_loss: 2.0220 - classification_loss: 0.5053 487/500 [============================>.] - ETA: 3s - loss: 2.5275 - regression_loss: 2.0219 - classification_loss: 0.5057 488/500 [============================>.] - ETA: 3s - loss: 2.5288 - regression_loss: 2.0233 - classification_loss: 0.5055 489/500 [============================>.] - ETA: 2s - loss: 2.5280 - regression_loss: 2.0228 - classification_loss: 0.5052 490/500 [============================>.] - ETA: 2s - loss: 2.5274 - regression_loss: 2.0224 - classification_loss: 0.5051 491/500 [============================>.] - ETA: 2s - loss: 2.5261 - regression_loss: 2.0213 - classification_loss: 0.5048 492/500 [============================>.] - ETA: 2s - loss: 2.5270 - regression_loss: 2.0219 - classification_loss: 0.5052 493/500 [============================>.] - ETA: 1s - loss: 2.5247 - regression_loss: 2.0199 - classification_loss: 0.5048 494/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 2.5247 - regression_loss: 2.0205 - classification_loss: 0.5041
1172 instances of class plum with average precision: 0.3296
mAP: 0.3296
Epoch 00013: saving model to ./training/snapshots/resnet50_pascal_13.h5
Epoch 14/150
[per-batch progress updates for steps 1–9 of epoch 14 elided; loss starting around 3.17 and settling near 2.35]
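The end-of-epoch summary line above follows the standard Keras progress-bar format, and the total loss reported here is the sum of RetinaNet's two loss terms (box regression and classification). A minimal sketch, assuming this exact line format, for extracting those metrics from such a log with a regex:

```python
import re

# Pattern for the Keras metric tail (assumed to match the log above):
# "... - loss: X - regression_loss: Y - classification_loss: Z"
LINE_RE = re.compile(
    r"loss:\s*([\d.]+)\s*-\s*regression_loss:\s*([\d.]+)"
    r"\s*-\s*classification_loss:\s*([\d.]+)"
)

def parse_losses(line: str):
    """Return (loss, regression_loss, classification_loss), or None if absent."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    return tuple(float(g) for g in m.groups())

line = ("500/500 [==============================] - 125s 250ms/step - "
        "loss: 2.5247 - regression_loss: 2.0205 - classification_loss: 0.5041")
loss, reg, cls = parse_losses(line)
# The total equals the sum of the two components, up to rounding in the log.
assert abs(loss - (reg + cls)) < 1e-2
```

Running this over every epoch's final `500/500` line gives a per-epoch loss curve without re-running training.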
[per-batch progress updates for steps 10–249 of epoch 14 elided; loss fluctuating between ~2.35 and ~2.53 (regression_loss ~1.96–2.00, classification_loss ~0.48–0.54)]
- ETA: 1:02 - loss: 2.4561 - regression_loss: 1.9607 - classification_loss: 0.4954 250/500 [==============>...............] - ETA: 1:02 - loss: 2.4572 - regression_loss: 1.9616 - classification_loss: 0.4955 251/500 [==============>...............] - ETA: 1:02 - loss: 2.4551 - regression_loss: 1.9594 - classification_loss: 0.4956 252/500 [==============>...............] - ETA: 1:01 - loss: 2.4534 - regression_loss: 1.9579 - classification_loss: 0.4955 253/500 [==============>...............] - ETA: 1:01 - loss: 2.4531 - regression_loss: 1.9581 - classification_loss: 0.4950 254/500 [==============>...............] - ETA: 1:01 - loss: 2.4530 - regression_loss: 1.9580 - classification_loss: 0.4949 255/500 [==============>...............] - ETA: 1:01 - loss: 2.4517 - regression_loss: 1.9573 - classification_loss: 0.4944 256/500 [==============>...............] - ETA: 1:00 - loss: 2.4542 - regression_loss: 1.9601 - classification_loss: 0.4941 257/500 [==============>...............] - ETA: 1:00 - loss: 2.4529 - regression_loss: 1.9594 - classification_loss: 0.4936 258/500 [==============>...............] - ETA: 1:00 - loss: 2.4529 - regression_loss: 1.9595 - classification_loss: 0.4934 259/500 [==============>...............] - ETA: 1:00 - loss: 2.4529 - regression_loss: 1.9592 - classification_loss: 0.4937 260/500 [==============>...............] - ETA: 59s - loss: 2.4520 - regression_loss: 1.9584 - classification_loss: 0.4936  261/500 [==============>...............] - ETA: 59s - loss: 2.4542 - regression_loss: 1.9599 - classification_loss: 0.4943 262/500 [==============>...............] - ETA: 59s - loss: 2.4555 - regression_loss: 1.9609 - classification_loss: 0.4946 263/500 [==============>...............] - ETA: 59s - loss: 2.4551 - regression_loss: 1.9610 - classification_loss: 0.4941 264/500 [==============>...............] - ETA: 58s - loss: 2.4576 - regression_loss: 1.9631 - classification_loss: 0.4946 265/500 [==============>...............] 
- ETA: 58s - loss: 2.4558 - regression_loss: 1.9619 - classification_loss: 0.4939 266/500 [==============>...............] - ETA: 58s - loss: 2.4561 - regression_loss: 1.9611 - classification_loss: 0.4949 267/500 [===============>..............] - ETA: 58s - loss: 2.4563 - regression_loss: 1.9617 - classification_loss: 0.4947 268/500 [===============>..............] - ETA: 57s - loss: 2.4564 - regression_loss: 1.9618 - classification_loss: 0.4947 269/500 [===============>..............] - ETA: 57s - loss: 2.4569 - regression_loss: 1.9621 - classification_loss: 0.4948 270/500 [===============>..............] - ETA: 57s - loss: 2.4567 - regression_loss: 1.9620 - classification_loss: 0.4947 271/500 [===============>..............] - ETA: 57s - loss: 2.4565 - regression_loss: 1.9620 - classification_loss: 0.4945 272/500 [===============>..............] - ETA: 56s - loss: 2.4566 - regression_loss: 1.9624 - classification_loss: 0.4943 273/500 [===============>..............] - ETA: 56s - loss: 2.4567 - regression_loss: 1.9627 - classification_loss: 0.4941 274/500 [===============>..............] - ETA: 56s - loss: 2.4564 - regression_loss: 1.9627 - classification_loss: 0.4937 275/500 [===============>..............] - ETA: 56s - loss: 2.4573 - regression_loss: 1.9634 - classification_loss: 0.4939 276/500 [===============>..............] - ETA: 55s - loss: 2.4566 - regression_loss: 1.9628 - classification_loss: 0.4938 277/500 [===============>..............] - ETA: 55s - loss: 2.4557 - regression_loss: 1.9613 - classification_loss: 0.4944 278/500 [===============>..............] - ETA: 55s - loss: 2.4550 - regression_loss: 1.9611 - classification_loss: 0.4939 279/500 [===============>..............] - ETA: 55s - loss: 2.4560 - regression_loss: 1.9620 - classification_loss: 0.4941 280/500 [===============>..............] - ETA: 54s - loss: 2.4581 - regression_loss: 1.9631 - classification_loss: 0.4950 281/500 [===============>..............] 
- ETA: 54s - loss: 2.4586 - regression_loss: 1.9635 - classification_loss: 0.4951 282/500 [===============>..............] - ETA: 54s - loss: 2.4582 - regression_loss: 1.9630 - classification_loss: 0.4951 283/500 [===============>..............] - ETA: 54s - loss: 2.4576 - regression_loss: 1.9627 - classification_loss: 0.4949 284/500 [================>.............] - ETA: 53s - loss: 2.4582 - regression_loss: 1.9632 - classification_loss: 0.4950 285/500 [================>.............] - ETA: 53s - loss: 2.4600 - regression_loss: 1.9649 - classification_loss: 0.4950 286/500 [================>.............] - ETA: 53s - loss: 2.4602 - regression_loss: 1.9657 - classification_loss: 0.4945 287/500 [================>.............] - ETA: 53s - loss: 2.4610 - regression_loss: 1.9664 - classification_loss: 0.4947 288/500 [================>.............] - ETA: 52s - loss: 2.4624 - regression_loss: 1.9672 - classification_loss: 0.4952 289/500 [================>.............] - ETA: 52s - loss: 2.4601 - regression_loss: 1.9655 - classification_loss: 0.4946 290/500 [================>.............] - ETA: 52s - loss: 2.4614 - regression_loss: 1.9666 - classification_loss: 0.4948 291/500 [================>.............] - ETA: 52s - loss: 2.4621 - regression_loss: 1.9673 - classification_loss: 0.4948 292/500 [================>.............] - ETA: 51s - loss: 2.4614 - regression_loss: 1.9670 - classification_loss: 0.4945 293/500 [================>.............] - ETA: 51s - loss: 2.4613 - regression_loss: 1.9673 - classification_loss: 0.4941 294/500 [================>.............] - ETA: 51s - loss: 2.4622 - regression_loss: 1.9682 - classification_loss: 0.4940 295/500 [================>.............] - ETA: 51s - loss: 2.4619 - regression_loss: 1.9681 - classification_loss: 0.4938 296/500 [================>.............] - ETA: 50s - loss: 2.4620 - regression_loss: 1.9683 - classification_loss: 0.4938 297/500 [================>.............] 
- ETA: 50s - loss: 2.4619 - regression_loss: 1.9684 - classification_loss: 0.4935 298/500 [================>.............] - ETA: 50s - loss: 2.4604 - regression_loss: 1.9672 - classification_loss: 0.4932 299/500 [================>.............] - ETA: 50s - loss: 2.4606 - regression_loss: 1.9677 - classification_loss: 0.4929 300/500 [=================>............] - ETA: 49s - loss: 2.4624 - regression_loss: 1.9693 - classification_loss: 0.4931 301/500 [=================>............] - ETA: 49s - loss: 2.4664 - regression_loss: 1.9726 - classification_loss: 0.4938 302/500 [=================>............] - ETA: 49s - loss: 2.4662 - regression_loss: 1.9724 - classification_loss: 0.4938 303/500 [=================>............] - ETA: 49s - loss: 2.4659 - regression_loss: 1.9724 - classification_loss: 0.4935 304/500 [=================>............] - ETA: 48s - loss: 2.4684 - regression_loss: 1.9742 - classification_loss: 0.4942 305/500 [=================>............] - ETA: 48s - loss: 2.4665 - regression_loss: 1.9728 - classification_loss: 0.4937 306/500 [=================>............] - ETA: 48s - loss: 2.4691 - regression_loss: 1.9753 - classification_loss: 0.4938 307/500 [=================>............] - ETA: 48s - loss: 2.4709 - regression_loss: 1.9763 - classification_loss: 0.4946 308/500 [=================>............] - ETA: 47s - loss: 2.4696 - regression_loss: 1.9753 - classification_loss: 0.4943 309/500 [=================>............] - ETA: 47s - loss: 2.4692 - regression_loss: 1.9753 - classification_loss: 0.4939 310/500 [=================>............] - ETA: 47s - loss: 2.4688 - regression_loss: 1.9748 - classification_loss: 0.4940 311/500 [=================>............] - ETA: 47s - loss: 2.4686 - regression_loss: 1.9747 - classification_loss: 0.4940 312/500 [=================>............] - ETA: 46s - loss: 2.4681 - regression_loss: 1.9743 - classification_loss: 0.4937 313/500 [=================>............] 
- ETA: 46s - loss: 2.4665 - regression_loss: 1.9734 - classification_loss: 0.4932 314/500 [=================>............] - ETA: 46s - loss: 2.4661 - regression_loss: 1.9732 - classification_loss: 0.4929 315/500 [=================>............] - ETA: 46s - loss: 2.4678 - regression_loss: 1.9745 - classification_loss: 0.4933 316/500 [=================>............] - ETA: 45s - loss: 2.4675 - regression_loss: 1.9743 - classification_loss: 0.4932 317/500 [==================>...........] - ETA: 45s - loss: 2.4673 - regression_loss: 1.9742 - classification_loss: 0.4931 318/500 [==================>...........] - ETA: 45s - loss: 2.4676 - regression_loss: 1.9746 - classification_loss: 0.4930 319/500 [==================>...........] - ETA: 45s - loss: 2.4695 - regression_loss: 1.9764 - classification_loss: 0.4931 320/500 [==================>...........] - ETA: 44s - loss: 2.4695 - regression_loss: 1.9766 - classification_loss: 0.4929 321/500 [==================>...........] - ETA: 44s - loss: 2.4705 - regression_loss: 1.9774 - classification_loss: 0.4931 322/500 [==================>...........] - ETA: 44s - loss: 2.4689 - regression_loss: 1.9764 - classification_loss: 0.4926 323/500 [==================>...........] - ETA: 44s - loss: 2.4692 - regression_loss: 1.9768 - classification_loss: 0.4924 324/500 [==================>...........] - ETA: 43s - loss: 2.4693 - regression_loss: 1.9769 - classification_loss: 0.4925 325/500 [==================>...........] - ETA: 43s - loss: 2.4675 - regression_loss: 1.9758 - classification_loss: 0.4918 326/500 [==================>...........] - ETA: 43s - loss: 2.4668 - regression_loss: 1.9746 - classification_loss: 0.4922 327/500 [==================>...........] - ETA: 43s - loss: 2.4673 - regression_loss: 1.9751 - classification_loss: 0.4922 328/500 [==================>...........] - ETA: 42s - loss: 2.4682 - regression_loss: 1.9758 - classification_loss: 0.4923 329/500 [==================>...........] 
- ETA: 42s - loss: 2.4676 - regression_loss: 1.9754 - classification_loss: 0.4922 330/500 [==================>...........] - ETA: 42s - loss: 2.4670 - regression_loss: 1.9751 - classification_loss: 0.4918 331/500 [==================>...........] - ETA: 42s - loss: 2.4659 - regression_loss: 1.9745 - classification_loss: 0.4914 332/500 [==================>...........] - ETA: 41s - loss: 2.4683 - regression_loss: 1.9759 - classification_loss: 0.4924 333/500 [==================>...........] - ETA: 41s - loss: 2.4684 - regression_loss: 1.9759 - classification_loss: 0.4925 334/500 [===================>..........] - ETA: 41s - loss: 2.4680 - regression_loss: 1.9758 - classification_loss: 0.4922 335/500 [===================>..........] - ETA: 41s - loss: 2.4649 - regression_loss: 1.9731 - classification_loss: 0.4918 336/500 [===================>..........] - ETA: 40s - loss: 2.4688 - regression_loss: 1.9767 - classification_loss: 0.4921 337/500 [===================>..........] - ETA: 40s - loss: 2.4669 - regression_loss: 1.9751 - classification_loss: 0.4918 338/500 [===================>..........] - ETA: 40s - loss: 2.4651 - regression_loss: 1.9739 - classification_loss: 0.4913 339/500 [===================>..........] - ETA: 40s - loss: 2.4639 - regression_loss: 1.9731 - classification_loss: 0.4908 340/500 [===================>..........] - ETA: 39s - loss: 2.4637 - regression_loss: 1.9731 - classification_loss: 0.4906 341/500 [===================>..........] - ETA: 39s - loss: 2.4638 - regression_loss: 1.9729 - classification_loss: 0.4909 342/500 [===================>..........] - ETA: 39s - loss: 2.4644 - regression_loss: 1.9737 - classification_loss: 0.4908 343/500 [===================>..........] - ETA: 39s - loss: 2.4651 - regression_loss: 1.9740 - classification_loss: 0.4911 344/500 [===================>..........] - ETA: 38s - loss: 2.4652 - regression_loss: 1.9740 - classification_loss: 0.4912 345/500 [===================>..........] 
- ETA: 38s - loss: 2.4653 - regression_loss: 1.9741 - classification_loss: 0.4911 346/500 [===================>..........] - ETA: 38s - loss: 2.4653 - regression_loss: 1.9741 - classification_loss: 0.4912 347/500 [===================>..........] - ETA: 38s - loss: 2.4642 - regression_loss: 1.9733 - classification_loss: 0.4909 348/500 [===================>..........] - ETA: 37s - loss: 2.4629 - regression_loss: 1.9723 - classification_loss: 0.4906 349/500 [===================>..........] - ETA: 37s - loss: 2.4628 - regression_loss: 1.9720 - classification_loss: 0.4908 350/500 [====================>.........] - ETA: 37s - loss: 2.4601 - regression_loss: 1.9698 - classification_loss: 0.4903 351/500 [====================>.........] - ETA: 37s - loss: 2.4599 - regression_loss: 1.9698 - classification_loss: 0.4901 352/500 [====================>.........] - ETA: 36s - loss: 2.4610 - regression_loss: 1.9703 - classification_loss: 0.4907 353/500 [====================>.........] - ETA: 36s - loss: 2.4605 - regression_loss: 1.9701 - classification_loss: 0.4904 354/500 [====================>.........] - ETA: 36s - loss: 2.4598 - regression_loss: 1.9698 - classification_loss: 0.4900 355/500 [====================>.........] - ETA: 36s - loss: 2.4600 - regression_loss: 1.9698 - classification_loss: 0.4902 356/500 [====================>.........] - ETA: 35s - loss: 2.4586 - regression_loss: 1.9685 - classification_loss: 0.4901 357/500 [====================>.........] - ETA: 35s - loss: 2.4594 - regression_loss: 1.9691 - classification_loss: 0.4903 358/500 [====================>.........] - ETA: 35s - loss: 2.4572 - regression_loss: 1.9674 - classification_loss: 0.4898 359/500 [====================>.........] - ETA: 35s - loss: 2.4582 - regression_loss: 1.9684 - classification_loss: 0.4898 360/500 [====================>.........] - ETA: 34s - loss: 2.4586 - regression_loss: 1.9688 - classification_loss: 0.4899 361/500 [====================>.........] 
- ETA: 34s - loss: 2.4584 - regression_loss: 1.9685 - classification_loss: 0.4898 362/500 [====================>.........] - ETA: 34s - loss: 2.4581 - regression_loss: 1.9679 - classification_loss: 0.4902 363/500 [====================>.........] - ETA: 34s - loss: 2.4598 - regression_loss: 1.9695 - classification_loss: 0.4903 364/500 [====================>.........] - ETA: 33s - loss: 2.4614 - regression_loss: 1.9712 - classification_loss: 0.4902 365/500 [====================>.........] - ETA: 33s - loss: 2.4608 - regression_loss: 1.9710 - classification_loss: 0.4898 366/500 [====================>.........] - ETA: 33s - loss: 2.4615 - regression_loss: 1.9715 - classification_loss: 0.4900 367/500 [=====================>........] - ETA: 33s - loss: 2.4615 - regression_loss: 1.9717 - classification_loss: 0.4898 368/500 [=====================>........] - ETA: 32s - loss: 2.4612 - regression_loss: 1.9716 - classification_loss: 0.4896 369/500 [=====================>........] - ETA: 32s - loss: 2.4608 - regression_loss: 1.9714 - classification_loss: 0.4894 370/500 [=====================>........] - ETA: 32s - loss: 2.4637 - regression_loss: 1.9738 - classification_loss: 0.4899 371/500 [=====================>........] - ETA: 32s - loss: 2.4638 - regression_loss: 1.9741 - classification_loss: 0.4897 372/500 [=====================>........] - ETA: 31s - loss: 2.4658 - regression_loss: 1.9756 - classification_loss: 0.4901 373/500 [=====================>........] - ETA: 31s - loss: 2.4713 - regression_loss: 1.9747 - classification_loss: 0.4965 374/500 [=====================>........] - ETA: 31s - loss: 2.4725 - regression_loss: 1.9760 - classification_loss: 0.4965 375/500 [=====================>........] - ETA: 31s - loss: 2.4727 - regression_loss: 1.9763 - classification_loss: 0.4964 376/500 [=====================>........] - ETA: 30s - loss: 2.4756 - regression_loss: 1.9787 - classification_loss: 0.4970 377/500 [=====================>........] 
- ETA: 30s - loss: 2.4764 - regression_loss: 1.9794 - classification_loss: 0.4970 378/500 [=====================>........] - ETA: 30s - loss: 2.4754 - regression_loss: 1.9790 - classification_loss: 0.4964 379/500 [=====================>........] - ETA: 30s - loss: 2.4764 - regression_loss: 1.9798 - classification_loss: 0.4966 380/500 [=====================>........] - ETA: 29s - loss: 2.4757 - regression_loss: 1.9793 - classification_loss: 0.4964 381/500 [=====================>........] - ETA: 29s - loss: 2.4756 - regression_loss: 1.9793 - classification_loss: 0.4964 382/500 [=====================>........] - ETA: 29s - loss: 2.4752 - regression_loss: 1.9786 - classification_loss: 0.4966 383/500 [=====================>........] - ETA: 29s - loss: 2.4743 - regression_loss: 1.9779 - classification_loss: 0.4964 384/500 [======================>.......] - ETA: 28s - loss: 2.4742 - regression_loss: 1.9780 - classification_loss: 0.4962 385/500 [======================>.......] - ETA: 28s - loss: 2.4742 - regression_loss: 1.9782 - classification_loss: 0.4959 386/500 [======================>.......] - ETA: 28s - loss: 2.4756 - regression_loss: 1.9793 - classification_loss: 0.4963 387/500 [======================>.......] - ETA: 28s - loss: 2.4757 - regression_loss: 1.9796 - classification_loss: 0.4961 388/500 [======================>.......] - ETA: 27s - loss: 2.4751 - regression_loss: 1.9791 - classification_loss: 0.4960 389/500 [======================>.......] - ETA: 27s - loss: 2.4743 - regression_loss: 1.9783 - classification_loss: 0.4960 390/500 [======================>.......] - ETA: 27s - loss: 2.4775 - regression_loss: 1.9810 - classification_loss: 0.4965 391/500 [======================>.......] - ETA: 27s - loss: 2.4762 - regression_loss: 1.9800 - classification_loss: 0.4962 392/500 [======================>.......] - ETA: 26s - loss: 2.4752 - regression_loss: 1.9791 - classification_loss: 0.4960 393/500 [======================>.......] 
- ETA: 26s - loss: 2.4750 - regression_loss: 1.9787 - classification_loss: 0.4963 394/500 [======================>.......] - ETA: 26s - loss: 2.4740 - regression_loss: 1.9774 - classification_loss: 0.4967 395/500 [======================>.......] - ETA: 26s - loss: 2.4742 - regression_loss: 1.9775 - classification_loss: 0.4968 396/500 [======================>.......] - ETA: 25s - loss: 2.4735 - regression_loss: 1.9770 - classification_loss: 0.4964 397/500 [======================>.......] - ETA: 25s - loss: 2.4700 - regression_loss: 1.9742 - classification_loss: 0.4958 398/500 [======================>.......] - ETA: 25s - loss: 2.4696 - regression_loss: 1.9740 - classification_loss: 0.4955 399/500 [======================>.......] - ETA: 25s - loss: 2.4700 - regression_loss: 1.9743 - classification_loss: 0.4956 400/500 [=======================>......] - ETA: 24s - loss: 2.4697 - regression_loss: 1.9740 - classification_loss: 0.4956 401/500 [=======================>......] - ETA: 24s - loss: 2.4700 - regression_loss: 1.9742 - classification_loss: 0.4958 402/500 [=======================>......] - ETA: 24s - loss: 2.4718 - regression_loss: 1.9755 - classification_loss: 0.4963 403/500 [=======================>......] - ETA: 24s - loss: 2.4744 - regression_loss: 1.9777 - classification_loss: 0.4967 404/500 [=======================>......] - ETA: 23s - loss: 2.4749 - regression_loss: 1.9782 - classification_loss: 0.4967 405/500 [=======================>......] - ETA: 23s - loss: 2.4755 - regression_loss: 1.9788 - classification_loss: 0.4967 406/500 [=======================>......] - ETA: 23s - loss: 2.4756 - regression_loss: 1.9788 - classification_loss: 0.4968 407/500 [=======================>......] - ETA: 23s - loss: 2.4758 - regression_loss: 1.9792 - classification_loss: 0.4966 408/500 [=======================>......] - ETA: 22s - loss: 2.4764 - regression_loss: 1.9795 - classification_loss: 0.4969 409/500 [=======================>......] 
- ETA: 22s - loss: 2.4783 - regression_loss: 1.9810 - classification_loss: 0.4973 410/500 [=======================>......] - ETA: 22s - loss: 2.4785 - regression_loss: 1.9814 - classification_loss: 0.4972 411/500 [=======================>......] - ETA: 22s - loss: 2.4787 - regression_loss: 1.9815 - classification_loss: 0.4972 412/500 [=======================>......] - ETA: 21s - loss: 2.4793 - regression_loss: 1.9821 - classification_loss: 0.4972 413/500 [=======================>......] - ETA: 21s - loss: 2.4785 - regression_loss: 1.9815 - classification_loss: 0.4970 414/500 [=======================>......] - ETA: 21s - loss: 2.4761 - regression_loss: 1.9795 - classification_loss: 0.4966 415/500 [=======================>......] - ETA: 21s - loss: 2.4757 - regression_loss: 1.9794 - classification_loss: 0.4962 416/500 [=======================>......] - ETA: 20s - loss: 2.4766 - regression_loss: 1.9805 - classification_loss: 0.4961 417/500 [========================>.....] - ETA: 20s - loss: 2.4773 - regression_loss: 1.9810 - classification_loss: 0.4963 418/500 [========================>.....] - ETA: 20s - loss: 2.4783 - regression_loss: 1.9815 - classification_loss: 0.4968 419/500 [========================>.....] - ETA: 20s - loss: 2.4788 - regression_loss: 1.9819 - classification_loss: 0.4969 420/500 [========================>.....] - ETA: 19s - loss: 2.4780 - regression_loss: 1.9815 - classification_loss: 0.4965 421/500 [========================>.....] - ETA: 19s - loss: 2.4782 - regression_loss: 1.9818 - classification_loss: 0.4964 422/500 [========================>.....] - ETA: 19s - loss: 2.4775 - regression_loss: 1.9814 - classification_loss: 0.4961 423/500 [========================>.....] - ETA: 19s - loss: 2.4801 - regression_loss: 1.9836 - classification_loss: 0.4965 424/500 [========================>.....] - ETA: 18s - loss: 2.4814 - regression_loss: 1.9843 - classification_loss: 0.4970 425/500 [========================>.....] 
- ETA: 18s - loss: 2.4817 - regression_loss: 1.9844 - classification_loss: 0.4973 426/500 [========================>.....] - ETA: 18s - loss: 2.4822 - regression_loss: 1.9848 - classification_loss: 0.4974 427/500 [========================>.....] - ETA: 18s - loss: 2.4825 - regression_loss: 1.9850 - classification_loss: 0.4974 428/500 [========================>.....] - ETA: 17s - loss: 2.4820 - regression_loss: 1.9843 - classification_loss: 0.4977 429/500 [========================>.....] - ETA: 17s - loss: 2.4824 - regression_loss: 1.9847 - classification_loss: 0.4977 430/500 [========================>.....] - ETA: 17s - loss: 2.4819 - regression_loss: 1.9845 - classification_loss: 0.4975 431/500 [========================>.....] - ETA: 17s - loss: 2.4822 - regression_loss: 1.9848 - classification_loss: 0.4973 432/500 [========================>.....] - ETA: 16s - loss: 2.4816 - regression_loss: 1.9844 - classification_loss: 0.4972 433/500 [========================>.....] - ETA: 16s - loss: 2.4808 - regression_loss: 1.9839 - classification_loss: 0.4969 434/500 [=========================>....] - ETA: 16s - loss: 2.4802 - regression_loss: 1.9833 - classification_loss: 0.4969 435/500 [=========================>....] - ETA: 16s - loss: 2.4791 - regression_loss: 1.9826 - classification_loss: 0.4965 436/500 [=========================>....] - ETA: 15s - loss: 2.4783 - regression_loss: 1.9820 - classification_loss: 0.4963 437/500 [=========================>....] - ETA: 15s - loss: 2.4793 - regression_loss: 1.9830 - classification_loss: 0.4963 438/500 [=========================>....] - ETA: 15s - loss: 2.4783 - regression_loss: 1.9821 - classification_loss: 0.4962 439/500 [=========================>....] - ETA: 15s - loss: 2.4782 - regression_loss: 1.9821 - classification_loss: 0.4961 440/500 [=========================>....] - ETA: 14s - loss: 2.4786 - regression_loss: 1.9824 - classification_loss: 0.4962 441/500 [=========================>....] 
- ETA: 14s - loss: 2.4794 - regression_loss: 1.9831 - classification_loss: 0.4963 442/500 [=========================>....] - ETA: 14s - loss: 2.4796 - regression_loss: 1.9834 - classification_loss: 0.4962 443/500 [=========================>....] - ETA: 14s - loss: 2.4798 - regression_loss: 1.9836 - classification_loss: 0.4962 444/500 [=========================>....] - ETA: 13s - loss: 2.4782 - regression_loss: 1.9824 - classification_loss: 0.4959 445/500 [=========================>....] - ETA: 13s - loss: 2.4775 - regression_loss: 1.9819 - classification_loss: 0.4956 446/500 [=========================>....] - ETA: 13s - loss: 2.4779 - regression_loss: 1.9825 - classification_loss: 0.4954 447/500 [=========================>....] - ETA: 13s - loss: 2.4777 - regression_loss: 1.9825 - classification_loss: 0.4952 448/500 [=========================>....] - ETA: 12s - loss: 2.4777 - regression_loss: 1.9826 - classification_loss: 0.4951 449/500 [=========================>....] - ETA: 12s - loss: 2.4788 - regression_loss: 1.9830 - classification_loss: 0.4958 450/500 [==========================>...] - ETA: 12s - loss: 2.4777 - regression_loss: 1.9822 - classification_loss: 0.4955 451/500 [==========================>...] - ETA: 12s - loss: 2.4773 - regression_loss: 1.9818 - classification_loss: 0.4955 452/500 [==========================>...] - ETA: 11s - loss: 2.4777 - regression_loss: 1.9824 - classification_loss: 0.4953 453/500 [==========================>...] - ETA: 11s - loss: 2.4777 - regression_loss: 1.9823 - classification_loss: 0.4954 454/500 [==========================>...] - ETA: 11s - loss: 2.4772 - regression_loss: 1.9820 - classification_loss: 0.4952 455/500 [==========================>...] - ETA: 11s - loss: 2.4763 - regression_loss: 1.9814 - classification_loss: 0.4949 456/500 [==========================>...] - ETA: 10s - loss: 2.4750 - regression_loss: 1.9805 - classification_loss: 0.4944 457/500 [==========================>...] 
[per-batch progress-bar updates for epoch 14 (batches 458-499) elided]
500/500 [==============================] - 125s 250ms/step - loss: 2.4735 - regression_loss: 1.9779 - classification_loss: 0.4956
1172 instances of class plum with average precision: 0.3449
mAP: 0.3449
Epoch 00014: saving model to ./training/snapshots/resnet50_pascal_14.h5
Epoch 15/150
[per-batch progress-bar updates for epoch 15 elided]
- ETA: 52s - loss: 2.4514 - regression_loss: 1.9566 - classification_loss: 0.4949 293/500 [================>.............] - ETA: 51s - loss: 2.4531 - regression_loss: 1.9582 - classification_loss: 0.4949 294/500 [================>.............] - ETA: 51s - loss: 2.4528 - regression_loss: 1.9579 - classification_loss: 0.4949 295/500 [================>.............] - ETA: 51s - loss: 2.4517 - regression_loss: 1.9572 - classification_loss: 0.4945 296/500 [================>.............] - ETA: 51s - loss: 2.4535 - regression_loss: 1.9592 - classification_loss: 0.4943 297/500 [================>.............] - ETA: 50s - loss: 2.4526 - regression_loss: 1.9583 - classification_loss: 0.4943 298/500 [================>.............] - ETA: 50s - loss: 2.4523 - regression_loss: 1.9581 - classification_loss: 0.4942 299/500 [================>.............] - ETA: 50s - loss: 2.4505 - regression_loss: 1.9566 - classification_loss: 0.4939 300/500 [=================>............] - ETA: 50s - loss: 2.4505 - regression_loss: 1.9562 - classification_loss: 0.4943 301/500 [=================>............] - ETA: 49s - loss: 2.4522 - regression_loss: 1.9575 - classification_loss: 0.4946 302/500 [=================>............] - ETA: 49s - loss: 2.4532 - regression_loss: 1.9583 - classification_loss: 0.4949 303/500 [=================>............] - ETA: 49s - loss: 2.4549 - regression_loss: 1.9604 - classification_loss: 0.4945 304/500 [=================>............] - ETA: 49s - loss: 2.4520 - regression_loss: 1.9582 - classification_loss: 0.4938 305/500 [=================>............] - ETA: 48s - loss: 2.4513 - regression_loss: 1.9578 - classification_loss: 0.4935 306/500 [=================>............] - ETA: 48s - loss: 2.4485 - regression_loss: 1.9551 - classification_loss: 0.4934 307/500 [=================>............] - ETA: 48s - loss: 2.4513 - regression_loss: 1.9572 - classification_loss: 0.4941 308/500 [=================>............] 
- ETA: 48s - loss: 2.4512 - regression_loss: 1.9570 - classification_loss: 0.4941 309/500 [=================>............] - ETA: 47s - loss: 2.4520 - regression_loss: 1.9578 - classification_loss: 0.4943 310/500 [=================>............] - ETA: 47s - loss: 2.4532 - regression_loss: 1.9584 - classification_loss: 0.4948 311/500 [=================>............] - ETA: 47s - loss: 2.4525 - regression_loss: 1.9580 - classification_loss: 0.4945 312/500 [=================>............] - ETA: 47s - loss: 2.4541 - regression_loss: 1.9595 - classification_loss: 0.4946 313/500 [=================>............] - ETA: 46s - loss: 2.4551 - regression_loss: 1.9605 - classification_loss: 0.4946 314/500 [=================>............] - ETA: 46s - loss: 2.4543 - regression_loss: 1.9600 - classification_loss: 0.4943 315/500 [=================>............] - ETA: 46s - loss: 2.4541 - regression_loss: 1.9599 - classification_loss: 0.4941 316/500 [=================>............] - ETA: 46s - loss: 2.4536 - regression_loss: 1.9598 - classification_loss: 0.4938 317/500 [==================>...........] - ETA: 45s - loss: 2.4538 - regression_loss: 1.9599 - classification_loss: 0.4940 318/500 [==================>...........] - ETA: 45s - loss: 2.4567 - regression_loss: 1.9619 - classification_loss: 0.4948 319/500 [==================>...........] - ETA: 45s - loss: 2.4578 - regression_loss: 1.9627 - classification_loss: 0.4951 320/500 [==================>...........] - ETA: 45s - loss: 2.4570 - regression_loss: 1.9623 - classification_loss: 0.4947 321/500 [==================>...........] - ETA: 44s - loss: 2.4550 - regression_loss: 1.9606 - classification_loss: 0.4943 322/500 [==================>...........] - ETA: 44s - loss: 2.4550 - regression_loss: 1.9603 - classification_loss: 0.4947 323/500 [==================>...........] - ETA: 44s - loss: 2.4538 - regression_loss: 1.9594 - classification_loss: 0.4944 324/500 [==================>...........] 
- ETA: 44s - loss: 2.4539 - regression_loss: 1.9597 - classification_loss: 0.4943 325/500 [==================>...........] - ETA: 43s - loss: 2.4536 - regression_loss: 1.9595 - classification_loss: 0.4941 326/500 [==================>...........] - ETA: 43s - loss: 2.4516 - regression_loss: 1.9580 - classification_loss: 0.4936 327/500 [==================>...........] - ETA: 43s - loss: 2.4510 - regression_loss: 1.9576 - classification_loss: 0.4934 328/500 [==================>...........] - ETA: 43s - loss: 2.4523 - regression_loss: 1.9591 - classification_loss: 0.4933 329/500 [==================>...........] - ETA: 42s - loss: 2.4509 - regression_loss: 1.9580 - classification_loss: 0.4930 330/500 [==================>...........] - ETA: 42s - loss: 2.4495 - regression_loss: 1.9569 - classification_loss: 0.4927 331/500 [==================>...........] - ETA: 42s - loss: 2.4499 - regression_loss: 1.9574 - classification_loss: 0.4925 332/500 [==================>...........] - ETA: 42s - loss: 2.4490 - regression_loss: 1.9568 - classification_loss: 0.4922 333/500 [==================>...........] - ETA: 41s - loss: 2.4464 - regression_loss: 1.9549 - classification_loss: 0.4915 334/500 [===================>..........] - ETA: 41s - loss: 2.4448 - regression_loss: 1.9536 - classification_loss: 0.4911 335/500 [===================>..........] - ETA: 41s - loss: 2.4459 - regression_loss: 1.9545 - classification_loss: 0.4914 336/500 [===================>..........] - ETA: 41s - loss: 2.4470 - regression_loss: 1.9553 - classification_loss: 0.4918 337/500 [===================>..........] - ETA: 40s - loss: 2.4463 - regression_loss: 1.9549 - classification_loss: 0.4914 338/500 [===================>..........] - ETA: 40s - loss: 2.4460 - regression_loss: 1.9550 - classification_loss: 0.4910 339/500 [===================>..........] - ETA: 40s - loss: 2.4452 - regression_loss: 1.9544 - classification_loss: 0.4908 340/500 [===================>..........] 
- ETA: 40s - loss: 2.4453 - regression_loss: 1.9547 - classification_loss: 0.4906 341/500 [===================>..........] - ETA: 39s - loss: 2.4426 - regression_loss: 1.9525 - classification_loss: 0.4901 342/500 [===================>..........] - ETA: 39s - loss: 2.4432 - regression_loss: 1.9532 - classification_loss: 0.4900 343/500 [===================>..........] - ETA: 39s - loss: 2.4428 - regression_loss: 1.9529 - classification_loss: 0.4898 344/500 [===================>..........] - ETA: 39s - loss: 2.4446 - regression_loss: 1.9542 - classification_loss: 0.4904 345/500 [===================>..........] - ETA: 38s - loss: 2.4440 - regression_loss: 1.9537 - classification_loss: 0.4902 346/500 [===================>..........] - ETA: 38s - loss: 2.4446 - regression_loss: 1.9540 - classification_loss: 0.4906 347/500 [===================>..........] - ETA: 38s - loss: 2.4449 - regression_loss: 1.9544 - classification_loss: 0.4905 348/500 [===================>..........] - ETA: 38s - loss: 2.4444 - regression_loss: 1.9540 - classification_loss: 0.4903 349/500 [===================>..........] - ETA: 37s - loss: 2.4459 - regression_loss: 1.9551 - classification_loss: 0.4908 350/500 [====================>.........] - ETA: 37s - loss: 2.4455 - regression_loss: 1.9548 - classification_loss: 0.4907 351/500 [====================>.........] - ETA: 37s - loss: 2.4463 - regression_loss: 1.9555 - classification_loss: 0.4908 352/500 [====================>.........] - ETA: 37s - loss: 2.4430 - regression_loss: 1.9530 - classification_loss: 0.4900 353/500 [====================>.........] - ETA: 36s - loss: 2.4397 - regression_loss: 1.9504 - classification_loss: 0.4893 354/500 [====================>.........] - ETA: 36s - loss: 2.4424 - regression_loss: 1.9527 - classification_loss: 0.4897 355/500 [====================>.........] - ETA: 36s - loss: 2.4421 - regression_loss: 1.9526 - classification_loss: 0.4896 356/500 [====================>.........] 
- ETA: 36s - loss: 2.4420 - regression_loss: 1.9527 - classification_loss: 0.4893 357/500 [====================>.........] - ETA: 35s - loss: 2.4408 - regression_loss: 1.9518 - classification_loss: 0.4889 358/500 [====================>.........] - ETA: 35s - loss: 2.4413 - regression_loss: 1.9524 - classification_loss: 0.4889 359/500 [====================>.........] - ETA: 35s - loss: 2.4403 - regression_loss: 1.9515 - classification_loss: 0.4888 360/500 [====================>.........] - ETA: 35s - loss: 2.4417 - regression_loss: 1.9527 - classification_loss: 0.4890 361/500 [====================>.........] - ETA: 34s - loss: 2.4428 - regression_loss: 1.9538 - classification_loss: 0.4891 362/500 [====================>.........] - ETA: 34s - loss: 2.4427 - regression_loss: 1.9537 - classification_loss: 0.4890 363/500 [====================>.........] - ETA: 34s - loss: 2.4431 - regression_loss: 1.9538 - classification_loss: 0.4893 364/500 [====================>.........] - ETA: 34s - loss: 2.4437 - regression_loss: 1.9542 - classification_loss: 0.4895 365/500 [====================>.........] - ETA: 33s - loss: 2.4428 - regression_loss: 1.9534 - classification_loss: 0.4894 366/500 [====================>.........] - ETA: 33s - loss: 2.4434 - regression_loss: 1.9543 - classification_loss: 0.4891 367/500 [=====================>........] - ETA: 33s - loss: 2.4421 - regression_loss: 1.9535 - classification_loss: 0.4886 368/500 [=====================>........] - ETA: 33s - loss: 2.4414 - regression_loss: 1.9529 - classification_loss: 0.4885 369/500 [=====================>........] - ETA: 32s - loss: 2.4415 - regression_loss: 1.9531 - classification_loss: 0.4885 370/500 [=====================>........] - ETA: 32s - loss: 2.4423 - regression_loss: 1.9539 - classification_loss: 0.4883 371/500 [=====================>........] - ETA: 32s - loss: 2.4411 - regression_loss: 1.9532 - classification_loss: 0.4879 372/500 [=====================>........] 
- ETA: 32s - loss: 2.4422 - regression_loss: 1.9539 - classification_loss: 0.4882 373/500 [=====================>........] - ETA: 31s - loss: 2.4411 - regression_loss: 1.9530 - classification_loss: 0.4881 374/500 [=====================>........] - ETA: 31s - loss: 2.4406 - regression_loss: 1.9527 - classification_loss: 0.4879 375/500 [=====================>........] - ETA: 31s - loss: 2.4405 - regression_loss: 1.9524 - classification_loss: 0.4880 376/500 [=====================>........] - ETA: 31s - loss: 2.4402 - regression_loss: 1.9523 - classification_loss: 0.4879 377/500 [=====================>........] - ETA: 30s - loss: 2.4441 - regression_loss: 1.9524 - classification_loss: 0.4917 378/500 [=====================>........] - ETA: 30s - loss: 2.4431 - regression_loss: 1.9518 - classification_loss: 0.4913 379/500 [=====================>........] - ETA: 30s - loss: 2.4437 - regression_loss: 1.9525 - classification_loss: 0.4912 380/500 [=====================>........] - ETA: 30s - loss: 2.4433 - regression_loss: 1.9522 - classification_loss: 0.4911 381/500 [=====================>........] - ETA: 29s - loss: 2.4436 - regression_loss: 1.9527 - classification_loss: 0.4909 382/500 [=====================>........] - ETA: 29s - loss: 2.4404 - regression_loss: 1.9503 - classification_loss: 0.4901 383/500 [=====================>........] - ETA: 29s - loss: 2.4401 - regression_loss: 1.9502 - classification_loss: 0.4899 384/500 [======================>.......] - ETA: 29s - loss: 2.4395 - regression_loss: 1.9499 - classification_loss: 0.4897 385/500 [======================>.......] - ETA: 28s - loss: 2.4387 - regression_loss: 1.9494 - classification_loss: 0.4892 386/500 [======================>.......] - ETA: 28s - loss: 2.4384 - regression_loss: 1.9494 - classification_loss: 0.4890 387/500 [======================>.......] - ETA: 28s - loss: 2.4415 - regression_loss: 1.9494 - classification_loss: 0.4921 388/500 [======================>.......] 
- ETA: 28s - loss: 2.4411 - regression_loss: 1.9492 - classification_loss: 0.4919 389/500 [======================>.......] - ETA: 27s - loss: 2.4406 - regression_loss: 1.9489 - classification_loss: 0.4917 390/500 [======================>.......] - ETA: 27s - loss: 2.4407 - regression_loss: 1.9490 - classification_loss: 0.4916 391/500 [======================>.......] - ETA: 27s - loss: 2.4414 - regression_loss: 1.9496 - classification_loss: 0.4918 392/500 [======================>.......] - ETA: 27s - loss: 2.4421 - regression_loss: 1.9503 - classification_loss: 0.4918 393/500 [======================>.......] - ETA: 26s - loss: 2.4421 - regression_loss: 1.9503 - classification_loss: 0.4918 394/500 [======================>.......] - ETA: 26s - loss: 2.4422 - regression_loss: 1.9505 - classification_loss: 0.4917 395/500 [======================>.......] - ETA: 26s - loss: 2.4428 - regression_loss: 1.9507 - classification_loss: 0.4921 396/500 [======================>.......] - ETA: 26s - loss: 2.4429 - regression_loss: 1.9494 - classification_loss: 0.4935 397/500 [======================>.......] - ETA: 25s - loss: 2.4437 - regression_loss: 1.9500 - classification_loss: 0.4938 398/500 [======================>.......] - ETA: 25s - loss: 2.4435 - regression_loss: 1.9501 - classification_loss: 0.4934 399/500 [======================>.......] - ETA: 25s - loss: 2.4439 - regression_loss: 1.9506 - classification_loss: 0.4933 400/500 [=======================>......] - ETA: 25s - loss: 2.4428 - regression_loss: 1.9499 - classification_loss: 0.4929 401/500 [=======================>......] - ETA: 24s - loss: 2.4425 - regression_loss: 1.9498 - classification_loss: 0.4928 402/500 [=======================>......] - ETA: 24s - loss: 2.4421 - regression_loss: 1.9495 - classification_loss: 0.4926 403/500 [=======================>......] - ETA: 24s - loss: 2.4417 - regression_loss: 1.9496 - classification_loss: 0.4920 404/500 [=======================>......] 
- ETA: 24s - loss: 2.4416 - regression_loss: 1.9498 - classification_loss: 0.4918 405/500 [=======================>......] - ETA: 23s - loss: 2.4402 - regression_loss: 1.9488 - classification_loss: 0.4914 406/500 [=======================>......] - ETA: 23s - loss: 2.4404 - regression_loss: 1.9485 - classification_loss: 0.4919 407/500 [=======================>......] - ETA: 23s - loss: 2.4396 - regression_loss: 1.9481 - classification_loss: 0.4915 408/500 [=======================>......] - ETA: 23s - loss: 2.4407 - regression_loss: 1.9490 - classification_loss: 0.4917 409/500 [=======================>......] - ETA: 22s - loss: 2.4418 - regression_loss: 1.9498 - classification_loss: 0.4920 410/500 [=======================>......] - ETA: 22s - loss: 2.4414 - regression_loss: 1.9495 - classification_loss: 0.4919 411/500 [=======================>......] - ETA: 22s - loss: 2.4424 - regression_loss: 1.9503 - classification_loss: 0.4921 412/500 [=======================>......] - ETA: 22s - loss: 2.4413 - regression_loss: 1.9494 - classification_loss: 0.4919 413/500 [=======================>......] - ETA: 21s - loss: 2.4404 - regression_loss: 1.9486 - classification_loss: 0.4918 414/500 [=======================>......] - ETA: 21s - loss: 2.4415 - regression_loss: 1.9496 - classification_loss: 0.4919 415/500 [=======================>......] - ETA: 21s - loss: 2.4439 - regression_loss: 1.9512 - classification_loss: 0.4927 416/500 [=======================>......] - ETA: 21s - loss: 2.4438 - regression_loss: 1.9514 - classification_loss: 0.4924 417/500 [========================>.....] - ETA: 20s - loss: 2.4460 - regression_loss: 1.9530 - classification_loss: 0.4929 418/500 [========================>.....] - ETA: 20s - loss: 2.4469 - regression_loss: 1.9538 - classification_loss: 0.4931 419/500 [========================>.....] - ETA: 20s - loss: 2.4463 - regression_loss: 1.9535 - classification_loss: 0.4928 420/500 [========================>.....] 
- ETA: 20s - loss: 2.4470 - regression_loss: 1.9542 - classification_loss: 0.4927 421/500 [========================>.....] - ETA: 19s - loss: 2.4456 - regression_loss: 1.9533 - classification_loss: 0.4923 422/500 [========================>.....] - ETA: 19s - loss: 2.4454 - regression_loss: 1.9534 - classification_loss: 0.4920 423/500 [========================>.....] - ETA: 19s - loss: 2.4454 - regression_loss: 1.9536 - classification_loss: 0.4918 424/500 [========================>.....] - ETA: 18s - loss: 2.4472 - regression_loss: 1.9551 - classification_loss: 0.4921 425/500 [========================>.....] - ETA: 18s - loss: 2.4471 - regression_loss: 1.9552 - classification_loss: 0.4919 426/500 [========================>.....] - ETA: 18s - loss: 2.4471 - regression_loss: 1.9553 - classification_loss: 0.4918 427/500 [========================>.....] - ETA: 18s - loss: 2.4468 - regression_loss: 1.9553 - classification_loss: 0.4915 428/500 [========================>.....] - ETA: 17s - loss: 2.4451 - regression_loss: 1.9541 - classification_loss: 0.4910 429/500 [========================>.....] - ETA: 17s - loss: 2.4453 - regression_loss: 1.9541 - classification_loss: 0.4911 430/500 [========================>.....] - ETA: 17s - loss: 2.4458 - regression_loss: 1.9547 - classification_loss: 0.4911 431/500 [========================>.....] - ETA: 17s - loss: 2.4464 - regression_loss: 1.9554 - classification_loss: 0.4911 432/500 [========================>.....] - ETA: 16s - loss: 2.4473 - regression_loss: 1.9561 - classification_loss: 0.4912 433/500 [========================>.....] - ETA: 16s - loss: 2.4471 - regression_loss: 1.9561 - classification_loss: 0.4910 434/500 [=========================>....] - ETA: 16s - loss: 2.4471 - regression_loss: 1.9562 - classification_loss: 0.4909 435/500 [=========================>....] - ETA: 16s - loss: 2.4477 - regression_loss: 1.9567 - classification_loss: 0.4910 436/500 [=========================>....] 
- ETA: 15s - loss: 2.4476 - regression_loss: 1.9566 - classification_loss: 0.4910 437/500 [=========================>....] - ETA: 15s - loss: 2.4462 - regression_loss: 1.9554 - classification_loss: 0.4908 438/500 [=========================>....] - ETA: 15s - loss: 2.4437 - regression_loss: 1.9533 - classification_loss: 0.4904 439/500 [=========================>....] - ETA: 15s - loss: 2.4436 - regression_loss: 1.9535 - classification_loss: 0.4901 440/500 [=========================>....] - ETA: 15s - loss: 2.4447 - regression_loss: 1.9549 - classification_loss: 0.4898 441/500 [=========================>....] - ETA: 14s - loss: 2.4448 - regression_loss: 1.9551 - classification_loss: 0.4897 442/500 [=========================>....] - ETA: 14s - loss: 2.4448 - regression_loss: 1.9554 - classification_loss: 0.4895 443/500 [=========================>....] - ETA: 14s - loss: 2.4444 - regression_loss: 1.9552 - classification_loss: 0.4892 444/500 [=========================>....] - ETA: 14s - loss: 2.4418 - regression_loss: 1.9530 - classification_loss: 0.4888 445/500 [=========================>....] - ETA: 13s - loss: 2.4411 - regression_loss: 1.9526 - classification_loss: 0.4884 446/500 [=========================>....] - ETA: 13s - loss: 2.4420 - regression_loss: 1.9531 - classification_loss: 0.4889 447/500 [=========================>....] - ETA: 13s - loss: 2.4418 - regression_loss: 1.9530 - classification_loss: 0.4887 448/500 [=========================>....] - ETA: 13s - loss: 2.4422 - regression_loss: 1.9535 - classification_loss: 0.4887 449/500 [=========================>....] - ETA: 12s - loss: 2.4406 - regression_loss: 1.9523 - classification_loss: 0.4883 450/500 [==========================>...] - ETA: 12s - loss: 2.4412 - regression_loss: 1.9527 - classification_loss: 0.4886 451/500 [==========================>...] - ETA: 12s - loss: 2.4410 - regression_loss: 1.9524 - classification_loss: 0.4885 452/500 [==========================>...] 
- ETA: 12s - loss: 2.4402 - regression_loss: 1.9519 - classification_loss: 0.4882 453/500 [==========================>...] - ETA: 11s - loss: 2.4393 - regression_loss: 1.9510 - classification_loss: 0.4882 454/500 [==========================>...] - ETA: 11s - loss: 2.4398 - regression_loss: 1.9514 - classification_loss: 0.4884 455/500 [==========================>...] - ETA: 11s - loss: 2.4380 - regression_loss: 1.9497 - classification_loss: 0.4883 456/500 [==========================>...] - ETA: 10s - loss: 2.4356 - regression_loss: 1.9478 - classification_loss: 0.4878 457/500 [==========================>...] - ETA: 10s - loss: 2.4370 - regression_loss: 1.9487 - classification_loss: 0.4883 458/500 [==========================>...] - ETA: 10s - loss: 2.4358 - regression_loss: 1.9477 - classification_loss: 0.4881 459/500 [==========================>...] - ETA: 10s - loss: 2.4362 - regression_loss: 1.9482 - classification_loss: 0.4880 460/500 [==========================>...] - ETA: 9s - loss: 2.4361 - regression_loss: 1.9484 - classification_loss: 0.4878  461/500 [==========================>...] - ETA: 9s - loss: 2.4359 - regression_loss: 1.9484 - classification_loss: 0.4875 462/500 [==========================>...] - ETA: 9s - loss: 2.4369 - regression_loss: 1.9491 - classification_loss: 0.4877 463/500 [==========================>...] - ETA: 9s - loss: 2.4372 - regression_loss: 1.9495 - classification_loss: 0.4877 464/500 [==========================>...] - ETA: 8s - loss: 2.4365 - regression_loss: 1.9492 - classification_loss: 0.4873 465/500 [==========================>...] - ETA: 8s - loss: 2.4365 - regression_loss: 1.9493 - classification_loss: 0.4872 466/500 [==========================>...] - ETA: 8s - loss: 2.4364 - regression_loss: 1.9494 - classification_loss: 0.4870 467/500 [===========================>..] - ETA: 8s - loss: 2.4361 - regression_loss: 1.9492 - classification_loss: 0.4869 468/500 [===========================>..] 
- ETA: 7s - loss: 2.4383 - regression_loss: 1.9495 - classification_loss: 0.4888 469/500 [===========================>..] - ETA: 7s - loss: 2.4376 - regression_loss: 1.9491 - classification_loss: 0.4885 470/500 [===========================>..] - ETA: 7s - loss: 2.4366 - regression_loss: 1.9484 - classification_loss: 0.4882 471/500 [===========================>..] - ETA: 7s - loss: 2.4365 - regression_loss: 1.9485 - classification_loss: 0.4880 472/500 [===========================>..] - ETA: 6s - loss: 2.4372 - regression_loss: 1.9491 - classification_loss: 0.4881 473/500 [===========================>..] - ETA: 6s - loss: 2.4372 - regression_loss: 1.9492 - classification_loss: 0.4881 474/500 [===========================>..] - ETA: 6s - loss: 2.4371 - regression_loss: 1.9492 - classification_loss: 0.4879 475/500 [===========================>..] - ETA: 6s - loss: 2.4370 - regression_loss: 1.9489 - classification_loss: 0.4881 476/500 [===========================>..] - ETA: 5s - loss: 2.4387 - regression_loss: 1.9505 - classification_loss: 0.4883 477/500 [===========================>..] - ETA: 5s - loss: 2.4392 - regression_loss: 1.9509 - classification_loss: 0.4883 478/500 [===========================>..] - ETA: 5s - loss: 2.4392 - regression_loss: 1.9510 - classification_loss: 0.4882 479/500 [===========================>..] - ETA: 5s - loss: 2.4388 - regression_loss: 1.9508 - classification_loss: 0.4880 480/500 [===========================>..] - ETA: 4s - loss: 2.4380 - regression_loss: 1.9503 - classification_loss: 0.4878 481/500 [===========================>..] - ETA: 4s - loss: 2.4382 - regression_loss: 1.9505 - classification_loss: 0.4877 482/500 [===========================>..] - ETA: 4s - loss: 2.4384 - regression_loss: 1.9508 - classification_loss: 0.4877 483/500 [===========================>..] - ETA: 4s - loss: 2.4396 - regression_loss: 1.9516 - classification_loss: 0.4881 484/500 [============================>.] 
- ETA: 3s - loss: 2.4404 - regression_loss: 1.9523 - classification_loss: 0.4881 485/500 [============================>.] - ETA: 3s - loss: 2.4405 - regression_loss: 1.9525 - classification_loss: 0.4880 486/500 [============================>.] - ETA: 3s - loss: 2.4414 - regression_loss: 1.9528 - classification_loss: 0.4886 487/500 [============================>.] - ETA: 3s - loss: 2.4422 - regression_loss: 1.9533 - classification_loss: 0.4888 488/500 [============================>.] - ETA: 2s - loss: 2.4421 - regression_loss: 1.9534 - classification_loss: 0.4887 489/500 [============================>.] - ETA: 2s - loss: 2.4420 - regression_loss: 1.9535 - classification_loss: 0.4885 490/500 [============================>.] - ETA: 2s - loss: 2.4434 - regression_loss: 1.9548 - classification_loss: 0.4886 491/500 [============================>.] - ETA: 2s - loss: 2.4427 - regression_loss: 1.9543 - classification_loss: 0.4884 492/500 [============================>.] - ETA: 1s - loss: 2.4419 - regression_loss: 1.9538 - classification_loss: 0.4881 493/500 [============================>.] - ETA: 1s - loss: 2.4420 - regression_loss: 1.9540 - classification_loss: 0.4880 494/500 [============================>.] - ETA: 1s - loss: 2.4415 - regression_loss: 1.9536 - classification_loss: 0.4878 495/500 [============================>.] - ETA: 1s - loss: 2.4408 - regression_loss: 1.9532 - classification_loss: 0.4876 496/500 [============================>.] - ETA: 0s - loss: 2.4403 - regression_loss: 1.9528 - classification_loss: 0.4875 497/500 [============================>.] - ETA: 0s - loss: 2.4415 - regression_loss: 1.9538 - classification_loss: 0.4877 498/500 [============================>.] - ETA: 0s - loss: 2.4410 - regression_loss: 1.9534 - classification_loss: 0.4876 499/500 [============================>.] 
- ETA: 0s - loss: 2.4403 - regression_loss: 1.9530 - classification_loss: 0.4873
500/500 [==============================] - 125s 250ms/step - loss: 2.4419 - regression_loss: 1.9541 - classification_loss: 0.4878
1172 instances of class plum with average precision: 0.3568
mAP: 0.3568
Epoch 00015: saving model to ./training/snapshots/resnet50_pascal_15.h5
Epoch 16/150
  1/500 [..............................] - ETA: 2:01 - loss: 2.3681 - regression_loss: 1.9303 - classification_loss: 0.4377
[... per-step progress lines for steps 2-332 elided; over this stretch the running loss settled around 2.47-2.49, with regression_loss ~1.97-1.98 and classification_loss ~0.49-0.51 ...]
333/500 [==================>...........] - ETA: 41s - loss: 2.4766 - regression_loss: 1.9832 - classification_loss: 0.4934
334/500 [===================>..........]
- ETA: 41s - loss: 2.4771 - regression_loss: 1.9838 - classification_loss: 0.4933 335/500 [===================>..........] - ETA: 41s - loss: 2.4770 - regression_loss: 1.9836 - classification_loss: 0.4934 336/500 [===================>..........] - ETA: 41s - loss: 2.4761 - regression_loss: 1.9830 - classification_loss: 0.4931 337/500 [===================>..........] - ETA: 40s - loss: 2.4754 - regression_loss: 1.9825 - classification_loss: 0.4930 338/500 [===================>..........] - ETA: 40s - loss: 2.4750 - regression_loss: 1.9821 - classification_loss: 0.4929 339/500 [===================>..........] - ETA: 40s - loss: 2.4745 - regression_loss: 1.9820 - classification_loss: 0.4925 340/500 [===================>..........] - ETA: 40s - loss: 2.4738 - regression_loss: 1.9817 - classification_loss: 0.4922 341/500 [===================>..........] - ETA: 39s - loss: 2.4749 - regression_loss: 1.9823 - classification_loss: 0.4926 342/500 [===================>..........] - ETA: 39s - loss: 2.4708 - regression_loss: 1.9789 - classification_loss: 0.4920 343/500 [===================>..........] - ETA: 39s - loss: 2.4711 - regression_loss: 1.9792 - classification_loss: 0.4919 344/500 [===================>..........] - ETA: 39s - loss: 2.4709 - regression_loss: 1.9790 - classification_loss: 0.4918 345/500 [===================>..........] - ETA: 38s - loss: 2.4695 - regression_loss: 1.9779 - classification_loss: 0.4916 346/500 [===================>..........] - ETA: 38s - loss: 2.4711 - regression_loss: 1.9786 - classification_loss: 0.4925 347/500 [===================>..........] - ETA: 38s - loss: 2.4707 - regression_loss: 1.9784 - classification_loss: 0.4924 348/500 [===================>..........] - ETA: 38s - loss: 2.4690 - regression_loss: 1.9769 - classification_loss: 0.4920 349/500 [===================>..........] - ETA: 37s - loss: 2.4701 - regression_loss: 1.9781 - classification_loss: 0.4920 350/500 [====================>.........] 
- ETA: 37s - loss: 2.4674 - regression_loss: 1.9760 - classification_loss: 0.4915 351/500 [====================>.........] - ETA: 37s - loss: 2.4691 - regression_loss: 1.9776 - classification_loss: 0.4915 352/500 [====================>.........] - ETA: 37s - loss: 2.4703 - regression_loss: 1.9791 - classification_loss: 0.4912 353/500 [====================>.........] - ETA: 36s - loss: 2.4690 - regression_loss: 1.9781 - classification_loss: 0.4909 354/500 [====================>.........] - ETA: 36s - loss: 2.4688 - regression_loss: 1.9780 - classification_loss: 0.4908 355/500 [====================>.........] - ETA: 36s - loss: 2.4689 - regression_loss: 1.9781 - classification_loss: 0.4909 356/500 [====================>.........] - ETA: 36s - loss: 2.4694 - regression_loss: 1.9785 - classification_loss: 0.4908 357/500 [====================>.........] - ETA: 35s - loss: 2.4685 - regression_loss: 1.9780 - classification_loss: 0.4905 358/500 [====================>.........] - ETA: 35s - loss: 2.4694 - regression_loss: 1.9784 - classification_loss: 0.4910 359/500 [====================>.........] - ETA: 35s - loss: 2.4671 - regression_loss: 1.9768 - classification_loss: 0.4903 360/500 [====================>.........] - ETA: 35s - loss: 2.4660 - regression_loss: 1.9759 - classification_loss: 0.4901 361/500 [====================>.........] - ETA: 34s - loss: 2.4651 - regression_loss: 1.9753 - classification_loss: 0.4898 362/500 [====================>.........] - ETA: 34s - loss: 2.4655 - regression_loss: 1.9757 - classification_loss: 0.4898 363/500 [====================>.........] - ETA: 34s - loss: 2.4642 - regression_loss: 1.9747 - classification_loss: 0.4895 364/500 [====================>.........] - ETA: 34s - loss: 2.4620 - regression_loss: 1.9731 - classification_loss: 0.4889 365/500 [====================>.........] - ETA: 33s - loss: 2.4621 - regression_loss: 1.9732 - classification_loss: 0.4889 366/500 [====================>.........] 
- ETA: 33s - loss: 2.4621 - regression_loss: 1.9714 - classification_loss: 0.4907 367/500 [=====================>........] - ETA: 33s - loss: 2.4623 - regression_loss: 1.9718 - classification_loss: 0.4905 368/500 [=====================>........] - ETA: 33s - loss: 2.4620 - regression_loss: 1.9717 - classification_loss: 0.4903 369/500 [=====================>........] - ETA: 32s - loss: 2.4603 - regression_loss: 1.9703 - classification_loss: 0.4899 370/500 [=====================>........] - ETA: 32s - loss: 2.4593 - regression_loss: 1.9697 - classification_loss: 0.4897 371/500 [=====================>........] - ETA: 32s - loss: 2.4598 - regression_loss: 1.9704 - classification_loss: 0.4894 372/500 [=====================>........] - ETA: 32s - loss: 2.4594 - regression_loss: 1.9700 - classification_loss: 0.4894 373/500 [=====================>........] - ETA: 31s - loss: 2.4577 - regression_loss: 1.9688 - classification_loss: 0.4890 374/500 [=====================>........] - ETA: 31s - loss: 2.4575 - regression_loss: 1.9689 - classification_loss: 0.4886 375/500 [=====================>........] - ETA: 31s - loss: 2.4583 - regression_loss: 1.9693 - classification_loss: 0.4890 376/500 [=====================>........] - ETA: 31s - loss: 2.4584 - regression_loss: 1.9695 - classification_loss: 0.4889 377/500 [=====================>........] - ETA: 30s - loss: 2.4633 - regression_loss: 1.9713 - classification_loss: 0.4920 378/500 [=====================>........] - ETA: 30s - loss: 2.4639 - regression_loss: 1.9718 - classification_loss: 0.4922 379/500 [=====================>........] - ETA: 30s - loss: 2.4659 - regression_loss: 1.9735 - classification_loss: 0.4924 380/500 [=====================>........] - ETA: 30s - loss: 2.4652 - regression_loss: 1.9731 - classification_loss: 0.4921 381/500 [=====================>........] - ETA: 29s - loss: 2.4661 - regression_loss: 1.9737 - classification_loss: 0.4924 382/500 [=====================>........] 
- ETA: 29s - loss: 2.4651 - regression_loss: 1.9728 - classification_loss: 0.4923 383/500 [=====================>........] - ETA: 29s - loss: 2.4651 - regression_loss: 1.9728 - classification_loss: 0.4923 384/500 [======================>.......] - ETA: 29s - loss: 2.4635 - regression_loss: 1.9716 - classification_loss: 0.4919 385/500 [======================>.......] - ETA: 28s - loss: 2.4648 - regression_loss: 1.9728 - classification_loss: 0.4921 386/500 [======================>.......] - ETA: 28s - loss: 2.4652 - regression_loss: 1.9733 - classification_loss: 0.4919 387/500 [======================>.......] - ETA: 28s - loss: 2.4662 - regression_loss: 1.9740 - classification_loss: 0.4922 388/500 [======================>.......] - ETA: 28s - loss: 2.4665 - regression_loss: 1.9744 - classification_loss: 0.4921 389/500 [======================>.......] - ETA: 27s - loss: 2.4665 - regression_loss: 1.9745 - classification_loss: 0.4920 390/500 [======================>.......] - ETA: 27s - loss: 2.4668 - regression_loss: 1.9743 - classification_loss: 0.4925 391/500 [======================>.......] - ETA: 27s - loss: 2.4666 - regression_loss: 1.9743 - classification_loss: 0.4924 392/500 [======================>.......] - ETA: 27s - loss: 2.4656 - regression_loss: 1.9736 - classification_loss: 0.4921 393/500 [======================>.......] - ETA: 26s - loss: 2.4688 - regression_loss: 1.9724 - classification_loss: 0.4964 394/500 [======================>.......] - ETA: 26s - loss: 2.4690 - regression_loss: 1.9724 - classification_loss: 0.4965 395/500 [======================>.......] - ETA: 26s - loss: 2.4688 - regression_loss: 1.9726 - classification_loss: 0.4962 396/500 [======================>.......] - ETA: 26s - loss: 2.4688 - regression_loss: 1.9727 - classification_loss: 0.4961 397/500 [======================>.......] - ETA: 25s - loss: 2.4686 - regression_loss: 1.9725 - classification_loss: 0.4961 398/500 [======================>.......] 
- ETA: 25s - loss: 2.4680 - regression_loss: 1.9723 - classification_loss: 0.4957 399/500 [======================>.......] - ETA: 25s - loss: 2.4678 - regression_loss: 1.9720 - classification_loss: 0.4957 400/500 [=======================>......] - ETA: 25s - loss: 2.4680 - regression_loss: 1.9724 - classification_loss: 0.4955 401/500 [=======================>......] - ETA: 24s - loss: 2.4671 - regression_loss: 1.9718 - classification_loss: 0.4953 402/500 [=======================>......] - ETA: 24s - loss: 2.4666 - regression_loss: 1.9716 - classification_loss: 0.4951 403/500 [=======================>......] - ETA: 24s - loss: 2.4658 - regression_loss: 1.9709 - classification_loss: 0.4949 404/500 [=======================>......] - ETA: 24s - loss: 2.4652 - regression_loss: 1.9698 - classification_loss: 0.4954 405/500 [=======================>......] - ETA: 23s - loss: 2.4654 - regression_loss: 1.9700 - classification_loss: 0.4954 406/500 [=======================>......] - ETA: 23s - loss: 2.4669 - regression_loss: 1.9711 - classification_loss: 0.4957 407/500 [=======================>......] - ETA: 23s - loss: 2.4663 - regression_loss: 1.9706 - classification_loss: 0.4956 408/500 [=======================>......] - ETA: 23s - loss: 2.4663 - regression_loss: 1.9706 - classification_loss: 0.4957 409/500 [=======================>......] - ETA: 22s - loss: 2.4665 - regression_loss: 1.9709 - classification_loss: 0.4956 410/500 [=======================>......] - ETA: 22s - loss: 2.4672 - regression_loss: 1.9716 - classification_loss: 0.4956 411/500 [=======================>......] - ETA: 22s - loss: 2.4682 - regression_loss: 1.9722 - classification_loss: 0.4959 412/500 [=======================>......] - ETA: 22s - loss: 2.4687 - regression_loss: 1.9728 - classification_loss: 0.4959 413/500 [=======================>......] - ETA: 21s - loss: 2.4675 - regression_loss: 1.9719 - classification_loss: 0.4956 414/500 [=======================>......] 
- ETA: 21s - loss: 2.4668 - regression_loss: 1.9714 - classification_loss: 0.4954 415/500 [=======================>......] - ETA: 21s - loss: 2.4677 - regression_loss: 1.9721 - classification_loss: 0.4956 416/500 [=======================>......] - ETA: 21s - loss: 2.4690 - regression_loss: 1.9732 - classification_loss: 0.4958 417/500 [========================>.....] - ETA: 20s - loss: 2.4687 - regression_loss: 1.9731 - classification_loss: 0.4956 418/500 [========================>.....] - ETA: 20s - loss: 2.4690 - regression_loss: 1.9735 - classification_loss: 0.4955 419/500 [========================>.....] - ETA: 20s - loss: 2.4689 - regression_loss: 1.9733 - classification_loss: 0.4956 420/500 [========================>.....] - ETA: 20s - loss: 2.4684 - regression_loss: 1.9731 - classification_loss: 0.4953 421/500 [========================>.....] - ETA: 19s - loss: 2.4669 - regression_loss: 1.9722 - classification_loss: 0.4947 422/500 [========================>.....] - ETA: 19s - loss: 2.4657 - regression_loss: 1.9712 - classification_loss: 0.4945 423/500 [========================>.....] - ETA: 19s - loss: 2.4654 - regression_loss: 1.9709 - classification_loss: 0.4945 424/500 [========================>.....] - ETA: 19s - loss: 2.4658 - regression_loss: 1.9714 - classification_loss: 0.4944 425/500 [========================>.....] - ETA: 18s - loss: 2.4659 - regression_loss: 1.9715 - classification_loss: 0.4944 426/500 [========================>.....] - ETA: 18s - loss: 2.4662 - regression_loss: 1.9720 - classification_loss: 0.4943 427/500 [========================>.....] - ETA: 18s - loss: 2.4673 - regression_loss: 1.9732 - classification_loss: 0.4941 428/500 [========================>.....] - ETA: 18s - loss: 2.4667 - regression_loss: 1.9729 - classification_loss: 0.4938 429/500 [========================>.....] - ETA: 17s - loss: 2.4676 - regression_loss: 1.9734 - classification_loss: 0.4942 430/500 [========================>.....] 
- ETA: 17s - loss: 2.4672 - regression_loss: 1.9732 - classification_loss: 0.4940 431/500 [========================>.....] - ETA: 17s - loss: 2.4666 - regression_loss: 1.9728 - classification_loss: 0.4939 432/500 [========================>.....] - ETA: 17s - loss: 2.4661 - regression_loss: 1.9724 - classification_loss: 0.4937 433/500 [========================>.....] - ETA: 16s - loss: 2.4674 - regression_loss: 1.9735 - classification_loss: 0.4939 434/500 [=========================>....] - ETA: 16s - loss: 2.4669 - regression_loss: 1.9732 - classification_loss: 0.4936 435/500 [=========================>....] - ETA: 16s - loss: 2.4664 - regression_loss: 1.9729 - classification_loss: 0.4935 436/500 [=========================>....] - ETA: 16s - loss: 2.4662 - regression_loss: 1.9727 - classification_loss: 0.4935 437/500 [=========================>....] - ETA: 15s - loss: 2.4658 - regression_loss: 1.9726 - classification_loss: 0.4931 438/500 [=========================>....] - ETA: 15s - loss: 2.4657 - regression_loss: 1.9728 - classification_loss: 0.4929 439/500 [=========================>....] - ETA: 15s - loss: 2.4652 - regression_loss: 1.9726 - classification_loss: 0.4927 440/500 [=========================>....] - ETA: 15s - loss: 2.4655 - regression_loss: 1.9725 - classification_loss: 0.4931 441/500 [=========================>....] - ETA: 14s - loss: 2.4668 - regression_loss: 1.9733 - classification_loss: 0.4935 442/500 [=========================>....] - ETA: 14s - loss: 2.4662 - regression_loss: 1.9728 - classification_loss: 0.4934 443/500 [=========================>....] - ETA: 14s - loss: 2.4661 - regression_loss: 1.9728 - classification_loss: 0.4933 444/500 [=========================>....] - ETA: 14s - loss: 2.4655 - regression_loss: 1.9723 - classification_loss: 0.4931 445/500 [=========================>....] - ETA: 13s - loss: 2.4652 - regression_loss: 1.9721 - classification_loss: 0.4931 446/500 [=========================>....] 
- ETA: 13s - loss: 2.4657 - regression_loss: 1.9721 - classification_loss: 0.4936 447/500 [=========================>....] - ETA: 13s - loss: 2.4635 - regression_loss: 1.9705 - classification_loss: 0.4930 448/500 [=========================>....] - ETA: 13s - loss: 2.4642 - regression_loss: 1.9710 - classification_loss: 0.4932 449/500 [=========================>....] - ETA: 12s - loss: 2.4647 - regression_loss: 1.9716 - classification_loss: 0.4931 450/500 [==========================>...] - ETA: 12s - loss: 2.4640 - regression_loss: 1.9712 - classification_loss: 0.4928 451/500 [==========================>...] - ETA: 12s - loss: 2.4633 - regression_loss: 1.9709 - classification_loss: 0.4924 452/500 [==========================>...] - ETA: 12s - loss: 2.4619 - regression_loss: 1.9700 - classification_loss: 0.4919 453/500 [==========================>...] - ETA: 11s - loss: 2.4618 - regression_loss: 1.9699 - classification_loss: 0.4919 454/500 [==========================>...] - ETA: 11s - loss: 2.4637 - regression_loss: 1.9713 - classification_loss: 0.4923 455/500 [==========================>...] - ETA: 11s - loss: 2.4633 - regression_loss: 1.9702 - classification_loss: 0.4931 456/500 [==========================>...] - ETA: 11s - loss: 2.4629 - regression_loss: 1.9701 - classification_loss: 0.4928 457/500 [==========================>...] - ETA: 10s - loss: 2.4620 - regression_loss: 1.9695 - classification_loss: 0.4925 458/500 [==========================>...] - ETA: 10s - loss: 2.4620 - regression_loss: 1.9697 - classification_loss: 0.4924 459/500 [==========================>...] - ETA: 10s - loss: 2.4597 - regression_loss: 1.9678 - classification_loss: 0.4920 460/500 [==========================>...] - ETA: 10s - loss: 2.4610 - regression_loss: 1.9686 - classification_loss: 0.4924 461/500 [==========================>...] - ETA: 9s - loss: 2.4595 - regression_loss: 1.9673 - classification_loss: 0.4923  462/500 [==========================>...] 
- ETA: 9s - loss: 2.4597 - regression_loss: 1.9676 - classification_loss: 0.4921 463/500 [==========================>...] - ETA: 9s - loss: 2.4599 - regression_loss: 1.9677 - classification_loss: 0.4922 464/500 [==========================>...] - ETA: 9s - loss: 2.4598 - regression_loss: 1.9677 - classification_loss: 0.4921 465/500 [==========================>...] - ETA: 8s - loss: 2.4618 - regression_loss: 1.9693 - classification_loss: 0.4926 466/500 [==========================>...] - ETA: 8s - loss: 2.4614 - regression_loss: 1.9689 - classification_loss: 0.4925 467/500 [===========================>..] - ETA: 8s - loss: 2.4597 - regression_loss: 1.9676 - classification_loss: 0.4921 468/500 [===========================>..] - ETA: 8s - loss: 2.4597 - regression_loss: 1.9677 - classification_loss: 0.4919 469/500 [===========================>..] - ETA: 7s - loss: 2.4585 - regression_loss: 1.9668 - classification_loss: 0.4916 470/500 [===========================>..] - ETA: 7s - loss: 2.4593 - regression_loss: 1.9676 - classification_loss: 0.4917 471/500 [===========================>..] - ETA: 7s - loss: 2.4597 - regression_loss: 1.9680 - classification_loss: 0.4917 472/500 [===========================>..] - ETA: 7s - loss: 2.4596 - regression_loss: 1.9680 - classification_loss: 0.4916 473/500 [===========================>..] - ETA: 6s - loss: 2.4600 - regression_loss: 1.9681 - classification_loss: 0.4919 474/500 [===========================>..] - ETA: 6s - loss: 2.4609 - regression_loss: 1.9688 - classification_loss: 0.4921 475/500 [===========================>..] - ETA: 6s - loss: 2.4611 - regression_loss: 1.9689 - classification_loss: 0.4922 476/500 [===========================>..] - ETA: 6s - loss: 2.4589 - regression_loss: 1.9672 - classification_loss: 0.4917 477/500 [===========================>..] - ETA: 5s - loss: 2.4586 - regression_loss: 1.9669 - classification_loss: 0.4917 478/500 [===========================>..] 
- ETA: 5s - loss: 2.4574 - regression_loss: 1.9659 - classification_loss: 0.4915 479/500 [===========================>..] - ETA: 5s - loss: 2.4548 - regression_loss: 1.9637 - classification_loss: 0.4911 480/500 [===========================>..] - ETA: 5s - loss: 2.4549 - regression_loss: 1.9639 - classification_loss: 0.4910 481/500 [===========================>..] - ETA: 4s - loss: 2.4549 - regression_loss: 1.9639 - classification_loss: 0.4910 482/500 [===========================>..] - ETA: 4s - loss: 2.4546 - regression_loss: 1.9638 - classification_loss: 0.4908 483/500 [===========================>..] - ETA: 4s - loss: 2.4554 - regression_loss: 1.9641 - classification_loss: 0.4913 484/500 [============================>.] - ETA: 4s - loss: 2.4543 - regression_loss: 1.9633 - classification_loss: 0.4910 485/500 [============================>.] - ETA: 3s - loss: 2.4523 - regression_loss: 1.9617 - classification_loss: 0.4906 486/500 [============================>.] - ETA: 3s - loss: 2.4523 - regression_loss: 1.9617 - classification_loss: 0.4905 487/500 [============================>.] - ETA: 3s - loss: 2.4531 - regression_loss: 1.9626 - classification_loss: 0.4905 488/500 [============================>.] - ETA: 3s - loss: 2.4528 - regression_loss: 1.9623 - classification_loss: 0.4905 489/500 [============================>.] - ETA: 2s - loss: 2.4514 - regression_loss: 1.9616 - classification_loss: 0.4899 490/500 [============================>.] - ETA: 2s - loss: 2.4524 - regression_loss: 1.9625 - classification_loss: 0.4898 491/500 [============================>.] - ETA: 2s - loss: 2.4516 - regression_loss: 1.9621 - classification_loss: 0.4895 492/500 [============================>.] - ETA: 2s - loss: 2.4523 - regression_loss: 1.9625 - classification_loss: 0.4897 493/500 [============================>.] - ETA: 1s - loss: 2.4515 - regression_loss: 1.9621 - classification_loss: 0.4895 494/500 [============================>.] 
- ETA: 1s - loss: 2.4524 - regression_loss: 1.9629 - classification_loss: 0.4895 495/500 [============================>.] - ETA: 1s - loss: 2.4524 - regression_loss: 1.9628 - classification_loss: 0.4896 496/500 [============================>.] - ETA: 1s - loss: 2.4528 - regression_loss: 1.9630 - classification_loss: 0.4898 497/500 [============================>.] - ETA: 0s - loss: 2.4521 - regression_loss: 1.9625 - classification_loss: 0.4896 498/500 [============================>.] - ETA: 0s - loss: 2.4534 - regression_loss: 1.9637 - classification_loss: 0.4897 499/500 [============================>.] - ETA: 0s - loss: 2.4544 - regression_loss: 1.9646 - classification_loss: 0.4898 500/500 [==============================] - 125s 250ms/step - loss: 2.4530 - regression_loss: 1.9637 - classification_loss: 0.4894 1172 instances of class plum with average precision: 0.3412 mAP: 0.3412 Epoch 00016: saving model to ./training/snapshots/resnet50_pascal_16.h5 Epoch 17/150 1/500 [..............................] - ETA: 1:58 - loss: 2.7519 - regression_loss: 2.2211 - classification_loss: 0.5307 2/500 [..............................] - ETA: 2:01 - loss: 2.5301 - regression_loss: 2.0464 - classification_loss: 0.4838 3/500 [..............................] - ETA: 2:01 - loss: 2.4544 - regression_loss: 1.9966 - classification_loss: 0.4578 4/500 [..............................] - ETA: 2:01 - loss: 2.4711 - regression_loss: 1.9875 - classification_loss: 0.4835 5/500 [..............................] - ETA: 2:02 - loss: 2.7690 - regression_loss: 2.1120 - classification_loss: 0.6570 6/500 [..............................] - ETA: 2:03 - loss: 2.6795 - regression_loss: 2.0792 - classification_loss: 0.6003 7/500 [..............................] - ETA: 2:03 - loss: 2.5616 - regression_loss: 1.9945 - classification_loss: 0.5670 8/500 [..............................] - ETA: 2:03 - loss: 2.5687 - regression_loss: 2.0127 - classification_loss: 0.5559 9/500 [..............................] 
- ETA: 2:02 - loss: 2.5587 - regression_loss: 2.0084 - classification_loss: 0.5503 10/500 [..............................] - ETA: 2:02 - loss: 2.6397 - regression_loss: 2.0877 - classification_loss: 0.5519 11/500 [..............................] - ETA: 2:02 - loss: 2.6202 - regression_loss: 2.0832 - classification_loss: 0.5370 12/500 [..............................] - ETA: 2:02 - loss: 2.5336 - regression_loss: 2.0197 - classification_loss: 0.5140 13/500 [..............................] - ETA: 2:02 - loss: 2.5216 - regression_loss: 2.0124 - classification_loss: 0.5092 14/500 [..............................] - ETA: 2:02 - loss: 2.5415 - regression_loss: 2.0261 - classification_loss: 0.5154 15/500 [..............................] - ETA: 2:02 - loss: 2.5174 - regression_loss: 2.0106 - classification_loss: 0.5067 16/500 [..............................] - ETA: 2:01 - loss: 2.5120 - regression_loss: 2.0131 - classification_loss: 0.4990 17/500 [>.............................] - ETA: 2:01 - loss: 2.5264 - regression_loss: 2.0250 - classification_loss: 0.5015 18/500 [>.............................] - ETA: 2:01 - loss: 2.5132 - regression_loss: 2.0189 - classification_loss: 0.4943 19/500 [>.............................] - ETA: 2:01 - loss: 2.4677 - regression_loss: 1.9798 - classification_loss: 0.4879 20/500 [>.............................] - ETA: 2:00 - loss: 2.4754 - regression_loss: 1.9865 - classification_loss: 0.4888 21/500 [>.............................] - ETA: 2:00 - loss: 2.4899 - regression_loss: 1.9967 - classification_loss: 0.4931 22/500 [>.............................] - ETA: 2:00 - loss: 2.4611 - regression_loss: 1.9757 - classification_loss: 0.4854 23/500 [>.............................] - ETA: 2:00 - loss: 2.4539 - regression_loss: 1.9737 - classification_loss: 0.4802 24/500 [>.............................] - ETA: 1:59 - loss: 2.4539 - regression_loss: 1.9759 - classification_loss: 0.4779 25/500 [>.............................] 
- ETA: 1:59 - loss: 2.4497 - regression_loss: 1.9733 - classification_loss: 0.4764 26/500 [>.............................] - ETA: 1:59 - loss: 2.4656 - regression_loss: 1.9871 - classification_loss: 0.4785 27/500 [>.............................] - ETA: 1:59 - loss: 2.4591 - regression_loss: 1.9829 - classification_loss: 0.4762 28/500 [>.............................] - ETA: 1:58 - loss: 2.4823 - regression_loss: 1.9996 - classification_loss: 0.4827 29/500 [>.............................] - ETA: 1:58 - loss: 2.4746 - regression_loss: 1.9893 - classification_loss: 0.4854 30/500 [>.............................] - ETA: 1:58 - loss: 2.4699 - regression_loss: 1.9859 - classification_loss: 0.4840 31/500 [>.............................] - ETA: 1:57 - loss: 2.4728 - regression_loss: 1.9839 - classification_loss: 0.4889 32/500 [>.............................] - ETA: 1:57 - loss: 2.4253 - regression_loss: 1.9457 - classification_loss: 0.4796 33/500 [>.............................] - ETA: 1:57 - loss: 2.4268 - regression_loss: 1.9468 - classification_loss: 0.4799 34/500 [=>............................] - ETA: 1:57 - loss: 2.4332 - regression_loss: 1.9502 - classification_loss: 0.4831 35/500 [=>............................] - ETA: 1:56 - loss: 2.4362 - regression_loss: 1.9536 - classification_loss: 0.4826 36/500 [=>............................] - ETA: 1:56 - loss: 2.4434 - regression_loss: 1.9600 - classification_loss: 0.4834 37/500 [=>............................] - ETA: 1:56 - loss: 2.4522 - regression_loss: 1.9675 - classification_loss: 0.4847 38/500 [=>............................] - ETA: 1:56 - loss: 2.4577 - regression_loss: 1.9707 - classification_loss: 0.4870 39/500 [=>............................] - ETA: 1:55 - loss: 2.4691 - regression_loss: 1.9779 - classification_loss: 0.4912 40/500 [=>............................] - ETA: 1:55 - loss: 2.4768 - regression_loss: 1.9855 - classification_loss: 0.4913 41/500 [=>............................] 
- ETA: 1:55 - loss: 2.4899 - regression_loss: 1.9946 - classification_loss: 0.4953
 42/500 [=>............................] - ETA: 1:55 - loss: 2.4779 - regression_loss: 1.9864 - classification_loss: 0.4915
[per-step progress-bar updates for steps 43-376 omitted; over this span the running loss declined from 2.4708 to 2.4132, regression_loss from 1.9819 to 1.9238, classification_loss held near 0.49, ETA counting down from 1:54 to 31s]
377/500 [=====================>........] 
- ETA: 30s - loss: 2.4135 - regression_loss: 1.9242 - classification_loss: 0.4892 378/500 [=====================>........] - ETA: 30s - loss: 2.4347 - regression_loss: 1.9248 - classification_loss: 0.5100 379/500 [=====================>........] - ETA: 30s - loss: 2.4364 - regression_loss: 1.9266 - classification_loss: 0.5097 380/500 [=====================>........] - ETA: 30s - loss: 2.4370 - regression_loss: 1.9270 - classification_loss: 0.5100 381/500 [=====================>........] - ETA: 29s - loss: 2.4407 - regression_loss: 1.9285 - classification_loss: 0.5122 382/500 [=====================>........] - ETA: 29s - loss: 2.4410 - regression_loss: 1.9272 - classification_loss: 0.5138 383/500 [=====================>........] - ETA: 29s - loss: 2.4410 - regression_loss: 1.9276 - classification_loss: 0.5134 384/500 [======================>.......] - ETA: 29s - loss: 2.4408 - regression_loss: 1.9279 - classification_loss: 0.5129 385/500 [======================>.......] - ETA: 28s - loss: 2.4414 - regression_loss: 1.9284 - classification_loss: 0.5130 386/500 [======================>.......] - ETA: 28s - loss: 2.4406 - regression_loss: 1.9279 - classification_loss: 0.5127 387/500 [======================>.......] - ETA: 28s - loss: 2.4402 - regression_loss: 1.9276 - classification_loss: 0.5126 388/500 [======================>.......] - ETA: 28s - loss: 2.4415 - regression_loss: 1.9285 - classification_loss: 0.5130 389/500 [======================>.......] - ETA: 27s - loss: 2.4403 - regression_loss: 1.9276 - classification_loss: 0.5126 390/500 [======================>.......] - ETA: 27s - loss: 2.4409 - regression_loss: 1.9283 - classification_loss: 0.5126 391/500 [======================>.......] - ETA: 27s - loss: 2.4403 - regression_loss: 1.9278 - classification_loss: 0.5125 392/500 [======================>.......] - ETA: 27s - loss: 2.4403 - regression_loss: 1.9279 - classification_loss: 0.5123 393/500 [======================>.......] 
- ETA: 26s - loss: 2.4406 - regression_loss: 1.9284 - classification_loss: 0.5122 394/500 [======================>.......] - ETA: 26s - loss: 2.4394 - regression_loss: 1.9277 - classification_loss: 0.5118 395/500 [======================>.......] - ETA: 26s - loss: 2.4385 - regression_loss: 1.9273 - classification_loss: 0.5112 396/500 [======================>.......] - ETA: 26s - loss: 2.4377 - regression_loss: 1.9267 - classification_loss: 0.5110 397/500 [======================>.......] - ETA: 25s - loss: 2.4404 - regression_loss: 1.9286 - classification_loss: 0.5118 398/500 [======================>.......] - ETA: 25s - loss: 2.4384 - regression_loss: 1.9273 - classification_loss: 0.5111 399/500 [======================>.......] - ETA: 25s - loss: 2.4372 - regression_loss: 1.9266 - classification_loss: 0.5106 400/500 [=======================>......] - ETA: 25s - loss: 2.4369 - regression_loss: 1.9266 - classification_loss: 0.5103 401/500 [=======================>......] - ETA: 24s - loss: 2.4359 - regression_loss: 1.9254 - classification_loss: 0.5106 402/500 [=======================>......] - ETA: 24s - loss: 2.4340 - regression_loss: 1.9240 - classification_loss: 0.5100 403/500 [=======================>......] - ETA: 24s - loss: 2.4335 - regression_loss: 1.9238 - classification_loss: 0.5097 404/500 [=======================>......] - ETA: 24s - loss: 2.4321 - regression_loss: 1.9230 - classification_loss: 0.5092 405/500 [=======================>......] - ETA: 23s - loss: 2.4329 - regression_loss: 1.9237 - classification_loss: 0.5092 406/500 [=======================>......] - ETA: 23s - loss: 2.4330 - regression_loss: 1.9238 - classification_loss: 0.5091 407/500 [=======================>......] - ETA: 23s - loss: 2.4327 - regression_loss: 1.9236 - classification_loss: 0.5090 408/500 [=======================>......] - ETA: 23s - loss: 2.4321 - regression_loss: 1.9234 - classification_loss: 0.5088 409/500 [=======================>......] 
- ETA: 22s - loss: 2.4311 - regression_loss: 1.9226 - classification_loss: 0.5085 410/500 [=======================>......] - ETA: 22s - loss: 2.4279 - regression_loss: 1.9201 - classification_loss: 0.5078 411/500 [=======================>......] - ETA: 22s - loss: 2.4272 - regression_loss: 1.9194 - classification_loss: 0.5078 412/500 [=======================>......] - ETA: 22s - loss: 2.4276 - regression_loss: 1.9194 - classification_loss: 0.5082 413/500 [=======================>......] - ETA: 21s - loss: 2.4273 - regression_loss: 1.9193 - classification_loss: 0.5080 414/500 [=======================>......] - ETA: 21s - loss: 2.4277 - regression_loss: 1.9197 - classification_loss: 0.5080 415/500 [=======================>......] - ETA: 21s - loss: 2.4278 - regression_loss: 1.9198 - classification_loss: 0.5079 416/500 [=======================>......] - ETA: 21s - loss: 2.4282 - regression_loss: 1.9203 - classification_loss: 0.5079 417/500 [========================>.....] - ETA: 20s - loss: 2.4276 - regression_loss: 1.9201 - classification_loss: 0.5075 418/500 [========================>.....] - ETA: 20s - loss: 2.4302 - regression_loss: 1.9224 - classification_loss: 0.5079 419/500 [========================>.....] - ETA: 20s - loss: 2.4304 - regression_loss: 1.9228 - classification_loss: 0.5076 420/500 [========================>.....] - ETA: 20s - loss: 2.4303 - regression_loss: 1.9227 - classification_loss: 0.5076 421/500 [========================>.....] - ETA: 19s - loss: 2.4314 - regression_loss: 1.9236 - classification_loss: 0.5077 422/500 [========================>.....] - ETA: 19s - loss: 2.4312 - regression_loss: 1.9233 - classification_loss: 0.5080 423/500 [========================>.....] - ETA: 19s - loss: 2.4313 - regression_loss: 1.9236 - classification_loss: 0.5076 424/500 [========================>.....] - ETA: 19s - loss: 2.4316 - regression_loss: 1.9240 - classification_loss: 0.5076 425/500 [========================>.....] 
- ETA: 18s - loss: 2.4333 - regression_loss: 1.9255 - classification_loss: 0.5078 426/500 [========================>.....] - ETA: 18s - loss: 2.4321 - regression_loss: 1.9247 - classification_loss: 0.5074 427/500 [========================>.....] - ETA: 18s - loss: 2.4326 - regression_loss: 1.9255 - classification_loss: 0.5072 428/500 [========================>.....] - ETA: 18s - loss: 2.4327 - regression_loss: 1.9255 - classification_loss: 0.5072 429/500 [========================>.....] - ETA: 17s - loss: 2.4329 - regression_loss: 1.9258 - classification_loss: 0.5070 430/500 [========================>.....] - ETA: 17s - loss: 2.4314 - regression_loss: 1.9247 - classification_loss: 0.5067 431/500 [========================>.....] - ETA: 17s - loss: 2.4319 - regression_loss: 1.9247 - classification_loss: 0.5072 432/500 [========================>.....] - ETA: 17s - loss: 2.4386 - regression_loss: 1.9258 - classification_loss: 0.5128 433/500 [========================>.....] - ETA: 16s - loss: 2.4383 - regression_loss: 1.9257 - classification_loss: 0.5126 434/500 [=========================>....] - ETA: 16s - loss: 2.4393 - regression_loss: 1.9266 - classification_loss: 0.5127 435/500 [=========================>....] - ETA: 16s - loss: 2.4394 - regression_loss: 1.9265 - classification_loss: 0.5129 436/500 [=========================>....] - ETA: 16s - loss: 2.4399 - regression_loss: 1.9268 - classification_loss: 0.5131 437/500 [=========================>....] - ETA: 15s - loss: 2.4389 - regression_loss: 1.9258 - classification_loss: 0.5131 438/500 [=========================>....] - ETA: 15s - loss: 2.4362 - regression_loss: 1.9236 - classification_loss: 0.5125 439/500 [=========================>....] - ETA: 15s - loss: 2.4358 - regression_loss: 1.9236 - classification_loss: 0.5122 440/500 [=========================>....] - ETA: 15s - loss: 2.4332 - regression_loss: 1.9215 - classification_loss: 0.5118 441/500 [=========================>....] 
- ETA: 14s - loss: 2.4328 - regression_loss: 1.9214 - classification_loss: 0.5114 442/500 [=========================>....] - ETA: 14s - loss: 2.4324 - regression_loss: 1.9212 - classification_loss: 0.5111 443/500 [=========================>....] - ETA: 14s - loss: 2.4360 - regression_loss: 1.9240 - classification_loss: 0.5120 444/500 [=========================>....] - ETA: 14s - loss: 2.4339 - regression_loss: 1.9222 - classification_loss: 0.5117 445/500 [=========================>....] - ETA: 13s - loss: 2.4331 - regression_loss: 1.9217 - classification_loss: 0.5114 446/500 [=========================>....] - ETA: 13s - loss: 2.4323 - regression_loss: 1.9212 - classification_loss: 0.5111 447/500 [=========================>....] - ETA: 13s - loss: 2.4333 - regression_loss: 1.9222 - classification_loss: 0.5110 448/500 [=========================>....] - ETA: 13s - loss: 2.4315 - regression_loss: 1.9209 - classification_loss: 0.5106 449/500 [=========================>....] - ETA: 12s - loss: 2.4281 - regression_loss: 1.9179 - classification_loss: 0.5102 450/500 [==========================>...] - ETA: 12s - loss: 2.4270 - regression_loss: 1.9173 - classification_loss: 0.5097 451/500 [==========================>...] - ETA: 12s - loss: 2.4247 - regression_loss: 1.9154 - classification_loss: 0.5093 452/500 [==========================>...] - ETA: 12s - loss: 2.4239 - regression_loss: 1.9150 - classification_loss: 0.5089 453/500 [==========================>...] - ETA: 11s - loss: 2.4236 - regression_loss: 1.9150 - classification_loss: 0.5087 454/500 [==========================>...] - ETA: 11s - loss: 2.4246 - regression_loss: 1.9161 - classification_loss: 0.5085 455/500 [==========================>...] - ETA: 11s - loss: 2.4250 - regression_loss: 1.9165 - classification_loss: 0.5085 456/500 [==========================>...] - ETA: 11s - loss: 2.4243 - regression_loss: 1.9162 - classification_loss: 0.5081 457/500 [==========================>...] 
- ETA: 10s - loss: 2.4228 - regression_loss: 1.9150 - classification_loss: 0.5078 458/500 [==========================>...] - ETA: 10s - loss: 2.4222 - regression_loss: 1.9147 - classification_loss: 0.5074 459/500 [==========================>...] - ETA: 10s - loss: 2.4222 - regression_loss: 1.9149 - classification_loss: 0.5073 460/500 [==========================>...] - ETA: 10s - loss: 2.4214 - regression_loss: 1.9142 - classification_loss: 0.5072 461/500 [==========================>...] - ETA: 9s - loss: 2.4211 - regression_loss: 1.9140 - classification_loss: 0.5071  462/500 [==========================>...] - ETA: 9s - loss: 2.4211 - regression_loss: 1.9142 - classification_loss: 0.5069 463/500 [==========================>...] - ETA: 9s - loss: 2.4225 - regression_loss: 1.9141 - classification_loss: 0.5084 464/500 [==========================>...] - ETA: 9s - loss: 2.4213 - regression_loss: 1.9133 - classification_loss: 0.5080 465/500 [==========================>...] - ETA: 8s - loss: 2.4211 - regression_loss: 1.9135 - classification_loss: 0.5076 466/500 [==========================>...] - ETA: 8s - loss: 2.4206 - regression_loss: 1.9133 - classification_loss: 0.5073 467/500 [===========================>..] - ETA: 8s - loss: 2.4220 - regression_loss: 1.9146 - classification_loss: 0.5075 468/500 [===========================>..] - ETA: 8s - loss: 2.4214 - regression_loss: 1.9142 - classification_loss: 0.5072 469/500 [===========================>..] - ETA: 7s - loss: 2.4210 - regression_loss: 1.9140 - classification_loss: 0.5070 470/500 [===========================>..] - ETA: 7s - loss: 2.4197 - regression_loss: 1.9130 - classification_loss: 0.5067 471/500 [===========================>..] - ETA: 7s - loss: 2.4195 - regression_loss: 1.9128 - classification_loss: 0.5066 472/500 [===========================>..] - ETA: 7s - loss: 2.4198 - regression_loss: 1.9134 - classification_loss: 0.5064 473/500 [===========================>..] 
- ETA: 6s - loss: 2.4190 - regression_loss: 1.9130 - classification_loss: 0.5060 474/500 [===========================>..] - ETA: 6s - loss: 2.4185 - regression_loss: 1.9127 - classification_loss: 0.5057 475/500 [===========================>..] - ETA: 6s - loss: 2.4179 - regression_loss: 1.9125 - classification_loss: 0.5054 476/500 [===========================>..] - ETA: 6s - loss: 2.4166 - regression_loss: 1.9116 - classification_loss: 0.5050 477/500 [===========================>..] - ETA: 5s - loss: 2.4136 - regression_loss: 1.9092 - classification_loss: 0.5044 478/500 [===========================>..] - ETA: 5s - loss: 2.4149 - regression_loss: 1.9108 - classification_loss: 0.5041 479/500 [===========================>..] - ETA: 5s - loss: 2.4151 - regression_loss: 1.9111 - classification_loss: 0.5039 480/500 [===========================>..] - ETA: 5s - loss: 2.4149 - regression_loss: 1.9112 - classification_loss: 0.5037 481/500 [===========================>..] - ETA: 4s - loss: 2.4133 - regression_loss: 1.9100 - classification_loss: 0.5032 482/500 [===========================>..] - ETA: 4s - loss: 2.4132 - regression_loss: 1.9102 - classification_loss: 0.5030 483/500 [===========================>..] - ETA: 4s - loss: 2.4131 - regression_loss: 1.9104 - classification_loss: 0.5027 484/500 [============================>.] - ETA: 4s - loss: 2.4139 - regression_loss: 1.9110 - classification_loss: 0.5029 485/500 [============================>.] - ETA: 3s - loss: 2.4137 - regression_loss: 1.9111 - classification_loss: 0.5026 486/500 [============================>.] - ETA: 3s - loss: 2.4131 - regression_loss: 1.9108 - classification_loss: 0.5023 487/500 [============================>.] - ETA: 3s - loss: 2.4135 - regression_loss: 1.9110 - classification_loss: 0.5025 488/500 [============================>.] - ETA: 3s - loss: 2.4138 - regression_loss: 1.9113 - classification_loss: 0.5025 489/500 [============================>.] 
- ETA: 2s - loss: 2.4145 - regression_loss: 1.9120 - classification_loss: 0.5025 490/500 [============================>.] - ETA: 2s - loss: 2.4151 - regression_loss: 1.9126 - classification_loss: 0.5024 491/500 [============================>.] - ETA: 2s - loss: 2.4149 - regression_loss: 1.9117 - classification_loss: 0.5033 492/500 [============================>.] - ETA: 2s - loss: 2.4147 - regression_loss: 1.9116 - classification_loss: 0.5031 493/500 [============================>.] - ETA: 1s - loss: 2.4151 - regression_loss: 1.9119 - classification_loss: 0.5033 494/500 [============================>.] - ETA: 1s - loss: 2.4144 - regression_loss: 1.9114 - classification_loss: 0.5030 495/500 [============================>.] - ETA: 1s - loss: 2.4138 - regression_loss: 1.9112 - classification_loss: 0.5027 496/500 [============================>.] - ETA: 1s - loss: 2.4135 - regression_loss: 1.9110 - classification_loss: 0.5025 497/500 [============================>.] - ETA: 0s - loss: 2.4136 - regression_loss: 1.9114 - classification_loss: 0.5023 498/500 [============================>.] - ETA: 0s - loss: 2.4146 - regression_loss: 1.9118 - classification_loss: 0.5028 499/500 [============================>.] - ETA: 0s - loss: 2.4148 - regression_loss: 1.9121 - classification_loss: 0.5027 500/500 [==============================] - 125s 250ms/step - loss: 2.4147 - regression_loss: 1.9121 - classification_loss: 0.5026 1172 instances of class plum with average precision: 0.4004 mAP: 0.4004 Epoch 00017: saving model to ./training/snapshots/resnet50_pascal_17.h5 Epoch 18/150 1/500 [..............................] - ETA: 1:52 - loss: 2.2415 - regression_loss: 1.8383 - classification_loss: 0.4033 2/500 [..............................] - ETA: 1:54 - loss: 2.5419 - regression_loss: 2.0684 - classification_loss: 0.4735 3/500 [..............................] - ETA: 1:58 - loss: 2.3716 - regression_loss: 1.9557 - classification_loss: 0.4159 4/500 [..............................] 
147/500 [=======>......................] - ETA: 1:28 - loss: 2.3555 - regression_loss: 1.9046 - classification_loss: 0.4509
- ETA: 1:27 - loss: 2.3545 - regression_loss: 1.9042 - classification_loss: 0.4503 149/500 [=======>......................] - ETA: 1:27 - loss: 2.3580 - regression_loss: 1.9069 - classification_loss: 0.4511 150/500 [========>.....................] - ETA: 1:27 - loss: 2.3556 - regression_loss: 1.9031 - classification_loss: 0.4525 151/500 [========>.....................] - ETA: 1:27 - loss: 2.3567 - regression_loss: 1.9047 - classification_loss: 0.4520 152/500 [========>.....................] - ETA: 1:27 - loss: 2.3555 - regression_loss: 1.9040 - classification_loss: 0.4516 153/500 [========>.....................] - ETA: 1:26 - loss: 2.3538 - regression_loss: 1.9034 - classification_loss: 0.4504 154/500 [========>.....................] - ETA: 1:26 - loss: 2.3551 - regression_loss: 1.9049 - classification_loss: 0.4502 155/500 [========>.....................] - ETA: 1:26 - loss: 2.3538 - regression_loss: 1.9039 - classification_loss: 0.4499 156/500 [========>.....................] - ETA: 1:26 - loss: 2.3546 - regression_loss: 1.9043 - classification_loss: 0.4503 157/500 [========>.....................] - ETA: 1:25 - loss: 2.3578 - regression_loss: 1.9072 - classification_loss: 0.4507 158/500 [========>.....................] - ETA: 1:25 - loss: 2.3562 - regression_loss: 1.9058 - classification_loss: 0.4504 159/500 [========>.....................] - ETA: 1:25 - loss: 2.3544 - regression_loss: 1.9045 - classification_loss: 0.4499 160/500 [========>.....................] - ETA: 1:25 - loss: 2.3536 - regression_loss: 1.9042 - classification_loss: 0.4494 161/500 [========>.....................] - ETA: 1:24 - loss: 2.3531 - regression_loss: 1.9041 - classification_loss: 0.4490 162/500 [========>.....................] - ETA: 1:24 - loss: 2.3468 - regression_loss: 1.8988 - classification_loss: 0.4480 163/500 [========>.....................] - ETA: 1:24 - loss: 2.3435 - regression_loss: 1.8961 - classification_loss: 0.4473 164/500 [========>.....................] 
- ETA: 1:24 - loss: 2.3380 - regression_loss: 1.8919 - classification_loss: 0.4461 165/500 [========>.....................] - ETA: 1:23 - loss: 2.3383 - regression_loss: 1.8921 - classification_loss: 0.4462 166/500 [========>.....................] - ETA: 1:23 - loss: 2.3451 - regression_loss: 1.8981 - classification_loss: 0.4471 167/500 [=========>....................] - ETA: 1:23 - loss: 2.3439 - regression_loss: 1.8975 - classification_loss: 0.4464 168/500 [=========>....................] - ETA: 1:23 - loss: 2.3491 - regression_loss: 1.9017 - classification_loss: 0.4473 169/500 [=========>....................] - ETA: 1:22 - loss: 2.3522 - regression_loss: 1.9052 - classification_loss: 0.4469 170/500 [=========>....................] - ETA: 1:22 - loss: 2.3525 - regression_loss: 1.9054 - classification_loss: 0.4471 171/500 [=========>....................] - ETA: 1:22 - loss: 2.3526 - regression_loss: 1.9059 - classification_loss: 0.4467 172/500 [=========>....................] - ETA: 1:22 - loss: 2.3596 - regression_loss: 1.9061 - classification_loss: 0.4535 173/500 [=========>....................] - ETA: 1:21 - loss: 2.3629 - regression_loss: 1.9094 - classification_loss: 0.4535 174/500 [=========>....................] - ETA: 1:21 - loss: 2.3636 - regression_loss: 1.9104 - classification_loss: 0.4532 175/500 [=========>....................] - ETA: 1:21 - loss: 2.3603 - regression_loss: 1.9079 - classification_loss: 0.4524 176/500 [=========>....................] - ETA: 1:21 - loss: 2.3629 - regression_loss: 1.9099 - classification_loss: 0.4530 177/500 [=========>....................] - ETA: 1:20 - loss: 2.3637 - regression_loss: 1.9110 - classification_loss: 0.4528 178/500 [=========>....................] - ETA: 1:20 - loss: 2.3624 - regression_loss: 1.9102 - classification_loss: 0.4522 179/500 [=========>....................] - ETA: 1:20 - loss: 2.3672 - regression_loss: 1.9138 - classification_loss: 0.4534 180/500 [=========>....................] 
- ETA: 1:20 - loss: 2.3682 - regression_loss: 1.9148 - classification_loss: 0.4534 181/500 [=========>....................] - ETA: 1:19 - loss: 2.3699 - regression_loss: 1.9164 - classification_loss: 0.4535 182/500 [=========>....................] - ETA: 1:19 - loss: 2.3664 - regression_loss: 1.9132 - classification_loss: 0.4532 183/500 [=========>....................] - ETA: 1:19 - loss: 2.3622 - regression_loss: 1.9099 - classification_loss: 0.4523 184/500 [==========>...................] - ETA: 1:19 - loss: 2.3689 - regression_loss: 1.9156 - classification_loss: 0.4533 185/500 [==========>...................] - ETA: 1:18 - loss: 2.3674 - regression_loss: 1.9130 - classification_loss: 0.4544 186/500 [==========>...................] - ETA: 1:18 - loss: 2.3672 - regression_loss: 1.9123 - classification_loss: 0.4549 187/500 [==========>...................] - ETA: 1:18 - loss: 2.3673 - regression_loss: 1.9128 - classification_loss: 0.4545 188/500 [==========>...................] - ETA: 1:18 - loss: 2.3661 - regression_loss: 1.9116 - classification_loss: 0.4545 189/500 [==========>...................] - ETA: 1:17 - loss: 2.3657 - regression_loss: 1.9114 - classification_loss: 0.4544 190/500 [==========>...................] - ETA: 1:17 - loss: 2.3690 - regression_loss: 1.9142 - classification_loss: 0.4548 191/500 [==========>...................] - ETA: 1:17 - loss: 2.3706 - regression_loss: 1.9152 - classification_loss: 0.4554 192/500 [==========>...................] - ETA: 1:17 - loss: 2.3667 - regression_loss: 1.9122 - classification_loss: 0.4545 193/500 [==========>...................] - ETA: 1:16 - loss: 2.3695 - regression_loss: 1.9144 - classification_loss: 0.4551 194/500 [==========>...................] - ETA: 1:16 - loss: 2.3704 - regression_loss: 1.9149 - classification_loss: 0.4554 195/500 [==========>...................] - ETA: 1:16 - loss: 2.3708 - regression_loss: 1.9154 - classification_loss: 0.4554 196/500 [==========>...................] 
- ETA: 1:16 - loss: 2.3710 - regression_loss: 1.9151 - classification_loss: 0.4559 197/500 [==========>...................] - ETA: 1:15 - loss: 2.3686 - regression_loss: 1.9135 - classification_loss: 0.4552 198/500 [==========>...................] - ETA: 1:15 - loss: 2.3679 - regression_loss: 1.9131 - classification_loss: 0.4548 199/500 [==========>...................] - ETA: 1:15 - loss: 2.3671 - regression_loss: 1.9127 - classification_loss: 0.4544 200/500 [===========>..................] - ETA: 1:15 - loss: 2.3670 - regression_loss: 1.9131 - classification_loss: 0.4539 201/500 [===========>..................] - ETA: 1:14 - loss: 2.3663 - regression_loss: 1.9123 - classification_loss: 0.4540 202/500 [===========>..................] - ETA: 1:14 - loss: 2.3703 - regression_loss: 1.9154 - classification_loss: 0.4549 203/500 [===========>..................] - ETA: 1:14 - loss: 2.3707 - regression_loss: 1.9153 - classification_loss: 0.4554 204/500 [===========>..................] - ETA: 1:14 - loss: 2.3695 - regression_loss: 1.9145 - classification_loss: 0.4550 205/500 [===========>..................] - ETA: 1:13 - loss: 2.3723 - regression_loss: 1.9165 - classification_loss: 0.4557 206/500 [===========>..................] - ETA: 1:13 - loss: 2.3725 - regression_loss: 1.9165 - classification_loss: 0.4559 207/500 [===========>..................] - ETA: 1:13 - loss: 2.3751 - regression_loss: 1.9187 - classification_loss: 0.4564 208/500 [===========>..................] - ETA: 1:13 - loss: 2.3766 - regression_loss: 1.9194 - classification_loss: 0.4572 209/500 [===========>..................] - ETA: 1:12 - loss: 2.3757 - regression_loss: 1.9188 - classification_loss: 0.4570 210/500 [===========>..................] - ETA: 1:12 - loss: 2.3747 - regression_loss: 1.9182 - classification_loss: 0.4565 211/500 [===========>..................] - ETA: 1:12 - loss: 2.3814 - regression_loss: 1.9209 - classification_loss: 0.4605 212/500 [===========>..................] 
- ETA: 1:12 - loss: 2.3817 - regression_loss: 1.9209 - classification_loss: 0.4608 213/500 [===========>..................] - ETA: 1:11 - loss: 2.3821 - regression_loss: 1.9216 - classification_loss: 0.4605 214/500 [===========>..................] - ETA: 1:11 - loss: 2.3799 - regression_loss: 1.9183 - classification_loss: 0.4616 215/500 [===========>..................] - ETA: 1:11 - loss: 2.3821 - regression_loss: 1.9201 - classification_loss: 0.4621 216/500 [===========>..................] - ETA: 1:11 - loss: 2.3808 - regression_loss: 1.9190 - classification_loss: 0.4618 217/500 [============>.................] - ETA: 1:10 - loss: 2.3817 - regression_loss: 1.9194 - classification_loss: 0.4623 218/500 [============>.................] - ETA: 1:10 - loss: 2.3806 - regression_loss: 1.9189 - classification_loss: 0.4617 219/500 [============>.................] - ETA: 1:10 - loss: 2.3788 - regression_loss: 1.9176 - classification_loss: 0.4612 220/500 [============>.................] - ETA: 1:10 - loss: 2.3764 - regression_loss: 1.9158 - classification_loss: 0.4606 221/500 [============>.................] - ETA: 1:09 - loss: 2.3760 - regression_loss: 1.9153 - classification_loss: 0.4607 222/500 [============>.................] - ETA: 1:09 - loss: 2.3750 - regression_loss: 1.9141 - classification_loss: 0.4610 223/500 [============>.................] - ETA: 1:09 - loss: 2.3779 - regression_loss: 1.9162 - classification_loss: 0.4617 224/500 [============>.................] - ETA: 1:09 - loss: 2.3779 - regression_loss: 1.9164 - classification_loss: 0.4615 225/500 [============>.................] - ETA: 1:08 - loss: 2.3756 - regression_loss: 1.9144 - classification_loss: 0.4612 226/500 [============>.................] - ETA: 1:08 - loss: 2.3759 - regression_loss: 1.9147 - classification_loss: 0.4612 227/500 [============>.................] - ETA: 1:08 - loss: 2.3769 - regression_loss: 1.9158 - classification_loss: 0.4610 228/500 [============>.................] 
- ETA: 1:08 - loss: 2.3771 - regression_loss: 1.9163 - classification_loss: 0.4608 229/500 [============>.................] - ETA: 1:07 - loss: 2.3780 - regression_loss: 1.9171 - classification_loss: 0.4609 230/500 [============>.................] - ETA: 1:07 - loss: 2.3775 - regression_loss: 1.9171 - classification_loss: 0.4604 231/500 [============>.................] - ETA: 1:07 - loss: 2.3796 - regression_loss: 1.9186 - classification_loss: 0.4611 232/500 [============>.................] - ETA: 1:07 - loss: 2.3786 - regression_loss: 1.9177 - classification_loss: 0.4609 233/500 [============>.................] - ETA: 1:06 - loss: 2.3769 - regression_loss: 1.9167 - classification_loss: 0.4602 234/500 [=============>................] - ETA: 1:06 - loss: 2.3766 - regression_loss: 1.9165 - classification_loss: 0.4601 235/500 [=============>................] - ETA: 1:06 - loss: 2.3770 - regression_loss: 1.9165 - classification_loss: 0.4605 236/500 [=============>................] - ETA: 1:06 - loss: 2.3753 - regression_loss: 1.9148 - classification_loss: 0.4605 237/500 [=============>................] - ETA: 1:05 - loss: 2.3732 - regression_loss: 1.9131 - classification_loss: 0.4601 238/500 [=============>................] - ETA: 1:05 - loss: 2.3722 - regression_loss: 1.9124 - classification_loss: 0.4598 239/500 [=============>................] - ETA: 1:05 - loss: 2.3723 - regression_loss: 1.9128 - classification_loss: 0.4595 240/500 [=============>................] - ETA: 1:05 - loss: 2.3713 - regression_loss: 1.9115 - classification_loss: 0.4598 241/500 [=============>................] - ETA: 1:04 - loss: 2.3713 - regression_loss: 1.9117 - classification_loss: 0.4596 242/500 [=============>................] - ETA: 1:04 - loss: 2.3690 - regression_loss: 1.9101 - classification_loss: 0.4589 243/500 [=============>................] - ETA: 1:04 - loss: 2.3678 - regression_loss: 1.9094 - classification_loss: 0.4583 244/500 [=============>................] 
- ETA: 1:04 - loss: 2.3680 - regression_loss: 1.9097 - classification_loss: 0.4583 245/500 [=============>................] - ETA: 1:03 - loss: 2.3678 - regression_loss: 1.9096 - classification_loss: 0.4582 246/500 [=============>................] - ETA: 1:03 - loss: 2.3686 - regression_loss: 1.9102 - classification_loss: 0.4585 247/500 [=============>................] - ETA: 1:03 - loss: 2.3641 - regression_loss: 1.9068 - classification_loss: 0.4573 248/500 [=============>................] - ETA: 1:03 - loss: 2.3652 - regression_loss: 1.9076 - classification_loss: 0.4576 249/500 [=============>................] - ETA: 1:02 - loss: 2.3664 - regression_loss: 1.9086 - classification_loss: 0.4578 250/500 [==============>...............] - ETA: 1:02 - loss: 2.3652 - regression_loss: 1.9078 - classification_loss: 0.4574 251/500 [==============>...............] - ETA: 1:02 - loss: 2.3655 - regression_loss: 1.9086 - classification_loss: 0.4569 252/500 [==============>...............] - ETA: 1:02 - loss: 2.3646 - regression_loss: 1.9074 - classification_loss: 0.4571 253/500 [==============>...............] - ETA: 1:01 - loss: 2.3654 - regression_loss: 1.9077 - classification_loss: 0.4577 254/500 [==============>...............] - ETA: 1:01 - loss: 2.3668 - regression_loss: 1.9088 - classification_loss: 0.4580 255/500 [==============>...............] - ETA: 1:01 - loss: 2.3659 - regression_loss: 1.9082 - classification_loss: 0.4577 256/500 [==============>...............] - ETA: 1:01 - loss: 2.3632 - regression_loss: 1.9053 - classification_loss: 0.4579 257/500 [==============>...............] - ETA: 1:00 - loss: 2.3646 - regression_loss: 1.9064 - classification_loss: 0.4581 258/500 [==============>...............] - ETA: 1:00 - loss: 2.3657 - regression_loss: 1.9073 - classification_loss: 0.4584 259/500 [==============>...............] - ETA: 1:00 - loss: 2.3660 - regression_loss: 1.9074 - classification_loss: 0.4586 260/500 [==============>...............] 
- ETA: 1:00 - loss: 2.3667 - regression_loss: 1.9083 - classification_loss: 0.4584 261/500 [==============>...............] - ETA: 59s - loss: 2.3658 - regression_loss: 1.9065 - classification_loss: 0.4594  262/500 [==============>...............] - ETA: 59s - loss: 2.3653 - regression_loss: 1.9061 - classification_loss: 0.4592 263/500 [==============>...............] - ETA: 59s - loss: 2.3651 - regression_loss: 1.9061 - classification_loss: 0.4591 264/500 [==============>...............] - ETA: 59s - loss: 2.3639 - regression_loss: 1.9053 - classification_loss: 0.4586 265/500 [==============>...............] - ETA: 58s - loss: 2.3644 - regression_loss: 1.9057 - classification_loss: 0.4588 266/500 [==============>...............] - ETA: 58s - loss: 2.3657 - regression_loss: 1.9069 - classification_loss: 0.4588 267/500 [===============>..............] - ETA: 58s - loss: 2.3652 - regression_loss: 1.9067 - classification_loss: 0.4585 268/500 [===============>..............] - ETA: 58s - loss: 2.3659 - regression_loss: 1.9074 - classification_loss: 0.4584 269/500 [===============>..............] - ETA: 57s - loss: 2.3653 - regression_loss: 1.9071 - classification_loss: 0.4582 270/500 [===============>..............] - ETA: 57s - loss: 2.3658 - regression_loss: 1.9078 - classification_loss: 0.4581 271/500 [===============>..............] - ETA: 57s - loss: 2.3651 - regression_loss: 1.9064 - classification_loss: 0.4587 272/500 [===============>..............] - ETA: 57s - loss: 2.3664 - regression_loss: 1.9076 - classification_loss: 0.4588 273/500 [===============>..............] - ETA: 56s - loss: 2.3664 - regression_loss: 1.9078 - classification_loss: 0.4586 274/500 [===============>..............] - ETA: 56s - loss: 2.3667 - regression_loss: 1.9082 - classification_loss: 0.4585 275/500 [===============>..............] - ETA: 56s - loss: 2.3676 - regression_loss: 1.9089 - classification_loss: 0.4587 276/500 [===============>..............] 
- ETA: 56s - loss: 2.3666 - regression_loss: 1.9082 - classification_loss: 0.4584 277/500 [===============>..............] - ETA: 55s - loss: 2.3622 - regression_loss: 1.9046 - classification_loss: 0.4576 278/500 [===============>..............] - ETA: 55s - loss: 2.3610 - regression_loss: 1.9038 - classification_loss: 0.4572 279/500 [===============>..............] - ETA: 55s - loss: 2.3616 - regression_loss: 1.9045 - classification_loss: 0.4571 280/500 [===============>..............] - ETA: 55s - loss: 2.3616 - regression_loss: 1.9040 - classification_loss: 0.4576 281/500 [===============>..............] - ETA: 54s - loss: 2.3567 - regression_loss: 1.9000 - classification_loss: 0.4567 282/500 [===============>..............] - ETA: 54s - loss: 2.3589 - regression_loss: 1.9020 - classification_loss: 0.4568 283/500 [===============>..............] - ETA: 54s - loss: 2.3585 - regression_loss: 1.9018 - classification_loss: 0.4567 284/500 [================>.............] - ETA: 54s - loss: 2.3566 - regression_loss: 1.9004 - classification_loss: 0.4562 285/500 [================>.............] - ETA: 53s - loss: 2.3563 - regression_loss: 1.9004 - classification_loss: 0.4559 286/500 [================>.............] - ETA: 53s - loss: 2.3556 - regression_loss: 1.9000 - classification_loss: 0.4556 287/500 [================>.............] - ETA: 53s - loss: 2.3555 - regression_loss: 1.8999 - classification_loss: 0.4556 288/500 [================>.............] - ETA: 53s - loss: 2.3558 - regression_loss: 1.9006 - classification_loss: 0.4552 289/500 [================>.............] - ETA: 52s - loss: 2.3561 - regression_loss: 1.9010 - classification_loss: 0.4551 290/500 [================>.............] - ETA: 52s - loss: 2.3553 - regression_loss: 1.9003 - classification_loss: 0.4550 291/500 [================>.............] - ETA: 52s - loss: 2.3575 - regression_loss: 1.8997 - classification_loss: 0.4578 292/500 [================>.............] 
- ETA: 52s - loss: 2.3550 - regression_loss: 1.8976 - classification_loss: 0.4574 293/500 [================>.............] - ETA: 51s - loss: 2.3550 - regression_loss: 1.8978 - classification_loss: 0.4572 294/500 [================>.............] - ETA: 51s - loss: 2.3549 - regression_loss: 1.8977 - classification_loss: 0.4572 295/500 [================>.............] - ETA: 51s - loss: 2.3516 - regression_loss: 1.8952 - classification_loss: 0.4565 296/500 [================>.............] - ETA: 51s - loss: 2.3507 - regression_loss: 1.8949 - classification_loss: 0.4558 297/500 [================>.............] - ETA: 50s - loss: 2.3506 - regression_loss: 1.8948 - classification_loss: 0.4558 298/500 [================>.............] - ETA: 50s - loss: 2.3501 - regression_loss: 1.8942 - classification_loss: 0.4560 299/500 [================>.............] - ETA: 50s - loss: 2.3532 - regression_loss: 1.8963 - classification_loss: 0.4569 300/500 [=================>............] - ETA: 50s - loss: 2.3531 - regression_loss: 1.8963 - classification_loss: 0.4568 301/500 [=================>............] - ETA: 49s - loss: 2.3535 - regression_loss: 1.8968 - classification_loss: 0.4567 302/500 [=================>............] - ETA: 49s - loss: 2.3521 - regression_loss: 1.8959 - classification_loss: 0.4563 303/500 [=================>............] - ETA: 49s - loss: 2.3533 - regression_loss: 1.8968 - classification_loss: 0.4566 304/500 [=================>............] - ETA: 49s - loss: 2.3516 - regression_loss: 1.8952 - classification_loss: 0.4564 305/500 [=================>............] - ETA: 48s - loss: 2.3511 - regression_loss: 1.8948 - classification_loss: 0.4563 306/500 [=================>............] - ETA: 48s - loss: 2.3521 - regression_loss: 1.8956 - classification_loss: 0.4564 307/500 [=================>............] - ETA: 48s - loss: 2.3531 - regression_loss: 1.8965 - classification_loss: 0.4566 308/500 [=================>............] 
- ETA: 48s - loss: 2.3516 - regression_loss: 1.8953 - classification_loss: 0.4563 309/500 [=================>............] - ETA: 47s - loss: 2.3532 - regression_loss: 1.8966 - classification_loss: 0.4566 310/500 [=================>............] - ETA: 47s - loss: 2.3551 - regression_loss: 1.8984 - classification_loss: 0.4567 311/500 [=================>............] - ETA: 47s - loss: 2.3546 - regression_loss: 1.8981 - classification_loss: 0.4565 312/500 [=================>............] - ETA: 47s - loss: 2.3554 - regression_loss: 1.8991 - classification_loss: 0.4563 313/500 [=================>............] - ETA: 46s - loss: 2.4284 - regression_loss: 1.8931 - classification_loss: 0.5353 314/500 [=================>............] - ETA: 46s - loss: 2.4245 - regression_loss: 1.8895 - classification_loss: 0.5350 315/500 [=================>............] - ETA: 46s - loss: 2.4246 - regression_loss: 1.8899 - classification_loss: 0.5347 316/500 [=================>............] - ETA: 46s - loss: 2.4245 - regression_loss: 1.8903 - classification_loss: 0.5342 317/500 [==================>...........] - ETA: 45s - loss: 2.4239 - regression_loss: 1.8893 - classification_loss: 0.5347 318/500 [==================>...........] - ETA: 45s - loss: 2.4238 - regression_loss: 1.8896 - classification_loss: 0.5342 319/500 [==================>...........] - ETA: 45s - loss: 2.4239 - regression_loss: 1.8900 - classification_loss: 0.5339 320/500 [==================>...........] - ETA: 45s - loss: 2.4244 - regression_loss: 1.8908 - classification_loss: 0.5336 321/500 [==================>...........] - ETA: 44s - loss: 2.4258 - regression_loss: 1.8918 - classification_loss: 0.5340 322/500 [==================>...........] - ETA: 44s - loss: 2.4248 - regression_loss: 1.8913 - classification_loss: 0.5335 323/500 [==================>...........] - ETA: 44s - loss: 2.4234 - regression_loss: 1.8905 - classification_loss: 0.5330 324/500 [==================>...........] 
- ETA: 44s - loss: 2.4231 - regression_loss: 1.8904 - classification_loss: 0.5327 325/500 [==================>...........] - ETA: 43s - loss: 2.4219 - regression_loss: 1.8895 - classification_loss: 0.5324 326/500 [==================>...........] - ETA: 43s - loss: 2.4227 - regression_loss: 1.8902 - classification_loss: 0.5325 327/500 [==================>...........] - ETA: 43s - loss: 2.4222 - regression_loss: 1.8897 - classification_loss: 0.5325 328/500 [==================>...........] - ETA: 43s - loss: 2.4213 - regression_loss: 1.8889 - classification_loss: 0.5324 329/500 [==================>...........] - ETA: 42s - loss: 2.4209 - regression_loss: 1.8890 - classification_loss: 0.5319 330/500 [==================>...........] - ETA: 42s - loss: 2.4223 - regression_loss: 1.8902 - classification_loss: 0.5321 331/500 [==================>...........] - ETA: 42s - loss: 2.4190 - regression_loss: 1.8876 - classification_loss: 0.5314 332/500 [==================>...........] - ETA: 42s - loss: 2.4224 - regression_loss: 1.8898 - classification_loss: 0.5326 333/500 [==================>...........] - ETA: 41s - loss: 2.4221 - regression_loss: 1.8900 - classification_loss: 0.5321 334/500 [===================>..........] - ETA: 41s - loss: 2.4213 - regression_loss: 1.8897 - classification_loss: 0.5316 335/500 [===================>..........] - ETA: 41s - loss: 2.4201 - regression_loss: 1.8889 - classification_loss: 0.5312 336/500 [===================>..........] - ETA: 41s - loss: 2.4206 - regression_loss: 1.8896 - classification_loss: 0.5310 337/500 [===================>..........] - ETA: 40s - loss: 2.4206 - regression_loss: 1.8899 - classification_loss: 0.5306 338/500 [===================>..........] - ETA: 40s - loss: 2.4178 - regression_loss: 1.8879 - classification_loss: 0.5299 339/500 [===================>..........] - ETA: 40s - loss: 2.4190 - regression_loss: 1.8891 - classification_loss: 0.5299 340/500 [===================>..........] 
- ETA: 40s - loss: 2.4176 - regression_loss: 1.8883 - classification_loss: 0.5294 341/500 [===================>..........] - ETA: 39s - loss: 2.4175 - regression_loss: 1.8885 - classification_loss: 0.5291 342/500 [===================>..........] - ETA: 39s - loss: 2.4175 - regression_loss: 1.8885 - classification_loss: 0.5290 343/500 [===================>..........] - ETA: 39s - loss: 2.4168 - regression_loss: 1.8883 - classification_loss: 0.5285 344/500 [===================>..........] - ETA: 39s - loss: 2.4164 - regression_loss: 1.8883 - classification_loss: 0.5282 345/500 [===================>..........] - ETA: 38s - loss: 2.4180 - regression_loss: 1.8895 - classification_loss: 0.5284 346/500 [===================>..........] - ETA: 38s - loss: 2.4175 - regression_loss: 1.8895 - classification_loss: 0.5280 347/500 [===================>..........] - ETA: 38s - loss: 2.4149 - regression_loss: 1.8876 - classification_loss: 0.5273 348/500 [===================>..........] - ETA: 38s - loss: 2.4142 - regression_loss: 1.8874 - classification_loss: 0.5268 349/500 [===================>..........] - ETA: 37s - loss: 2.4141 - regression_loss: 1.8874 - classification_loss: 0.5267 350/500 [====================>.........] - ETA: 37s - loss: 2.4140 - regression_loss: 1.8876 - classification_loss: 0.5264 351/500 [====================>.........] - ETA: 37s - loss: 2.4126 - regression_loss: 1.8866 - classification_loss: 0.5260 352/500 [====================>.........] - ETA: 37s - loss: 2.4144 - regression_loss: 1.8880 - classification_loss: 0.5263 353/500 [====================>.........] - ETA: 36s - loss: 2.4126 - regression_loss: 1.8869 - classification_loss: 0.5257 354/500 [====================>.........] - ETA: 36s - loss: 2.4129 - regression_loss: 1.8873 - classification_loss: 0.5256 355/500 [====================>.........] - ETA: 36s - loss: 2.4139 - regression_loss: 1.8880 - classification_loss: 0.5259 356/500 [====================>.........] 
[... per-batch progress for epoch 18 (batches 357-499) elided; loss held steady around 2.40 (regression ~1.89, classification ~0.51) ...]
500/500 [==============================] - 125s 250ms/step - loss: 2.3983 - regression_loss: 1.8945 - classification_loss: 0.5038
1172 instances of class plum with average precision: 0.4083
mAP: 0.4083
Epoch 00018: saving model to ./training/snapshots/resnet50_pascal_18.h5
Epoch 19/150
[... per-batch progress for epoch 19 (batches 1-14) elided; loss around 2.39 ...]
[... per-batch progress for epoch 19 (batches 15-190) elided; loss fluctuated between roughly 2.27 and 2.43, sitting near 2.37 (regression ~1.91, classification ~0.46) by batch 190/500 ...]
- ETA: 1:17 - loss: 2.3701 - regression_loss: 1.9102 - classification_loss: 0.4599 191/500 [==========>...................] - ETA: 1:17 - loss: 2.3721 - regression_loss: 1.9122 - classification_loss: 0.4599 192/500 [==========>...................] - ETA: 1:17 - loss: 2.3697 - regression_loss: 1.9105 - classification_loss: 0.4591 193/500 [==========>...................] - ETA: 1:16 - loss: 2.3687 - regression_loss: 1.9101 - classification_loss: 0.4586 194/500 [==========>...................] - ETA: 1:16 - loss: 2.3708 - regression_loss: 1.9118 - classification_loss: 0.4589 195/500 [==========>...................] - ETA: 1:16 - loss: 2.3700 - regression_loss: 1.9114 - classification_loss: 0.4586 196/500 [==========>...................] - ETA: 1:16 - loss: 2.3677 - regression_loss: 1.9096 - classification_loss: 0.4580 197/500 [==========>...................] - ETA: 1:15 - loss: 2.3680 - regression_loss: 1.9099 - classification_loss: 0.4581 198/500 [==========>...................] - ETA: 1:15 - loss: 2.3685 - regression_loss: 1.9103 - classification_loss: 0.4581 199/500 [==========>...................] - ETA: 1:15 - loss: 2.3685 - regression_loss: 1.9105 - classification_loss: 0.4581 200/500 [===========>..................] - ETA: 1:15 - loss: 2.3671 - regression_loss: 1.9095 - classification_loss: 0.4576 201/500 [===========>..................] - ETA: 1:14 - loss: 2.3656 - regression_loss: 1.9085 - classification_loss: 0.4572 202/500 [===========>..................] - ETA: 1:14 - loss: 2.3656 - regression_loss: 1.9086 - classification_loss: 0.4570 203/500 [===========>..................] - ETA: 1:14 - loss: 2.3681 - regression_loss: 1.9107 - classification_loss: 0.4574 204/500 [===========>..................] - ETA: 1:14 - loss: 2.3691 - regression_loss: 1.9110 - classification_loss: 0.4581 205/500 [===========>..................] - ETA: 1:13 - loss: 2.3686 - regression_loss: 1.9112 - classification_loss: 0.4574 206/500 [===========>..................] 
- ETA: 1:13 - loss: 2.3690 - regression_loss: 1.9114 - classification_loss: 0.4575 207/500 [===========>..................] - ETA: 1:13 - loss: 2.3693 - regression_loss: 1.9116 - classification_loss: 0.4577 208/500 [===========>..................] - ETA: 1:13 - loss: 2.3683 - regression_loss: 1.9103 - classification_loss: 0.4580 209/500 [===========>..................] - ETA: 1:12 - loss: 2.3684 - regression_loss: 1.9104 - classification_loss: 0.4580 210/500 [===========>..................] - ETA: 1:12 - loss: 2.3674 - regression_loss: 1.9094 - classification_loss: 0.4580 211/500 [===========>..................] - ETA: 1:12 - loss: 2.3678 - regression_loss: 1.9099 - classification_loss: 0.4579 212/500 [===========>..................] - ETA: 1:12 - loss: 2.3680 - regression_loss: 1.9103 - classification_loss: 0.4577 213/500 [===========>..................] - ETA: 1:11 - loss: 2.3688 - regression_loss: 1.9105 - classification_loss: 0.4583 214/500 [===========>..................] - ETA: 1:11 - loss: 2.3658 - regression_loss: 1.9079 - classification_loss: 0.4579 215/500 [===========>..................] - ETA: 1:11 - loss: 2.3653 - regression_loss: 1.9076 - classification_loss: 0.4576 216/500 [===========>..................] - ETA: 1:11 - loss: 2.3653 - regression_loss: 1.9076 - classification_loss: 0.4576 217/500 [============>.................] - ETA: 1:10 - loss: 2.3657 - regression_loss: 1.9083 - classification_loss: 0.4574 218/500 [============>.................] - ETA: 1:10 - loss: 2.3652 - regression_loss: 1.9081 - classification_loss: 0.4571 219/500 [============>.................] - ETA: 1:10 - loss: 2.3626 - regression_loss: 1.9063 - classification_loss: 0.4563 220/500 [============>.................] - ETA: 1:10 - loss: 2.3665 - regression_loss: 1.9093 - classification_loss: 0.4572 221/500 [============>.................] - ETA: 1:09 - loss: 2.3631 - regression_loss: 1.9066 - classification_loss: 0.4565 222/500 [============>.................] 
- ETA: 1:09 - loss: 2.3621 - regression_loss: 1.9058 - classification_loss: 0.4563 223/500 [============>.................] - ETA: 1:09 - loss: 2.3598 - regression_loss: 1.9041 - classification_loss: 0.4556 224/500 [============>.................] - ETA: 1:09 - loss: 2.3609 - regression_loss: 1.9053 - classification_loss: 0.4556 225/500 [============>.................] - ETA: 1:08 - loss: 2.3591 - regression_loss: 1.9040 - classification_loss: 0.4552 226/500 [============>.................] - ETA: 1:08 - loss: 2.3584 - regression_loss: 1.9035 - classification_loss: 0.4548 227/500 [============>.................] - ETA: 1:08 - loss: 2.3576 - regression_loss: 1.9033 - classification_loss: 0.4543 228/500 [============>.................] - ETA: 1:08 - loss: 2.3558 - regression_loss: 1.9014 - classification_loss: 0.4545 229/500 [============>.................] - ETA: 1:07 - loss: 2.3564 - regression_loss: 1.9025 - classification_loss: 0.4539 230/500 [============>.................] - ETA: 1:07 - loss: 2.3575 - regression_loss: 1.9035 - classification_loss: 0.4541 231/500 [============>.................] - ETA: 1:07 - loss: 2.3561 - regression_loss: 1.9023 - classification_loss: 0.4538 232/500 [============>.................] - ETA: 1:07 - loss: 2.3569 - regression_loss: 1.9031 - classification_loss: 0.4538 233/500 [============>.................] - ETA: 1:06 - loss: 2.3559 - regression_loss: 1.9026 - classification_loss: 0.4534 234/500 [=============>................] - ETA: 1:06 - loss: 2.3551 - regression_loss: 1.9019 - classification_loss: 0.4533 235/500 [=============>................] - ETA: 1:06 - loss: 2.3547 - regression_loss: 1.9019 - classification_loss: 0.4528 236/500 [=============>................] - ETA: 1:06 - loss: 2.3526 - regression_loss: 1.9002 - classification_loss: 0.4524 237/500 [=============>................] - ETA: 1:05 - loss: 2.3521 - regression_loss: 1.8997 - classification_loss: 0.4524 238/500 [=============>................] 
- ETA: 1:05 - loss: 2.3514 - regression_loss: 1.8994 - classification_loss: 0.4520 239/500 [=============>................] - ETA: 1:05 - loss: 2.3521 - regression_loss: 1.9001 - classification_loss: 0.4520 240/500 [=============>................] - ETA: 1:05 - loss: 2.3524 - regression_loss: 1.9004 - classification_loss: 0.4520 241/500 [=============>................] - ETA: 1:04 - loss: 2.3525 - regression_loss: 1.8996 - classification_loss: 0.4529 242/500 [=============>................] - ETA: 1:04 - loss: 2.3516 - regression_loss: 1.8990 - classification_loss: 0.4526 243/500 [=============>................] - ETA: 1:04 - loss: 2.3536 - regression_loss: 1.9008 - classification_loss: 0.4527 244/500 [=============>................] - ETA: 1:04 - loss: 2.3536 - regression_loss: 1.9008 - classification_loss: 0.4528 245/500 [=============>................] - ETA: 1:03 - loss: 2.3521 - regression_loss: 1.8996 - classification_loss: 0.4526 246/500 [=============>................] - ETA: 1:03 - loss: 2.3514 - regression_loss: 1.8993 - classification_loss: 0.4521 247/500 [=============>................] - ETA: 1:03 - loss: 2.3534 - regression_loss: 1.9010 - classification_loss: 0.4524 248/500 [=============>................] - ETA: 1:03 - loss: 2.3526 - regression_loss: 1.9003 - classification_loss: 0.4522 249/500 [=============>................] - ETA: 1:02 - loss: 2.3505 - regression_loss: 1.8987 - classification_loss: 0.4519 250/500 [==============>...............] - ETA: 1:02 - loss: 2.3505 - regression_loss: 1.8988 - classification_loss: 0.4516 251/500 [==============>...............] - ETA: 1:02 - loss: 2.3545 - regression_loss: 1.9025 - classification_loss: 0.4519 252/500 [==============>...............] - ETA: 1:02 - loss: 2.3507 - regression_loss: 1.8991 - classification_loss: 0.4516 253/500 [==============>...............] - ETA: 1:01 - loss: 2.3514 - regression_loss: 1.9000 - classification_loss: 0.4514 254/500 [==============>...............] 
- ETA: 1:01 - loss: 2.3522 - regression_loss: 1.9002 - classification_loss: 0.4519 255/500 [==============>...............] - ETA: 1:01 - loss: 2.3572 - regression_loss: 1.9039 - classification_loss: 0.4532 256/500 [==============>...............] - ETA: 1:01 - loss: 2.3571 - regression_loss: 1.9036 - classification_loss: 0.4535 257/500 [==============>...............] - ETA: 1:00 - loss: 2.3555 - regression_loss: 1.9024 - classification_loss: 0.4531 258/500 [==============>...............] - ETA: 1:00 - loss: 2.3557 - regression_loss: 1.9028 - classification_loss: 0.4529 259/500 [==============>...............] - ETA: 1:00 - loss: 2.3581 - regression_loss: 1.9048 - classification_loss: 0.4534 260/500 [==============>...............] - ETA: 1:00 - loss: 2.3567 - regression_loss: 1.9037 - classification_loss: 0.4531 261/500 [==============>...............] - ETA: 59s - loss: 2.3558 - regression_loss: 1.9029 - classification_loss: 0.4529  262/500 [==============>...............] - ETA: 59s - loss: 2.3571 - regression_loss: 1.9035 - classification_loss: 0.4536 263/500 [==============>...............] - ETA: 59s - loss: 2.3582 - regression_loss: 1.9050 - classification_loss: 0.4532 264/500 [==============>...............] - ETA: 59s - loss: 2.3572 - regression_loss: 1.9044 - classification_loss: 0.4528 265/500 [==============>...............] - ETA: 58s - loss: 2.3526 - regression_loss: 1.9007 - classification_loss: 0.4519 266/500 [==============>...............] - ETA: 58s - loss: 2.3522 - regression_loss: 1.9006 - classification_loss: 0.4516 267/500 [===============>..............] - ETA: 58s - loss: 2.3534 - regression_loss: 1.9018 - classification_loss: 0.4517 268/500 [===============>..............] - ETA: 58s - loss: 2.3536 - regression_loss: 1.9020 - classification_loss: 0.4516 269/500 [===============>..............] - ETA: 57s - loss: 2.3532 - regression_loss: 1.9019 - classification_loss: 0.4513 270/500 [===============>..............] 
- ETA: 57s - loss: 2.3528 - regression_loss: 1.9007 - classification_loss: 0.4520 271/500 [===============>..............] - ETA: 57s - loss: 2.3525 - regression_loss: 1.9007 - classification_loss: 0.4519 272/500 [===============>..............] - ETA: 57s - loss: 2.3501 - regression_loss: 1.8986 - classification_loss: 0.4514 273/500 [===============>..............] - ETA: 56s - loss: 2.3479 - regression_loss: 1.8969 - classification_loss: 0.4510 274/500 [===============>..............] - ETA: 56s - loss: 2.3476 - regression_loss: 1.8967 - classification_loss: 0.4509 275/500 [===============>..............] - ETA: 56s - loss: 2.3465 - regression_loss: 1.8960 - classification_loss: 0.4505 276/500 [===============>..............] - ETA: 56s - loss: 2.3452 - regression_loss: 1.8946 - classification_loss: 0.4505 277/500 [===============>..............] - ETA: 55s - loss: 2.3468 - regression_loss: 1.8958 - classification_loss: 0.4510 278/500 [===============>..............] - ETA: 55s - loss: 2.3459 - regression_loss: 1.8954 - classification_loss: 0.4505 279/500 [===============>..............] - ETA: 55s - loss: 2.3451 - regression_loss: 1.8946 - classification_loss: 0.4505 280/500 [===============>..............] - ETA: 55s - loss: 2.3445 - regression_loss: 1.8944 - classification_loss: 0.4501 281/500 [===============>..............] - ETA: 54s - loss: 2.3458 - regression_loss: 1.8955 - classification_loss: 0.4503 282/500 [===============>..............] - ETA: 54s - loss: 2.3462 - regression_loss: 1.8959 - classification_loss: 0.4504 283/500 [===============>..............] - ETA: 54s - loss: 2.3432 - regression_loss: 1.8933 - classification_loss: 0.4499 284/500 [================>.............] - ETA: 54s - loss: 2.3416 - regression_loss: 1.8924 - classification_loss: 0.4491 285/500 [================>.............] - ETA: 53s - loss: 2.3429 - regression_loss: 1.8935 - classification_loss: 0.4494 286/500 [================>.............] 
- ETA: 53s - loss: 2.3425 - regression_loss: 1.8936 - classification_loss: 0.4490 287/500 [================>.............] - ETA: 53s - loss: 2.3455 - regression_loss: 1.8959 - classification_loss: 0.4496 288/500 [================>.............] - ETA: 53s - loss: 2.3458 - regression_loss: 1.8962 - classification_loss: 0.4496 289/500 [================>.............] - ETA: 52s - loss: 2.3429 - regression_loss: 1.8937 - classification_loss: 0.4492 290/500 [================>.............] - ETA: 52s - loss: 2.3421 - regression_loss: 1.8930 - classification_loss: 0.4491 291/500 [================>.............] - ETA: 52s - loss: 2.3416 - regression_loss: 1.8926 - classification_loss: 0.4491 292/500 [================>.............] - ETA: 52s - loss: 2.3402 - regression_loss: 1.8914 - classification_loss: 0.4488 293/500 [================>.............] - ETA: 51s - loss: 2.3385 - regression_loss: 1.8901 - classification_loss: 0.4484 294/500 [================>.............] - ETA: 51s - loss: 2.3376 - regression_loss: 1.8887 - classification_loss: 0.4489 295/500 [================>.............] - ETA: 51s - loss: 2.3400 - regression_loss: 1.8910 - classification_loss: 0.4491 296/500 [================>.............] - ETA: 51s - loss: 2.3398 - regression_loss: 1.8908 - classification_loss: 0.4490 297/500 [================>.............] - ETA: 50s - loss: 2.3406 - regression_loss: 1.8913 - classification_loss: 0.4493 298/500 [================>.............] - ETA: 50s - loss: 2.3427 - regression_loss: 1.8930 - classification_loss: 0.4497 299/500 [================>.............] - ETA: 50s - loss: 2.3444 - regression_loss: 1.8947 - classification_loss: 0.4497 300/500 [=================>............] - ETA: 50s - loss: 2.3435 - regression_loss: 1.8941 - classification_loss: 0.4494 301/500 [=================>............] - ETA: 49s - loss: 2.3443 - regression_loss: 1.8945 - classification_loss: 0.4498 302/500 [=================>............] 
- ETA: 49s - loss: 2.3443 - regression_loss: 1.8945 - classification_loss: 0.4498 303/500 [=================>............] - ETA: 49s - loss: 2.3445 - regression_loss: 1.8947 - classification_loss: 0.4498 304/500 [=================>............] - ETA: 49s - loss: 2.3448 - regression_loss: 1.8951 - classification_loss: 0.4497 305/500 [=================>............] - ETA: 48s - loss: 2.3433 - regression_loss: 1.8938 - classification_loss: 0.4494 306/500 [=================>............] - ETA: 48s - loss: 2.3451 - regression_loss: 1.8955 - classification_loss: 0.4496 307/500 [=================>............] - ETA: 48s - loss: 2.3449 - regression_loss: 1.8955 - classification_loss: 0.4494 308/500 [=================>............] - ETA: 48s - loss: 2.3443 - regression_loss: 1.8951 - classification_loss: 0.4492 309/500 [=================>............] - ETA: 47s - loss: 2.3429 - regression_loss: 1.8938 - classification_loss: 0.4490 310/500 [=================>............] - ETA: 47s - loss: 2.3433 - regression_loss: 1.8940 - classification_loss: 0.4494 311/500 [=================>............] - ETA: 47s - loss: 2.3440 - regression_loss: 1.8941 - classification_loss: 0.4499 312/500 [=================>............] - ETA: 47s - loss: 2.3438 - regression_loss: 1.8940 - classification_loss: 0.4498 313/500 [=================>............] - ETA: 46s - loss: 2.3426 - regression_loss: 1.8932 - classification_loss: 0.4495 314/500 [=================>............] - ETA: 46s - loss: 2.3396 - regression_loss: 1.8908 - classification_loss: 0.4488 315/500 [=================>............] - ETA: 46s - loss: 2.3402 - regression_loss: 1.8916 - classification_loss: 0.4486 316/500 [=================>............] - ETA: 46s - loss: 2.3409 - regression_loss: 1.8926 - classification_loss: 0.4483 317/500 [==================>...........] - ETA: 45s - loss: 2.3387 - regression_loss: 1.8910 - classification_loss: 0.4478 318/500 [==================>...........] 
- ETA: 45s - loss: 2.3388 - regression_loss: 1.8911 - classification_loss: 0.4477 319/500 [==================>...........] - ETA: 45s - loss: 2.3375 - regression_loss: 1.8900 - classification_loss: 0.4475 320/500 [==================>...........] - ETA: 45s - loss: 2.3384 - regression_loss: 1.8907 - classification_loss: 0.4477 321/500 [==================>...........] - ETA: 44s - loss: 2.3355 - regression_loss: 1.8883 - classification_loss: 0.4472 322/500 [==================>...........] - ETA: 44s - loss: 2.3368 - regression_loss: 1.8894 - classification_loss: 0.4473 323/500 [==================>...........] - ETA: 44s - loss: 2.3366 - regression_loss: 1.8893 - classification_loss: 0.4474 324/500 [==================>...........] - ETA: 44s - loss: 2.3368 - regression_loss: 1.8895 - classification_loss: 0.4473 325/500 [==================>...........] - ETA: 43s - loss: 2.3374 - regression_loss: 1.8898 - classification_loss: 0.4475 326/500 [==================>...........] - ETA: 43s - loss: 2.3378 - regression_loss: 1.8878 - classification_loss: 0.4499 327/500 [==================>...........] - ETA: 43s - loss: 2.3382 - regression_loss: 1.8884 - classification_loss: 0.4498 328/500 [==================>...........] - ETA: 43s - loss: 2.3392 - regression_loss: 1.8893 - classification_loss: 0.4499 329/500 [==================>...........] - ETA: 42s - loss: 2.3383 - regression_loss: 1.8887 - classification_loss: 0.4496 330/500 [==================>...........] - ETA: 42s - loss: 2.3388 - regression_loss: 1.8889 - classification_loss: 0.4499 331/500 [==================>...........] - ETA: 42s - loss: 2.3385 - regression_loss: 1.8887 - classification_loss: 0.4498 332/500 [==================>...........] - ETA: 41s - loss: 2.3379 - regression_loss: 1.8884 - classification_loss: 0.4495 333/500 [==================>...........] - ETA: 41s - loss: 2.3402 - regression_loss: 1.8905 - classification_loss: 0.4498 334/500 [===================>..........] 
- ETA: 41s - loss: 2.3406 - regression_loss: 1.8907 - classification_loss: 0.4499 335/500 [===================>..........] - ETA: 41s - loss: 2.3401 - regression_loss: 1.8904 - classification_loss: 0.4497 336/500 [===================>..........] - ETA: 40s - loss: 2.3413 - regression_loss: 1.8916 - classification_loss: 0.4497 337/500 [===================>..........] - ETA: 40s - loss: 2.3419 - regression_loss: 1.8921 - classification_loss: 0.4498 338/500 [===================>..........] - ETA: 40s - loss: 2.3417 - regression_loss: 1.8919 - classification_loss: 0.4497 339/500 [===================>..........] - ETA: 40s - loss: 2.3411 - regression_loss: 1.8899 - classification_loss: 0.4512 340/500 [===================>..........] - ETA: 39s - loss: 2.3422 - regression_loss: 1.8910 - classification_loss: 0.4512 341/500 [===================>..........] - ETA: 39s - loss: 2.3401 - regression_loss: 1.8893 - classification_loss: 0.4508 342/500 [===================>..........] - ETA: 39s - loss: 2.3392 - regression_loss: 1.8887 - classification_loss: 0.4505 343/500 [===================>..........] - ETA: 39s - loss: 2.3384 - regression_loss: 1.8880 - classification_loss: 0.4504 344/500 [===================>..........] - ETA: 38s - loss: 2.3378 - regression_loss: 1.8874 - classification_loss: 0.4503 345/500 [===================>..........] - ETA: 38s - loss: 2.3395 - regression_loss: 1.8891 - classification_loss: 0.4505 346/500 [===================>..........] - ETA: 38s - loss: 2.3404 - regression_loss: 1.8898 - classification_loss: 0.4506 347/500 [===================>..........] - ETA: 38s - loss: 2.3392 - regression_loss: 1.8889 - classification_loss: 0.4503 348/500 [===================>..........] - ETA: 37s - loss: 2.3396 - regression_loss: 1.8892 - classification_loss: 0.4505 349/500 [===================>..........] - ETA: 37s - loss: 2.3388 - regression_loss: 1.8888 - classification_loss: 0.4501 350/500 [====================>.........] 
- ETA: 37s - loss: 2.3399 - regression_loss: 1.8896 - classification_loss: 0.4503 351/500 [====================>.........] - ETA: 37s - loss: 2.3405 - regression_loss: 1.8900 - classification_loss: 0.4505 352/500 [====================>.........] - ETA: 36s - loss: 2.3391 - regression_loss: 1.8891 - classification_loss: 0.4500 353/500 [====================>.........] - ETA: 36s - loss: 2.3379 - regression_loss: 1.8881 - classification_loss: 0.4498 354/500 [====================>.........] - ETA: 36s - loss: 2.3365 - regression_loss: 1.8869 - classification_loss: 0.4496 355/500 [====================>.........] - ETA: 36s - loss: 2.3360 - regression_loss: 1.8865 - classification_loss: 0.4496 356/500 [====================>.........] - ETA: 35s - loss: 2.3359 - regression_loss: 1.8863 - classification_loss: 0.4496 357/500 [====================>.........] - ETA: 35s - loss: 2.3367 - regression_loss: 1.8870 - classification_loss: 0.4498 358/500 [====================>.........] - ETA: 35s - loss: 2.3371 - regression_loss: 1.8875 - classification_loss: 0.4497 359/500 [====================>.........] - ETA: 35s - loss: 2.3375 - regression_loss: 1.8876 - classification_loss: 0.4499 360/500 [====================>.........] - ETA: 34s - loss: 2.3368 - regression_loss: 1.8870 - classification_loss: 0.4498 361/500 [====================>.........] - ETA: 34s - loss: 2.3365 - regression_loss: 1.8867 - classification_loss: 0.4498 362/500 [====================>.........] - ETA: 34s - loss: 2.3355 - regression_loss: 1.8860 - classification_loss: 0.4495 363/500 [====================>.........] - ETA: 34s - loss: 2.3350 - regression_loss: 1.8857 - classification_loss: 0.4493 364/500 [====================>.........] - ETA: 33s - loss: 2.3346 - regression_loss: 1.8855 - classification_loss: 0.4491 365/500 [====================>.........] - ETA: 33s - loss: 2.3336 - regression_loss: 1.8848 - classification_loss: 0.4488 366/500 [====================>.........] 
- ETA: 33s - loss: 2.3343 - regression_loss: 1.8857 - classification_loss: 0.4486 367/500 [=====================>........] - ETA: 33s - loss: 2.3355 - regression_loss: 1.8868 - classification_loss: 0.4487 368/500 [=====================>........] - ETA: 32s - loss: 2.3342 - regression_loss: 1.8856 - classification_loss: 0.4485 369/500 [=====================>........] - ETA: 32s - loss: 2.3344 - regression_loss: 1.8857 - classification_loss: 0.4487 370/500 [=====================>........] - ETA: 32s - loss: 2.3341 - regression_loss: 1.8856 - classification_loss: 0.4485 371/500 [=====================>........] - ETA: 32s - loss: 2.3320 - regression_loss: 1.8840 - classification_loss: 0.4480 372/500 [=====================>........] - ETA: 32s - loss: 2.3328 - regression_loss: 1.8849 - classification_loss: 0.4479 373/500 [=====================>........] - ETA: 31s - loss: 2.3331 - regression_loss: 1.8850 - classification_loss: 0.4481 374/500 [=====================>........] - ETA: 31s - loss: 2.3341 - regression_loss: 1.8859 - classification_loss: 0.4483 375/500 [=====================>........] - ETA: 31s - loss: 2.3328 - regression_loss: 1.8850 - classification_loss: 0.4479 376/500 [=====================>........] - ETA: 31s - loss: 2.3309 - regression_loss: 1.8836 - classification_loss: 0.4473 377/500 [=====================>........] - ETA: 30s - loss: 2.3316 - regression_loss: 1.8843 - classification_loss: 0.4473 378/500 [=====================>........] - ETA: 30s - loss: 2.3323 - regression_loss: 1.8849 - classification_loss: 0.4474 379/500 [=====================>........] - ETA: 30s - loss: 2.3316 - regression_loss: 1.8833 - classification_loss: 0.4484 380/500 [=====================>........] - ETA: 30s - loss: 2.3332 - regression_loss: 1.8845 - classification_loss: 0.4486 381/500 [=====================>........] - ETA: 29s - loss: 2.3335 - regression_loss: 1.8849 - classification_loss: 0.4486 382/500 [=====================>........] 
- ETA: 29s - loss: 2.3333 - regression_loss: 1.8850 - classification_loss: 0.4483 383/500 [=====================>........] - ETA: 29s - loss: 2.3323 - regression_loss: 1.8836 - classification_loss: 0.4486 384/500 [======================>.......] - ETA: 29s - loss: 2.3314 - regression_loss: 1.8832 - classification_loss: 0.4482 385/500 [======================>.......] - ETA: 28s - loss: 2.3324 - regression_loss: 1.8840 - classification_loss: 0.4484 386/500 [======================>.......] - ETA: 28s - loss: 2.3337 - regression_loss: 1.8853 - classification_loss: 0.4484 387/500 [======================>.......] - ETA: 28s - loss: 2.3355 - regression_loss: 1.8862 - classification_loss: 0.4493 388/500 [======================>.......] - ETA: 28s - loss: 2.3353 - regression_loss: 1.8863 - classification_loss: 0.4490 389/500 [======================>.......] - ETA: 27s - loss: 2.3362 - regression_loss: 1.8871 - classification_loss: 0.4491 390/500 [======================>.......] - ETA: 27s - loss: 2.3362 - regression_loss: 1.8871 - classification_loss: 0.4491 391/500 [======================>.......] - ETA: 27s - loss: 2.3339 - regression_loss: 1.8853 - classification_loss: 0.4486 392/500 [======================>.......] - ETA: 27s - loss: 2.3346 - regression_loss: 1.8858 - classification_loss: 0.4488 393/500 [======================>.......] - ETA: 26s - loss: 2.3350 - regression_loss: 1.8863 - classification_loss: 0.4487 394/500 [======================>.......] - ETA: 26s - loss: 2.3350 - regression_loss: 1.8864 - classification_loss: 0.4486 395/500 [======================>.......] - ETA: 26s - loss: 2.3337 - regression_loss: 1.8853 - classification_loss: 0.4484 396/500 [======================>.......] - ETA: 26s - loss: 2.3364 - regression_loss: 1.8880 - classification_loss: 0.4484 397/500 [======================>.......] - ETA: 25s - loss: 2.3365 - regression_loss: 1.8882 - classification_loss: 0.4484 398/500 [======================>.......] 
[per-step progress output for epoch 19 elided]
500/500 [==============================] - 125s 250ms/step - loss: 2.3314 - regression_loss: 1.8829 - classification_loss: 0.4485
1172 instances of class plum with average precision: 0.4008
mAP: 0.4008
Epoch 00019: saving model to ./training/snapshots/resnet50_pascal_19.h5
Epoch 20/150
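The end-of-epoch summary above reports a total loss that is the sum of the regression and classification terms, alongside a per-class average precision. If you want loss curves from a captured log like this one, the summary lines can be parsed with a short script; the sketch below is illustrative only (not part of keras-retinanet) and assumes the exact progress-bar wording shown in this log.

```python
import re

# Matches the loss triple printed by this keras-retinanet training run,
# e.g. "loss: 2.3314 - regression_loss: 1.8829 - classification_loss: 0.4485".
LOSS_RE = re.compile(
    r"loss: (?P<loss>\d+\.\d+) - "
    r"regression_loss: (?P<reg>\d+\.\d+) - "
    r"classification_loss: (?P<cls>\d+\.\d+)"
)

def parse_epoch_summary(line):
    """Return (loss, regression_loss, classification_loss) or None."""
    m = LOSS_RE.search(line)
    if m is None:
        return None
    return tuple(float(m.group(k)) for k in ("loss", "reg", "cls"))

line = ("500/500 [==============================] - 125s 250ms/step - "
        "loss: 2.3314 - regression_loss: 1.8829 - classification_loss: 0.4485")
loss, reg, cls = parse_epoch_summary(line)
# The reported total agrees with the sum of the two terms up to rounding.
assert abs((reg + cls) - loss) < 1e-3
```

Running this over each "500/500" line in the log yields one (loss, regression, classification) triple per epoch, which is enough to plot convergence without rerunning training.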
[per-step progress output for epoch 20 truncated mid-epoch; no end-of-epoch summary was captured]
- ETA: 1:06 - loss: 2.3585 - regression_loss: 1.8775 - classification_loss: 0.4810 234/500 [=============>................] - ETA: 1:06 - loss: 2.3528 - regression_loss: 1.8732 - classification_loss: 0.4797 235/500 [=============>................] - ETA: 1:06 - loss: 2.3537 - regression_loss: 1.8742 - classification_loss: 0.4795 236/500 [=============>................] - ETA: 1:06 - loss: 2.3533 - regression_loss: 1.8741 - classification_loss: 0.4792 237/500 [=============>................] - ETA: 1:05 - loss: 2.3535 - regression_loss: 1.8747 - classification_loss: 0.4788 238/500 [=============>................] - ETA: 1:05 - loss: 2.3516 - regression_loss: 1.8729 - classification_loss: 0.4787 239/500 [=============>................] - ETA: 1:05 - loss: 2.3479 - regression_loss: 1.8701 - classification_loss: 0.4778 240/500 [=============>................] - ETA: 1:05 - loss: 2.3477 - regression_loss: 1.8702 - classification_loss: 0.4775 241/500 [=============>................] - ETA: 1:04 - loss: 2.3478 - regression_loss: 1.8707 - classification_loss: 0.4771 242/500 [=============>................] - ETA: 1:04 - loss: 2.3485 - regression_loss: 1.8711 - classification_loss: 0.4773 243/500 [=============>................] - ETA: 1:04 - loss: 2.3434 - regression_loss: 1.8671 - classification_loss: 0.4762 244/500 [=============>................] - ETA: 1:04 - loss: 2.3446 - regression_loss: 1.8684 - classification_loss: 0.4762 245/500 [=============>................] - ETA: 1:03 - loss: 2.3446 - regression_loss: 1.8685 - classification_loss: 0.4761 246/500 [=============>................] - ETA: 1:03 - loss: 2.3443 - regression_loss: 1.8685 - classification_loss: 0.4758 247/500 [=============>................] - ETA: 1:03 - loss: 2.3441 - regression_loss: 1.8687 - classification_loss: 0.4755 248/500 [=============>................] - ETA: 1:03 - loss: 2.3461 - regression_loss: 1.8701 - classification_loss: 0.4760 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.3453 - regression_loss: 1.8698 - classification_loss: 0.4755 250/500 [==============>...............] - ETA: 1:02 - loss: 2.3457 - regression_loss: 1.8706 - classification_loss: 0.4751 251/500 [==============>...............] - ETA: 1:02 - loss: 2.3449 - regression_loss: 1.8701 - classification_loss: 0.4747 252/500 [==============>...............] - ETA: 1:02 - loss: 2.3443 - regression_loss: 1.8696 - classification_loss: 0.4746 253/500 [==============>...............] - ETA: 1:01 - loss: 2.3426 - regression_loss: 1.8677 - classification_loss: 0.4749 254/500 [==============>...............] - ETA: 1:01 - loss: 2.3433 - regression_loss: 1.8683 - classification_loss: 0.4751 255/500 [==============>...............] - ETA: 1:01 - loss: 2.3431 - regression_loss: 1.8682 - classification_loss: 0.4748 256/500 [==============>...............] - ETA: 1:01 - loss: 2.3416 - regression_loss: 1.8672 - classification_loss: 0.4744 257/500 [==============>...............] - ETA: 1:00 - loss: 2.3429 - regression_loss: 1.8680 - classification_loss: 0.4749 258/500 [==============>...............] - ETA: 1:00 - loss: 2.3440 - regression_loss: 1.8691 - classification_loss: 0.4749 259/500 [==============>...............] - ETA: 1:00 - loss: 2.3423 - regression_loss: 1.8679 - classification_loss: 0.4744 260/500 [==============>...............] - ETA: 1:00 - loss: 2.3406 - regression_loss: 1.8666 - classification_loss: 0.4740 261/500 [==============>...............] - ETA: 59s - loss: 2.3372 - regression_loss: 1.8640 - classification_loss: 0.4732  262/500 [==============>...............] - ETA: 59s - loss: 2.3343 - regression_loss: 1.8617 - classification_loss: 0.4727 263/500 [==============>...............] - ETA: 59s - loss: 2.3358 - regression_loss: 1.8628 - classification_loss: 0.4730 264/500 [==============>...............] - ETA: 59s - loss: 2.3356 - regression_loss: 1.8628 - classification_loss: 0.4728 265/500 [==============>...............] 
- ETA: 58s - loss: 2.3371 - regression_loss: 1.8642 - classification_loss: 0.4729 266/500 [==============>...............] - ETA: 58s - loss: 2.3399 - regression_loss: 1.8659 - classification_loss: 0.4741 267/500 [===============>..............] - ETA: 58s - loss: 2.3388 - regression_loss: 1.8652 - classification_loss: 0.4737 268/500 [===============>..............] - ETA: 58s - loss: 2.3395 - regression_loss: 1.8660 - classification_loss: 0.4735 269/500 [===============>..............] - ETA: 57s - loss: 2.3392 - regression_loss: 1.8660 - classification_loss: 0.4732 270/500 [===============>..............] - ETA: 57s - loss: 2.3371 - regression_loss: 1.8646 - classification_loss: 0.4724 271/500 [===============>..............] - ETA: 57s - loss: 2.3368 - regression_loss: 1.8644 - classification_loss: 0.4724 272/500 [===============>..............] - ETA: 57s - loss: 2.3367 - regression_loss: 1.8647 - classification_loss: 0.4720 273/500 [===============>..............] - ETA: 56s - loss: 2.3365 - regression_loss: 1.8647 - classification_loss: 0.4718 274/500 [===============>..............] - ETA: 56s - loss: 2.3361 - regression_loss: 1.8646 - classification_loss: 0.4715 275/500 [===============>..............] - ETA: 56s - loss: 2.3326 - regression_loss: 1.8617 - classification_loss: 0.4709 276/500 [===============>..............] - ETA: 56s - loss: 2.3329 - regression_loss: 1.8621 - classification_loss: 0.4708 277/500 [===============>..............] - ETA: 55s - loss: 2.3336 - regression_loss: 1.8629 - classification_loss: 0.4707 278/500 [===============>..............] - ETA: 55s - loss: 2.3355 - regression_loss: 1.8648 - classification_loss: 0.4707 279/500 [===============>..............] - ETA: 55s - loss: 2.3353 - regression_loss: 1.8647 - classification_loss: 0.4706 280/500 [===============>..............] - ETA: 55s - loss: 2.3350 - regression_loss: 1.8648 - classification_loss: 0.4702 281/500 [===============>..............] 
- ETA: 54s - loss: 2.3339 - regression_loss: 1.8638 - classification_loss: 0.4701 282/500 [===============>..............] - ETA: 54s - loss: 2.3318 - regression_loss: 1.8626 - classification_loss: 0.4692 283/500 [===============>..............] - ETA: 54s - loss: 2.3334 - regression_loss: 1.8636 - classification_loss: 0.4698 284/500 [================>.............] - ETA: 54s - loss: 2.3357 - regression_loss: 1.8655 - classification_loss: 0.4702 285/500 [================>.............] - ETA: 53s - loss: 2.3347 - regression_loss: 1.8650 - classification_loss: 0.4697 286/500 [================>.............] - ETA: 53s - loss: 2.3326 - regression_loss: 1.8634 - classification_loss: 0.4692 287/500 [================>.............] - ETA: 53s - loss: 2.3309 - regression_loss: 1.8621 - classification_loss: 0.4687 288/500 [================>.............] - ETA: 53s - loss: 2.3316 - regression_loss: 1.8628 - classification_loss: 0.4688 289/500 [================>.............] - ETA: 52s - loss: 2.3309 - regression_loss: 1.8624 - classification_loss: 0.4685 290/500 [================>.............] - ETA: 52s - loss: 2.3311 - regression_loss: 1.8625 - classification_loss: 0.4686 291/500 [================>.............] - ETA: 52s - loss: 2.3343 - regression_loss: 1.8637 - classification_loss: 0.4706 292/500 [================>.............] - ETA: 52s - loss: 2.3338 - regression_loss: 1.8635 - classification_loss: 0.4703 293/500 [================>.............] - ETA: 51s - loss: 2.3318 - regression_loss: 1.8615 - classification_loss: 0.4703 294/500 [================>.............] - ETA: 51s - loss: 2.3299 - regression_loss: 1.8603 - classification_loss: 0.4696 295/500 [================>.............] - ETA: 51s - loss: 2.3307 - regression_loss: 1.8611 - classification_loss: 0.4696 296/500 [================>.............] - ETA: 51s - loss: 2.3302 - regression_loss: 1.8609 - classification_loss: 0.4693 297/500 [================>.............] 
- ETA: 50s - loss: 2.3295 - regression_loss: 1.8607 - classification_loss: 0.4687 298/500 [================>.............] - ETA: 50s - loss: 2.3284 - regression_loss: 1.8601 - classification_loss: 0.4683 299/500 [================>.............] - ETA: 50s - loss: 2.3243 - regression_loss: 1.8570 - classification_loss: 0.4673 300/500 [=================>............] - ETA: 50s - loss: 2.3248 - regression_loss: 1.8578 - classification_loss: 0.4671 301/500 [=================>............] - ETA: 49s - loss: 2.3271 - regression_loss: 1.8594 - classification_loss: 0.4678 302/500 [=================>............] - ETA: 49s - loss: 2.3281 - regression_loss: 1.8605 - classification_loss: 0.4676 303/500 [=================>............] - ETA: 49s - loss: 2.3295 - regression_loss: 1.8618 - classification_loss: 0.4678 304/500 [=================>............] - ETA: 49s - loss: 2.3294 - regression_loss: 1.8619 - classification_loss: 0.4675 305/500 [=================>............] - ETA: 48s - loss: 2.3290 - regression_loss: 1.8617 - classification_loss: 0.4673 306/500 [=================>............] - ETA: 48s - loss: 2.3294 - regression_loss: 1.8625 - classification_loss: 0.4669 307/500 [=================>............] - ETA: 48s - loss: 2.3291 - regression_loss: 1.8628 - classification_loss: 0.4664 308/500 [=================>............] - ETA: 48s - loss: 2.3281 - regression_loss: 1.8619 - classification_loss: 0.4662 309/500 [=================>............] - ETA: 47s - loss: 2.3270 - regression_loss: 1.8611 - classification_loss: 0.4658 310/500 [=================>............] - ETA: 47s - loss: 2.3262 - regression_loss: 1.8606 - classification_loss: 0.4656 311/500 [=================>............] - ETA: 47s - loss: 2.3267 - regression_loss: 1.8611 - classification_loss: 0.4656 312/500 [=================>............] - ETA: 47s - loss: 2.3264 - regression_loss: 1.8607 - classification_loss: 0.4657 313/500 [=================>............] 
- ETA: 46s - loss: 2.3268 - regression_loss: 1.8613 - classification_loss: 0.4655 314/500 [=================>............] - ETA: 46s - loss: 2.3314 - regression_loss: 1.8629 - classification_loss: 0.4685 315/500 [=================>............] - ETA: 46s - loss: 2.3312 - regression_loss: 1.8630 - classification_loss: 0.4683 316/500 [=================>............] - ETA: 46s - loss: 2.3308 - regression_loss: 1.8628 - classification_loss: 0.4680 317/500 [==================>...........] - ETA: 45s - loss: 2.3315 - regression_loss: 1.8639 - classification_loss: 0.4676 318/500 [==================>...........] - ETA: 45s - loss: 2.3334 - regression_loss: 1.8650 - classification_loss: 0.4684 319/500 [==================>...........] - ETA: 45s - loss: 2.3325 - regression_loss: 1.8643 - classification_loss: 0.4681 320/500 [==================>...........] - ETA: 45s - loss: 2.3318 - regression_loss: 1.8639 - classification_loss: 0.4679 321/500 [==================>...........] - ETA: 44s - loss: 2.3317 - regression_loss: 1.8643 - classification_loss: 0.4674 322/500 [==================>...........] - ETA: 44s - loss: 2.3311 - regression_loss: 1.8639 - classification_loss: 0.4671 323/500 [==================>...........] - ETA: 44s - loss: 2.3317 - regression_loss: 1.8645 - classification_loss: 0.4673 324/500 [==================>...........] - ETA: 44s - loss: 2.3331 - regression_loss: 1.8653 - classification_loss: 0.4678 325/500 [==================>...........] - ETA: 43s - loss: 2.3320 - regression_loss: 1.8645 - classification_loss: 0.4675 326/500 [==================>...........] - ETA: 43s - loss: 2.3317 - regression_loss: 1.8643 - classification_loss: 0.4674 327/500 [==================>...........] - ETA: 43s - loss: 2.3320 - regression_loss: 1.8646 - classification_loss: 0.4673 328/500 [==================>...........] - ETA: 43s - loss: 2.3311 - regression_loss: 1.8641 - classification_loss: 0.4670 329/500 [==================>...........] 
- ETA: 42s - loss: 2.3324 - regression_loss: 1.8653 - classification_loss: 0.4671 330/500 [==================>...........] - ETA: 42s - loss: 2.3321 - regression_loss: 1.8653 - classification_loss: 0.4668 331/500 [==================>...........] - ETA: 42s - loss: 2.3334 - regression_loss: 1.8662 - classification_loss: 0.4672 332/500 [==================>...........] - ETA: 42s - loss: 2.3375 - regression_loss: 1.8699 - classification_loss: 0.4676 333/500 [==================>...........] - ETA: 41s - loss: 2.3380 - regression_loss: 1.8707 - classification_loss: 0.4674 334/500 [===================>..........] - ETA: 41s - loss: 2.3380 - regression_loss: 1.8709 - classification_loss: 0.4671 335/500 [===================>..........] - ETA: 41s - loss: 2.3388 - regression_loss: 1.8716 - classification_loss: 0.4673 336/500 [===================>..........] - ETA: 41s - loss: 2.3395 - regression_loss: 1.8722 - classification_loss: 0.4673 337/500 [===================>..........] - ETA: 40s - loss: 2.3393 - regression_loss: 1.8723 - classification_loss: 0.4669 338/500 [===================>..........] - ETA: 40s - loss: 2.3406 - regression_loss: 1.8729 - classification_loss: 0.4677 339/500 [===================>..........] - ETA: 40s - loss: 2.3408 - regression_loss: 1.8735 - classification_loss: 0.4673 340/500 [===================>..........] - ETA: 40s - loss: 2.3407 - regression_loss: 1.8734 - classification_loss: 0.4673 341/500 [===================>..........] - ETA: 39s - loss: 2.3425 - regression_loss: 1.8751 - classification_loss: 0.4675 342/500 [===================>..........] - ETA: 39s - loss: 2.3428 - regression_loss: 1.8753 - classification_loss: 0.4675 343/500 [===================>..........] - ETA: 39s - loss: 2.3424 - regression_loss: 1.8751 - classification_loss: 0.4673 344/500 [===================>..........] - ETA: 39s - loss: 2.3399 - regression_loss: 1.8731 - classification_loss: 0.4669 345/500 [===================>..........] 
- ETA: 38s - loss: 2.3400 - regression_loss: 1.8733 - classification_loss: 0.4667 346/500 [===================>..........] - ETA: 38s - loss: 2.3399 - regression_loss: 1.8732 - classification_loss: 0.4667 347/500 [===================>..........] - ETA: 38s - loss: 2.3399 - regression_loss: 1.8734 - classification_loss: 0.4665 348/500 [===================>..........] - ETA: 38s - loss: 2.3405 - regression_loss: 1.8741 - classification_loss: 0.4664 349/500 [===================>..........] - ETA: 37s - loss: 2.3400 - regression_loss: 1.8739 - classification_loss: 0.4661 350/500 [====================>.........] - ETA: 37s - loss: 2.3400 - regression_loss: 1.8742 - classification_loss: 0.4659 351/500 [====================>.........] - ETA: 37s - loss: 2.3383 - regression_loss: 1.8716 - classification_loss: 0.4667 352/500 [====================>.........] - ETA: 37s - loss: 2.3382 - regression_loss: 1.8714 - classification_loss: 0.4667 353/500 [====================>.........] - ETA: 36s - loss: 2.3398 - regression_loss: 1.8723 - classification_loss: 0.4675 354/500 [====================>.........] - ETA: 36s - loss: 2.3392 - regression_loss: 1.8720 - classification_loss: 0.4672 355/500 [====================>.........] - ETA: 36s - loss: 2.3397 - regression_loss: 1.8724 - classification_loss: 0.4673 356/500 [====================>.........] - ETA: 36s - loss: 2.3397 - regression_loss: 1.8726 - classification_loss: 0.4671 357/500 [====================>.........] - ETA: 35s - loss: 2.3407 - regression_loss: 1.8733 - classification_loss: 0.4675 358/500 [====================>.........] - ETA: 35s - loss: 2.3408 - regression_loss: 1.8733 - classification_loss: 0.4675 359/500 [====================>.........] - ETA: 35s - loss: 2.3408 - regression_loss: 1.8732 - classification_loss: 0.4676 360/500 [====================>.........] - ETA: 35s - loss: 2.3411 - regression_loss: 1.8737 - classification_loss: 0.4674 361/500 [====================>.........] 
- ETA: 34s - loss: 2.3417 - regression_loss: 1.8746 - classification_loss: 0.4671 362/500 [====================>.........] - ETA: 34s - loss: 2.3413 - regression_loss: 1.8744 - classification_loss: 0.4669 363/500 [====================>.........] - ETA: 34s - loss: 2.3401 - regression_loss: 1.8735 - classification_loss: 0.4666 364/500 [====================>.........] - ETA: 34s - loss: 2.3391 - regression_loss: 1.8726 - classification_loss: 0.4665 365/500 [====================>.........] - ETA: 33s - loss: 2.3383 - regression_loss: 1.8720 - classification_loss: 0.4663 366/500 [====================>.........] - ETA: 33s - loss: 2.3386 - regression_loss: 1.8722 - classification_loss: 0.4663 367/500 [=====================>........] - ETA: 33s - loss: 2.3373 - regression_loss: 1.8714 - classification_loss: 0.4659 368/500 [=====================>........] - ETA: 33s - loss: 2.3351 - regression_loss: 1.8696 - classification_loss: 0.4655 369/500 [=====================>........] - ETA: 32s - loss: 2.3345 - regression_loss: 1.8694 - classification_loss: 0.4650 370/500 [=====================>........] - ETA: 32s - loss: 2.3352 - regression_loss: 1.8703 - classification_loss: 0.4649 371/500 [=====================>........] - ETA: 32s - loss: 2.3347 - regression_loss: 1.8702 - classification_loss: 0.4645 372/500 [=====================>........] - ETA: 32s - loss: 2.3343 - regression_loss: 1.8700 - classification_loss: 0.4643 373/500 [=====================>........] - ETA: 31s - loss: 2.3351 - regression_loss: 1.8706 - classification_loss: 0.4645 374/500 [=====================>........] - ETA: 31s - loss: 2.3338 - regression_loss: 1.8696 - classification_loss: 0.4643 375/500 [=====================>........] - ETA: 31s - loss: 2.3367 - regression_loss: 1.8723 - classification_loss: 0.4644 376/500 [=====================>........] - ETA: 31s - loss: 2.3363 - regression_loss: 1.8718 - classification_loss: 0.4645 377/500 [=====================>........] 
- ETA: 30s - loss: 2.3355 - regression_loss: 1.8713 - classification_loss: 0.4642 378/500 [=====================>........] - ETA: 30s - loss: 2.3348 - regression_loss: 1.8709 - classification_loss: 0.4639 379/500 [=====================>........] - ETA: 30s - loss: 2.3346 - regression_loss: 1.8710 - classification_loss: 0.4636 380/500 [=====================>........] - ETA: 30s - loss: 2.3357 - regression_loss: 1.8715 - classification_loss: 0.4642 381/500 [=====================>........] - ETA: 29s - loss: 2.3352 - regression_loss: 1.8712 - classification_loss: 0.4639 382/500 [=====================>........] - ETA: 29s - loss: 2.3353 - regression_loss: 1.8713 - classification_loss: 0.4640 383/500 [=====================>........] - ETA: 29s - loss: 2.3366 - regression_loss: 1.8728 - classification_loss: 0.4638 384/500 [======================>.......] - ETA: 29s - loss: 2.3365 - regression_loss: 1.8729 - classification_loss: 0.4636 385/500 [======================>.......] - ETA: 28s - loss: 2.3364 - regression_loss: 1.8729 - classification_loss: 0.4635 386/500 [======================>.......] - ETA: 28s - loss: 2.3363 - regression_loss: 1.8730 - classification_loss: 0.4634 387/500 [======================>.......] - ETA: 28s - loss: 2.3366 - regression_loss: 1.8733 - classification_loss: 0.4633 388/500 [======================>.......] - ETA: 28s - loss: 2.3369 - regression_loss: 1.8738 - classification_loss: 0.4632 389/500 [======================>.......] - ETA: 27s - loss: 2.3360 - regression_loss: 1.8733 - classification_loss: 0.4627 390/500 [======================>.......] - ETA: 27s - loss: 2.3360 - regression_loss: 1.8729 - classification_loss: 0.4631 391/500 [======================>.......] - ETA: 27s - loss: 2.3339 - regression_loss: 1.8712 - classification_loss: 0.4626 392/500 [======================>.......] - ETA: 27s - loss: 2.3351 - regression_loss: 1.8725 - classification_loss: 0.4626 393/500 [======================>.......] 
- ETA: 26s - loss: 2.3329 - regression_loss: 1.8708 - classification_loss: 0.4621 394/500 [======================>.......] - ETA: 26s - loss: 2.3329 - regression_loss: 1.8708 - classification_loss: 0.4621 395/500 [======================>.......] - ETA: 26s - loss: 2.3323 - regression_loss: 1.8705 - classification_loss: 0.4618 396/500 [======================>.......] - ETA: 26s - loss: 2.3311 - regression_loss: 1.8696 - classification_loss: 0.4615 397/500 [======================>.......] - ETA: 25s - loss: 2.3300 - regression_loss: 1.8688 - classification_loss: 0.4612 398/500 [======================>.......] - ETA: 25s - loss: 2.3297 - regression_loss: 1.8687 - classification_loss: 0.4610 399/500 [======================>.......] - ETA: 25s - loss: 2.3299 - regression_loss: 1.8689 - classification_loss: 0.4611 400/500 [=======================>......] - ETA: 25s - loss: 2.3290 - regression_loss: 1.8683 - classification_loss: 0.4607 401/500 [=======================>......] - ETA: 24s - loss: 2.3294 - regression_loss: 1.8679 - classification_loss: 0.4615 402/500 [=======================>......] - ETA: 24s - loss: 2.3298 - regression_loss: 1.8685 - classification_loss: 0.4613 403/500 [=======================>......] - ETA: 24s - loss: 2.3298 - regression_loss: 1.8685 - classification_loss: 0.4613 404/500 [=======================>......] - ETA: 24s - loss: 2.3298 - regression_loss: 1.8686 - classification_loss: 0.4613 405/500 [=======================>......] - ETA: 23s - loss: 2.3271 - regression_loss: 1.8666 - classification_loss: 0.4605 406/500 [=======================>......] - ETA: 23s - loss: 2.3287 - regression_loss: 1.8672 - classification_loss: 0.4615 407/500 [=======================>......] - ETA: 23s - loss: 2.3286 - regression_loss: 1.8673 - classification_loss: 0.4613 408/500 [=======================>......] - ETA: 23s - loss: 2.3287 - regression_loss: 1.8677 - classification_loss: 0.4611 409/500 [=======================>......] 
- ETA: 22s - loss: 2.3286 - regression_loss: 1.8676 - classification_loss: 0.4609 410/500 [=======================>......] - ETA: 22s - loss: 2.3285 - regression_loss: 1.8676 - classification_loss: 0.4609 411/500 [=======================>......] - ETA: 22s - loss: 2.3303 - regression_loss: 1.8694 - classification_loss: 0.4609 412/500 [=======================>......] - ETA: 22s - loss: 2.3283 - regression_loss: 1.8678 - classification_loss: 0.4605 413/500 [=======================>......] - ETA: 21s - loss: 2.3281 - regression_loss: 1.8677 - classification_loss: 0.4604 414/500 [=======================>......] - ETA: 21s - loss: 2.3288 - regression_loss: 1.8683 - classification_loss: 0.4605 415/500 [=======================>......] - ETA: 21s - loss: 2.3283 - regression_loss: 1.8680 - classification_loss: 0.4603 416/500 [=======================>......] - ETA: 21s - loss: 2.3277 - regression_loss: 1.8676 - classification_loss: 0.4601 417/500 [========================>.....] - ETA: 20s - loss: 2.3297 - regression_loss: 1.8694 - classification_loss: 0.4603 418/500 [========================>.....] - ETA: 20s - loss: 2.3280 - regression_loss: 1.8674 - classification_loss: 0.4607 419/500 [========================>.....] - ETA: 20s - loss: 2.3280 - regression_loss: 1.8675 - classification_loss: 0.4605 420/500 [========================>.....] - ETA: 20s - loss: 2.3279 - regression_loss: 1.8675 - classification_loss: 0.4604 421/500 [========================>.....] - ETA: 19s - loss: 2.3284 - regression_loss: 1.8683 - classification_loss: 0.4601 422/500 [========================>.....] - ETA: 19s - loss: 2.3292 - regression_loss: 1.8690 - classification_loss: 0.4603 423/500 [========================>.....] - ETA: 19s - loss: 2.3294 - regression_loss: 1.8693 - classification_loss: 0.4601 424/500 [========================>.....] - ETA: 19s - loss: 2.3307 - regression_loss: 1.8703 - classification_loss: 0.4604 425/500 [========================>.....] 
- ETA: 18s - loss: 2.3290 - regression_loss: 1.8683 - classification_loss: 0.4606 426/500 [========================>.....] - ETA: 18s - loss: 2.3291 - regression_loss: 1.8684 - classification_loss: 0.4607 427/500 [========================>.....] - ETA: 18s - loss: 2.3302 - regression_loss: 1.8693 - classification_loss: 0.4608 428/500 [========================>.....] - ETA: 18s - loss: 2.3299 - regression_loss: 1.8692 - classification_loss: 0.4607 429/500 [========================>.....] - ETA: 17s - loss: 2.3377 - regression_loss: 1.8648 - classification_loss: 0.4729 430/500 [========================>.....] - ETA: 17s - loss: 2.3396 - regression_loss: 1.8667 - classification_loss: 0.4729 431/500 [========================>.....] - ETA: 17s - loss: 2.3396 - regression_loss: 1.8668 - classification_loss: 0.4728 432/500 [========================>.....] - ETA: 17s - loss: 2.3390 - regression_loss: 1.8666 - classification_loss: 0.4725 433/500 [========================>.....] - ETA: 16s - loss: 2.3384 - regression_loss: 1.8660 - classification_loss: 0.4724 434/500 [=========================>....] - ETA: 16s - loss: 2.3386 - regression_loss: 1.8662 - classification_loss: 0.4724 435/500 [=========================>....] - ETA: 16s - loss: 2.3378 - regression_loss: 1.8656 - classification_loss: 0.4722 436/500 [=========================>....] - ETA: 16s - loss: 2.3381 - regression_loss: 1.8657 - classification_loss: 0.4724 437/500 [=========================>....] - ETA: 15s - loss: 2.3367 - regression_loss: 1.8647 - classification_loss: 0.4721 438/500 [=========================>....] - ETA: 15s - loss: 2.3364 - regression_loss: 1.8646 - classification_loss: 0.4718 439/500 [=========================>....] - ETA: 15s - loss: 2.3372 - regression_loss: 1.8652 - classification_loss: 0.4720 440/500 [=========================>....] - ETA: 15s - loss: 2.3381 - regression_loss: 1.8659 - classification_loss: 0.4723 441/500 [=========================>....] 
[per-batch progress output for epoch 20, batches 442–499, truncated]
500/500 [==============================] - 125s 250ms/step - loss: 2.3421 - regression_loss: 1.8708 - classification_loss: 0.4713
1172 instances of class plum with average precision: 0.4170
mAP: 0.4170
Epoch 00020: saving model to ./training/snapshots/resnet50_pascal_20.h5
Epoch 21/150
[per-batch progress output for epoch 21, batches 1–276, truncated]
- ETA: 56s - loss: 2.3060 - regression_loss: 1.8663 - classification_loss: 0.4396 277/500 [===============>..............] - ETA: 55s - loss: 2.3065 - regression_loss: 1.8665 - classification_loss: 0.4399 278/500 [===============>..............] - ETA: 55s - loss: 2.3060 - regression_loss: 1.8661 - classification_loss: 0.4400 279/500 [===============>..............] - ETA: 55s - loss: 2.3056 - regression_loss: 1.8659 - classification_loss: 0.4397 280/500 [===============>..............] - ETA: 55s - loss: 2.3057 - regression_loss: 1.8662 - classification_loss: 0.4395 281/500 [===============>..............] - ETA: 54s - loss: 2.3054 - regression_loss: 1.8658 - classification_loss: 0.4396 282/500 [===============>..............] - ETA: 54s - loss: 2.3047 - regression_loss: 1.8653 - classification_loss: 0.4394 283/500 [===============>..............] - ETA: 54s - loss: 2.3054 - regression_loss: 1.8660 - classification_loss: 0.4394 284/500 [================>.............] - ETA: 54s - loss: 2.3025 - regression_loss: 1.8636 - classification_loss: 0.4389 285/500 [================>.............] - ETA: 53s - loss: 2.3027 - regression_loss: 1.8639 - classification_loss: 0.4389 286/500 [================>.............] - ETA: 53s - loss: 2.3031 - regression_loss: 1.8639 - classification_loss: 0.4392 287/500 [================>.............] - ETA: 53s - loss: 2.3022 - regression_loss: 1.8634 - classification_loss: 0.4388 288/500 [================>.............] - ETA: 53s - loss: 2.3026 - regression_loss: 1.8635 - classification_loss: 0.4390 289/500 [================>.............] - ETA: 52s - loss: 2.3018 - regression_loss: 1.8629 - classification_loss: 0.4389 290/500 [================>.............] - ETA: 52s - loss: 2.3018 - regression_loss: 1.8632 - classification_loss: 0.4387 291/500 [================>.............] - ETA: 52s - loss: 2.3034 - regression_loss: 1.8645 - classification_loss: 0.4390 292/500 [================>.............] 
- ETA: 52s - loss: 2.3041 - regression_loss: 1.8650 - classification_loss: 0.4391 293/500 [================>.............] - ETA: 51s - loss: 2.3054 - regression_loss: 1.8661 - classification_loss: 0.4393 294/500 [================>.............] - ETA: 51s - loss: 2.3058 - regression_loss: 1.8666 - classification_loss: 0.4391 295/500 [================>.............] - ETA: 51s - loss: 2.3021 - regression_loss: 1.8637 - classification_loss: 0.4383 296/500 [================>.............] - ETA: 51s - loss: 2.3026 - regression_loss: 1.8641 - classification_loss: 0.4385 297/500 [================>.............] - ETA: 50s - loss: 2.3065 - regression_loss: 1.8679 - classification_loss: 0.4386 298/500 [================>.............] - ETA: 50s - loss: 2.3049 - regression_loss: 1.8667 - classification_loss: 0.4383 299/500 [================>.............] - ETA: 50s - loss: 2.3037 - regression_loss: 1.8658 - classification_loss: 0.4378 300/500 [=================>............] - ETA: 50s - loss: 2.3051 - regression_loss: 1.8671 - classification_loss: 0.4379 301/500 [=================>............] - ETA: 49s - loss: 2.3062 - regression_loss: 1.8683 - classification_loss: 0.4379 302/500 [=================>............] - ETA: 49s - loss: 2.3056 - regression_loss: 1.8680 - classification_loss: 0.4376 303/500 [=================>............] - ETA: 49s - loss: 2.3050 - regression_loss: 1.8657 - classification_loss: 0.4393 304/500 [=================>............] - ETA: 49s - loss: 2.3054 - regression_loss: 1.8660 - classification_loss: 0.4393 305/500 [=================>............] - ETA: 48s - loss: 2.3078 - regression_loss: 1.8682 - classification_loss: 0.4395 306/500 [=================>............] - ETA: 48s - loss: 2.3093 - regression_loss: 1.8697 - classification_loss: 0.4396 307/500 [=================>............] - ETA: 48s - loss: 2.3084 - regression_loss: 1.8689 - classification_loss: 0.4395 308/500 [=================>............] 
- ETA: 48s - loss: 2.3084 - regression_loss: 1.8691 - classification_loss: 0.4392 309/500 [=================>............] - ETA: 47s - loss: 2.3079 - regression_loss: 1.8686 - classification_loss: 0.4392 310/500 [=================>............] - ETA: 47s - loss: 2.3103 - regression_loss: 1.8701 - classification_loss: 0.4402 311/500 [=================>............] - ETA: 47s - loss: 2.3097 - regression_loss: 1.8698 - classification_loss: 0.4399 312/500 [=================>............] - ETA: 47s - loss: 2.3081 - regression_loss: 1.8684 - classification_loss: 0.4396 313/500 [=================>............] - ETA: 46s - loss: 2.3074 - regression_loss: 1.8679 - classification_loss: 0.4395 314/500 [=================>............] - ETA: 46s - loss: 2.3078 - regression_loss: 1.8679 - classification_loss: 0.4399 315/500 [=================>............] - ETA: 46s - loss: 2.3079 - regression_loss: 1.8680 - classification_loss: 0.4398 316/500 [=================>............] - ETA: 46s - loss: 2.3094 - regression_loss: 1.8689 - classification_loss: 0.4404 317/500 [==================>...........] - ETA: 45s - loss: 2.3070 - regression_loss: 1.8673 - classification_loss: 0.4396 318/500 [==================>...........] - ETA: 45s - loss: 2.3078 - regression_loss: 1.8680 - classification_loss: 0.4397 319/500 [==================>...........] - ETA: 45s - loss: 2.3089 - regression_loss: 1.8690 - classification_loss: 0.4399 320/500 [==================>...........] - ETA: 45s - loss: 2.3103 - regression_loss: 1.8699 - classification_loss: 0.4404 321/500 [==================>...........] - ETA: 44s - loss: 2.3111 - regression_loss: 1.8708 - classification_loss: 0.4403 322/500 [==================>...........] - ETA: 44s - loss: 2.3101 - regression_loss: 1.8700 - classification_loss: 0.4401 323/500 [==================>...........] - ETA: 44s - loss: 2.3084 - regression_loss: 1.8686 - classification_loss: 0.4398 324/500 [==================>...........] 
- ETA: 44s - loss: 2.3078 - regression_loss: 1.8677 - classification_loss: 0.4401 325/500 [==================>...........] - ETA: 43s - loss: 2.3071 - regression_loss: 1.8672 - classification_loss: 0.4399 326/500 [==================>...........] - ETA: 43s - loss: 2.3080 - regression_loss: 1.8681 - classification_loss: 0.4399 327/500 [==================>...........] - ETA: 43s - loss: 2.3081 - regression_loss: 1.8683 - classification_loss: 0.4398 328/500 [==================>...........] - ETA: 43s - loss: 2.3069 - regression_loss: 1.8675 - classification_loss: 0.4394 329/500 [==================>...........] - ETA: 42s - loss: 2.3036 - regression_loss: 1.8650 - classification_loss: 0.4386 330/500 [==================>...........] - ETA: 42s - loss: 2.3029 - regression_loss: 1.8641 - classification_loss: 0.4388 331/500 [==================>...........] - ETA: 42s - loss: 2.3015 - regression_loss: 1.8631 - classification_loss: 0.4383 332/500 [==================>...........] - ETA: 42s - loss: 2.3012 - regression_loss: 1.8628 - classification_loss: 0.4383 333/500 [==================>...........] - ETA: 41s - loss: 2.3004 - regression_loss: 1.8623 - classification_loss: 0.4381 334/500 [===================>..........] - ETA: 41s - loss: 2.3002 - regression_loss: 1.8618 - classification_loss: 0.4384 335/500 [===================>..........] - ETA: 41s - loss: 2.3007 - regression_loss: 1.8623 - classification_loss: 0.4384 336/500 [===================>..........] - ETA: 41s - loss: 2.3012 - regression_loss: 1.8628 - classification_loss: 0.4384 337/500 [===================>..........] - ETA: 40s - loss: 2.3011 - regression_loss: 1.8628 - classification_loss: 0.4383 338/500 [===================>..........] - ETA: 40s - loss: 2.3014 - regression_loss: 1.8629 - classification_loss: 0.4386 339/500 [===================>..........] - ETA: 40s - loss: 2.3004 - regression_loss: 1.8620 - classification_loss: 0.4384 340/500 [===================>..........] 
- ETA: 40s - loss: 2.3024 - regression_loss: 1.8635 - classification_loss: 0.4389 341/500 [===================>..........] - ETA: 39s - loss: 2.3024 - regression_loss: 1.8636 - classification_loss: 0.4388 342/500 [===================>..........] - ETA: 39s - loss: 2.3027 - regression_loss: 1.8636 - classification_loss: 0.4391 343/500 [===================>..........] - ETA: 39s - loss: 2.3020 - regression_loss: 1.8631 - classification_loss: 0.4390 344/500 [===================>..........] - ETA: 39s - loss: 2.3032 - regression_loss: 1.8641 - classification_loss: 0.4390 345/500 [===================>..........] - ETA: 38s - loss: 2.3038 - regression_loss: 1.8645 - classification_loss: 0.4393 346/500 [===================>..........] - ETA: 38s - loss: 2.3018 - regression_loss: 1.8628 - classification_loss: 0.4390 347/500 [===================>..........] - ETA: 38s - loss: 2.3013 - regression_loss: 1.8626 - classification_loss: 0.4387 348/500 [===================>..........] - ETA: 38s - loss: 2.3018 - regression_loss: 1.8626 - classification_loss: 0.4392 349/500 [===================>..........] - ETA: 37s - loss: 2.3015 - regression_loss: 1.8624 - classification_loss: 0.4392 350/500 [====================>.........] - ETA: 37s - loss: 2.3020 - regression_loss: 1.8627 - classification_loss: 0.4393 351/500 [====================>.........] - ETA: 37s - loss: 2.3027 - regression_loss: 1.8635 - classification_loss: 0.4392 352/500 [====================>.........] - ETA: 37s - loss: 2.3025 - regression_loss: 1.8636 - classification_loss: 0.4389 353/500 [====================>.........] - ETA: 36s - loss: 2.3022 - regression_loss: 1.8635 - classification_loss: 0.4387 354/500 [====================>.........] - ETA: 36s - loss: 2.3029 - regression_loss: 1.8641 - classification_loss: 0.4387 355/500 [====================>.........] - ETA: 36s - loss: 2.3048 - regression_loss: 1.8657 - classification_loss: 0.4390 356/500 [====================>.........] 
- ETA: 36s - loss: 2.3025 - regression_loss: 1.8639 - classification_loss: 0.4386 357/500 [====================>.........] - ETA: 35s - loss: 2.3034 - regression_loss: 1.8645 - classification_loss: 0.4389 358/500 [====================>.........] - ETA: 35s - loss: 2.3008 - regression_loss: 1.8625 - classification_loss: 0.4383 359/500 [====================>.........] - ETA: 35s - loss: 2.3009 - regression_loss: 1.8627 - classification_loss: 0.4382 360/500 [====================>.........] - ETA: 35s - loss: 2.3024 - regression_loss: 1.8634 - classification_loss: 0.4390 361/500 [====================>.........] - ETA: 34s - loss: 2.3022 - regression_loss: 1.8633 - classification_loss: 0.4389 362/500 [====================>.........] - ETA: 34s - loss: 2.3018 - regression_loss: 1.8633 - classification_loss: 0.4386 363/500 [====================>.........] - ETA: 34s - loss: 2.3029 - regression_loss: 1.8641 - classification_loss: 0.4389 364/500 [====================>.........] - ETA: 34s - loss: 2.3068 - regression_loss: 1.8675 - classification_loss: 0.4393 365/500 [====================>.........] - ETA: 33s - loss: 2.3067 - regression_loss: 1.8675 - classification_loss: 0.4392 366/500 [====================>.........] - ETA: 33s - loss: 2.3063 - regression_loss: 1.8674 - classification_loss: 0.4389 367/500 [=====================>........] - ETA: 33s - loss: 2.3053 - regression_loss: 1.8668 - classification_loss: 0.4386 368/500 [=====================>........] - ETA: 33s - loss: 2.3061 - regression_loss: 1.8667 - classification_loss: 0.4394 369/500 [=====================>........] - ETA: 32s - loss: 2.3066 - regression_loss: 1.8672 - classification_loss: 0.4395 370/500 [=====================>........] - ETA: 32s - loss: 2.3092 - regression_loss: 1.8676 - classification_loss: 0.4416 371/500 [=====================>........] - ETA: 32s - loss: 2.3082 - regression_loss: 1.8668 - classification_loss: 0.4414 372/500 [=====================>........] 
- ETA: 32s - loss: 2.3086 - regression_loss: 1.8668 - classification_loss: 0.4418 373/500 [=====================>........] - ETA: 31s - loss: 2.3076 - regression_loss: 1.8662 - classification_loss: 0.4414 374/500 [=====================>........] - ETA: 31s - loss: 2.3078 - regression_loss: 1.8661 - classification_loss: 0.4417 375/500 [=====================>........] - ETA: 31s - loss: 2.3075 - regression_loss: 1.8659 - classification_loss: 0.4416 376/500 [=====================>........] - ETA: 31s - loss: 2.3058 - regression_loss: 1.8647 - classification_loss: 0.4412 377/500 [=====================>........] - ETA: 30s - loss: 2.3063 - regression_loss: 1.8651 - classification_loss: 0.4412 378/500 [=====================>........] - ETA: 30s - loss: 2.3059 - regression_loss: 1.8647 - classification_loss: 0.4412 379/500 [=====================>........] - ETA: 30s - loss: 2.3055 - regression_loss: 1.8644 - classification_loss: 0.4412 380/500 [=====================>........] - ETA: 30s - loss: 2.3050 - regression_loss: 1.8640 - classification_loss: 0.4410 381/500 [=====================>........] - ETA: 29s - loss: 2.3029 - regression_loss: 1.8625 - classification_loss: 0.4404 382/500 [=====================>........] - ETA: 29s - loss: 2.3052 - regression_loss: 1.8643 - classification_loss: 0.4409 383/500 [=====================>........] - ETA: 29s - loss: 2.3063 - regression_loss: 1.8653 - classification_loss: 0.4411 384/500 [======================>.......] - ETA: 29s - loss: 2.3040 - regression_loss: 1.8633 - classification_loss: 0.4407 385/500 [======================>.......] - ETA: 28s - loss: 2.3043 - regression_loss: 1.8637 - classification_loss: 0.4406 386/500 [======================>.......] - ETA: 28s - loss: 2.3037 - regression_loss: 1.8628 - classification_loss: 0.4408 387/500 [======================>.......] - ETA: 28s - loss: 2.3037 - regression_loss: 1.8632 - classification_loss: 0.4405 388/500 [======================>.......] 
- ETA: 28s - loss: 2.3048 - regression_loss: 1.8643 - classification_loss: 0.4406 389/500 [======================>.......] - ETA: 27s - loss: 2.3029 - regression_loss: 1.8628 - classification_loss: 0.4402 390/500 [======================>.......] - ETA: 27s - loss: 2.3058 - regression_loss: 1.8655 - classification_loss: 0.4403 391/500 [======================>.......] - ETA: 27s - loss: 2.3058 - regression_loss: 1.8656 - classification_loss: 0.4402 392/500 [======================>.......] - ETA: 27s - loss: 2.3074 - regression_loss: 1.8668 - classification_loss: 0.4406 393/500 [======================>.......] - ETA: 26s - loss: 2.3067 - regression_loss: 1.8663 - classification_loss: 0.4404 394/500 [======================>.......] - ETA: 26s - loss: 2.3062 - regression_loss: 1.8655 - classification_loss: 0.4406 395/500 [======================>.......] - ETA: 26s - loss: 2.3062 - regression_loss: 1.8657 - classification_loss: 0.4405 396/500 [======================>.......] - ETA: 26s - loss: 2.3056 - regression_loss: 1.8652 - classification_loss: 0.4404 397/500 [======================>.......] - ETA: 25s - loss: 2.3041 - regression_loss: 1.8641 - classification_loss: 0.4400 398/500 [======================>.......] - ETA: 25s - loss: 2.3046 - regression_loss: 1.8645 - classification_loss: 0.4400 399/500 [======================>.......] - ETA: 25s - loss: 2.3051 - regression_loss: 1.8651 - classification_loss: 0.4401 400/500 [=======================>......] - ETA: 25s - loss: 2.3045 - regression_loss: 1.8647 - classification_loss: 0.4398 401/500 [=======================>......] - ETA: 24s - loss: 2.3032 - regression_loss: 1.8637 - classification_loss: 0.4396 402/500 [=======================>......] - ETA: 24s - loss: 2.3030 - regression_loss: 1.8633 - classification_loss: 0.4397 403/500 [=======================>......] - ETA: 24s - loss: 2.3014 - regression_loss: 1.8622 - classification_loss: 0.4392 404/500 [=======================>......] 
- ETA: 24s - loss: 2.3027 - regression_loss: 1.8630 - classification_loss: 0.4397 405/500 [=======================>......] - ETA: 23s - loss: 2.3015 - regression_loss: 1.8621 - classification_loss: 0.4394 406/500 [=======================>......] - ETA: 23s - loss: 2.3015 - regression_loss: 1.8622 - classification_loss: 0.4393 407/500 [=======================>......] - ETA: 23s - loss: 2.3005 - regression_loss: 1.8617 - classification_loss: 0.4387 408/500 [=======================>......] - ETA: 23s - loss: 2.3008 - regression_loss: 1.8622 - classification_loss: 0.4385 409/500 [=======================>......] - ETA: 22s - loss: 2.3003 - regression_loss: 1.8620 - classification_loss: 0.4383 410/500 [=======================>......] - ETA: 22s - loss: 2.3005 - regression_loss: 1.8622 - classification_loss: 0.4383 411/500 [=======================>......] - ETA: 22s - loss: 2.3009 - regression_loss: 1.8626 - classification_loss: 0.4383 412/500 [=======================>......] - ETA: 22s - loss: 2.3000 - regression_loss: 1.8612 - classification_loss: 0.4388 413/500 [=======================>......] - ETA: 21s - loss: 2.2991 - regression_loss: 1.8600 - classification_loss: 0.4391 414/500 [=======================>......] - ETA: 21s - loss: 2.2988 - regression_loss: 1.8598 - classification_loss: 0.4390 415/500 [=======================>......] - ETA: 21s - loss: 2.2984 - regression_loss: 1.8596 - classification_loss: 0.4388 416/500 [=======================>......] - ETA: 21s - loss: 2.2962 - regression_loss: 1.8579 - classification_loss: 0.4383 417/500 [========================>.....] - ETA: 20s - loss: 2.2952 - regression_loss: 1.8571 - classification_loss: 0.4381 418/500 [========================>.....] - ETA: 20s - loss: 2.2959 - regression_loss: 1.8575 - classification_loss: 0.4383 419/500 [========================>.....] - ETA: 20s - loss: 2.2943 - regression_loss: 1.8561 - classification_loss: 0.4382 420/500 [========================>.....] 
- ETA: 20s - loss: 2.2946 - regression_loss: 1.8568 - classification_loss: 0.4378 421/500 [========================>.....] - ETA: 19s - loss: 2.2947 - regression_loss: 1.8567 - classification_loss: 0.4379 422/500 [========================>.....] - ETA: 19s - loss: 2.2956 - regression_loss: 1.8573 - classification_loss: 0.4383 423/500 [========================>.....] - ETA: 19s - loss: 2.2950 - regression_loss: 1.8567 - classification_loss: 0.4383 424/500 [========================>.....] - ETA: 19s - loss: 2.2924 - regression_loss: 1.8548 - classification_loss: 0.4376 425/500 [========================>.....] - ETA: 18s - loss: 2.2930 - regression_loss: 1.8553 - classification_loss: 0.4377 426/500 [========================>.....] - ETA: 18s - loss: 2.2932 - regression_loss: 1.8553 - classification_loss: 0.4379 427/500 [========================>.....] - ETA: 18s - loss: 2.2949 - regression_loss: 1.8561 - classification_loss: 0.4388 428/500 [========================>.....] - ETA: 18s - loss: 2.2950 - regression_loss: 1.8562 - classification_loss: 0.4388 429/500 [========================>.....] - ETA: 17s - loss: 2.2964 - regression_loss: 1.8571 - classification_loss: 0.4392 430/500 [========================>.....] - ETA: 17s - loss: 2.2963 - regression_loss: 1.8570 - classification_loss: 0.4392 431/500 [========================>.....] - ETA: 17s - loss: 2.2983 - regression_loss: 1.8588 - classification_loss: 0.4394 432/500 [========================>.....] - ETA: 17s - loss: 2.3003 - regression_loss: 1.8605 - classification_loss: 0.4397 433/500 [========================>.....] - ETA: 16s - loss: 2.3002 - regression_loss: 1.8604 - classification_loss: 0.4398 434/500 [=========================>....] - ETA: 16s - loss: 2.2995 - regression_loss: 1.8598 - classification_loss: 0.4397 435/500 [=========================>....] - ETA: 16s - loss: 2.2992 - regression_loss: 1.8596 - classification_loss: 0.4396 436/500 [=========================>....] 
- ETA: 16s - loss: 2.2997 - regression_loss: 1.8602 - classification_loss: 0.4395 437/500 [=========================>....] - ETA: 15s - loss: 2.2990 - regression_loss: 1.8596 - classification_loss: 0.4394 438/500 [=========================>....] - ETA: 15s - loss: 2.2987 - regression_loss: 1.8594 - classification_loss: 0.4393 439/500 [=========================>....] - ETA: 15s - loss: 2.2986 - regression_loss: 1.8595 - classification_loss: 0.4391 440/500 [=========================>....] - ETA: 15s - loss: 2.2988 - regression_loss: 1.8599 - classification_loss: 0.4389 441/500 [=========================>....] - ETA: 14s - loss: 2.2990 - regression_loss: 1.8604 - classification_loss: 0.4386 442/500 [=========================>....] - ETA: 14s - loss: 2.2991 - regression_loss: 1.8606 - classification_loss: 0.4385 443/500 [=========================>....] - ETA: 14s - loss: 2.2995 - regression_loss: 1.8610 - classification_loss: 0.4386 444/500 [=========================>....] - ETA: 14s - loss: 2.2993 - regression_loss: 1.8607 - classification_loss: 0.4385 445/500 [=========================>....] - ETA: 13s - loss: 2.2993 - regression_loss: 1.8610 - classification_loss: 0.4383 446/500 [=========================>....] - ETA: 13s - loss: 2.2977 - regression_loss: 1.8596 - classification_loss: 0.4380 447/500 [=========================>....] - ETA: 13s - loss: 2.2982 - regression_loss: 1.8600 - classification_loss: 0.4383 448/500 [=========================>....] - ETA: 13s - loss: 2.2990 - regression_loss: 1.8608 - classification_loss: 0.4382 449/500 [=========================>....] - ETA: 12s - loss: 2.2974 - regression_loss: 1.8596 - classification_loss: 0.4378 450/500 [==========================>...] - ETA: 12s - loss: 2.2978 - regression_loss: 1.8599 - classification_loss: 0.4380 451/500 [==========================>...] - ETA: 12s - loss: 2.2974 - regression_loss: 1.8598 - classification_loss: 0.4376 452/500 [==========================>...] 
- ETA: 12s - loss: 2.2984 - regression_loss: 1.8607 - classification_loss: 0.4376 453/500 [==========================>...] - ETA: 11s - loss: 2.2974 - regression_loss: 1.8600 - classification_loss: 0.4374 454/500 [==========================>...] - ETA: 11s - loss: 2.2960 - regression_loss: 1.8590 - classification_loss: 0.4370 455/500 [==========================>...] - ETA: 11s - loss: 2.2967 - regression_loss: 1.8595 - classification_loss: 0.4371 456/500 [==========================>...] - ETA: 11s - loss: 2.2967 - regression_loss: 1.8597 - classification_loss: 0.4371 457/500 [==========================>...] - ETA: 10s - loss: 2.2945 - regression_loss: 1.8578 - classification_loss: 0.4367 458/500 [==========================>...] - ETA: 10s - loss: 2.2944 - regression_loss: 1.8578 - classification_loss: 0.4366 459/500 [==========================>...] - ETA: 10s - loss: 2.2949 - regression_loss: 1.8583 - classification_loss: 0.4366 460/500 [==========================>...] - ETA: 10s - loss: 2.2950 - regression_loss: 1.8584 - classification_loss: 0.4365 461/500 [==========================>...] - ETA: 9s - loss: 2.2929 - regression_loss: 1.8568 - classification_loss: 0.4361  462/500 [==========================>...] - ETA: 9s - loss: 2.2929 - regression_loss: 1.8568 - classification_loss: 0.4361 463/500 [==========================>...] - ETA: 9s - loss: 2.2933 - regression_loss: 1.8572 - classification_loss: 0.4362 464/500 [==========================>...] - ETA: 9s - loss: 2.2933 - regression_loss: 1.8572 - classification_loss: 0.4361 465/500 [==========================>...] - ETA: 8s - loss: 2.2920 - regression_loss: 1.8561 - classification_loss: 0.4359 466/500 [==========================>...] - ETA: 8s - loss: 2.2918 - regression_loss: 1.8558 - classification_loss: 0.4360 467/500 [===========================>..] - ETA: 8s - loss: 2.2909 - regression_loss: 1.8548 - classification_loss: 0.4361 468/500 [===========================>..] 
- ETA: 8s - loss: 2.2913 - regression_loss: 1.8548 - classification_loss: 0.4366 469/500 [===========================>..] - ETA: 7s - loss: 2.2889 - regression_loss: 1.8527 - classification_loss: 0.4362 470/500 [===========================>..] - ETA: 7s - loss: 2.2890 - regression_loss: 1.8529 - classification_loss: 0.4362 471/500 [===========================>..] - ETA: 7s - loss: 2.2886 - regression_loss: 1.8526 - classification_loss: 0.4360 472/500 [===========================>..] - ETA: 7s - loss: 2.2894 - regression_loss: 1.8535 - classification_loss: 0.4360 473/500 [===========================>..] - ETA: 6s - loss: 2.2906 - regression_loss: 1.8542 - classification_loss: 0.4364 474/500 [===========================>..] - ETA: 6s - loss: 2.2901 - regression_loss: 1.8539 - classification_loss: 0.4362 475/500 [===========================>..] - ETA: 6s - loss: 2.2893 - regression_loss: 1.8532 - classification_loss: 0.4361 476/500 [===========================>..] - ETA: 6s - loss: 2.2894 - regression_loss: 1.8534 - classification_loss: 0.4360 477/500 [===========================>..] - ETA: 5s - loss: 2.2892 - regression_loss: 1.8534 - classification_loss: 0.4359 478/500 [===========================>..] - ETA: 5s - loss: 2.2896 - regression_loss: 1.8540 - classification_loss: 0.4355 479/500 [===========================>..] - ETA: 5s - loss: 2.2889 - regression_loss: 1.8534 - classification_loss: 0.4355 480/500 [===========================>..] - ETA: 5s - loss: 2.2893 - regression_loss: 1.8537 - classification_loss: 0.4356 481/500 [===========================>..] - ETA: 4s - loss: 2.2898 - regression_loss: 1.8541 - classification_loss: 0.4357 482/500 [===========================>..] - ETA: 4s - loss: 2.2902 - regression_loss: 1.8542 - classification_loss: 0.4359 483/500 [===========================>..] - ETA: 4s - loss: 2.2905 - regression_loss: 1.8543 - classification_loss: 0.4362 484/500 [============================>.] 
- ETA: 4s - loss: 2.2900 - regression_loss: 1.8540 - classification_loss: 0.4360 485/500 [============================>.] - ETA: 3s - loss: 2.2902 - regression_loss: 1.8542 - classification_loss: 0.4360 486/500 [============================>.] - ETA: 3s - loss: 2.2895 - regression_loss: 1.8537 - classification_loss: 0.4358 487/500 [============================>.] - ETA: 3s - loss: 2.2893 - regression_loss: 1.8536 - classification_loss: 0.4357 488/500 [============================>.] - ETA: 3s - loss: 2.2880 - regression_loss: 1.8525 - classification_loss: 0.4355 489/500 [============================>.] - ETA: 2s - loss: 2.2896 - regression_loss: 1.8530 - classification_loss: 0.4366 490/500 [============================>.] - ETA: 2s - loss: 2.2903 - regression_loss: 1.8536 - classification_loss: 0.4367 491/500 [============================>.] - ETA: 2s - loss: 2.2917 - regression_loss: 1.8544 - classification_loss: 0.4373 492/500 [============================>.] - ETA: 2s - loss: 2.2930 - regression_loss: 1.8547 - classification_loss: 0.4383 493/500 [============================>.] - ETA: 1s - loss: 2.2932 - regression_loss: 1.8549 - classification_loss: 0.4383 494/500 [============================>.] - ETA: 1s - loss: 2.2933 - regression_loss: 1.8550 - classification_loss: 0.4383 495/500 [============================>.] - ETA: 1s - loss: 2.2937 - regression_loss: 1.8554 - classification_loss: 0.4383 496/500 [============================>.] - ETA: 1s - loss: 2.2939 - regression_loss: 1.8556 - classification_loss: 0.4383 497/500 [============================>.] - ETA: 0s - loss: 2.2935 - regression_loss: 1.8554 - classification_loss: 0.4380 498/500 [============================>.] - ETA: 0s - loss: 2.2936 - regression_loss: 1.8556 - classification_loss: 0.4380 499/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 2.2941 - regression_loss: 1.8562 - classification_loss: 0.4379
1172 instances of class plum with average precision: 0.4174
mAP: 0.4174
Epoch 00021: saving model to ./training/snapshots/resnet50_pascal_21.h5
Epoch 22/150
[... per-batch progress output elided: batches 1/500 through 46/500 of epoch 22; loss settled around 2.19-2.21 after the noisy first few batches ...]
- ETA: 1:54 - loss: 2.2027 - regression_loss: 1.8003 - classification_loss: 0.4024 47/500 [=>............................] - ETA: 1:54 - loss: 2.2159 - regression_loss: 1.8133 - classification_loss: 0.4027 48/500 [=>............................] - ETA: 1:53 - loss: 2.2103 - regression_loss: 1.8077 - classification_loss: 0.4026 49/500 [=>............................] - ETA: 1:53 - loss: 2.2131 - regression_loss: 1.8100 - classification_loss: 0.4031 50/500 [==>...........................] - ETA: 1:53 - loss: 2.2169 - regression_loss: 1.8128 - classification_loss: 0.4041 51/500 [==>...........................] - ETA: 1:52 - loss: 2.2192 - regression_loss: 1.8155 - classification_loss: 0.4037 52/500 [==>...........................] - ETA: 1:52 - loss: 2.2236 - regression_loss: 1.8186 - classification_loss: 0.4050 53/500 [==>...........................] - ETA: 1:51 - loss: 2.2331 - regression_loss: 1.8252 - classification_loss: 0.4078 54/500 [==>...........................] - ETA: 1:51 - loss: 2.2204 - regression_loss: 1.8143 - classification_loss: 0.4061 55/500 [==>...........................] - ETA: 1:51 - loss: 2.2247 - regression_loss: 1.8178 - classification_loss: 0.4069 56/500 [==>...........................] - ETA: 1:51 - loss: 2.2394 - regression_loss: 1.8312 - classification_loss: 0.4083 57/500 [==>...........................] - ETA: 1:50 - loss: 2.2540 - regression_loss: 1.8413 - classification_loss: 0.4128 58/500 [==>...........................] - ETA: 1:50 - loss: 2.2624 - regression_loss: 1.8489 - classification_loss: 0.4134 59/500 [==>...........................] - ETA: 1:50 - loss: 2.2704 - regression_loss: 1.8551 - classification_loss: 0.4153 60/500 [==>...........................] - ETA: 1:50 - loss: 2.2689 - regression_loss: 1.8554 - classification_loss: 0.4136 61/500 [==>...........................] - ETA: 1:49 - loss: 2.2634 - regression_loss: 1.8478 - classification_loss: 0.4156 62/500 [==>...........................] 
- ETA: 1:49 - loss: 2.2642 - regression_loss: 1.8482 - classification_loss: 0.4160 63/500 [==>...........................] - ETA: 1:48 - loss: 2.2738 - regression_loss: 1.8549 - classification_loss: 0.4188 64/500 [==>...........................] - ETA: 1:48 - loss: 2.2736 - regression_loss: 1.8552 - classification_loss: 0.4184 65/500 [==>...........................] - ETA: 1:47 - loss: 2.2700 - regression_loss: 1.8519 - classification_loss: 0.4181 66/500 [==>...........................] - ETA: 1:47 - loss: 2.2836 - regression_loss: 1.8631 - classification_loss: 0.4205 67/500 [===>..........................] - ETA: 1:47 - loss: 2.2790 - regression_loss: 1.8586 - classification_loss: 0.4204 68/500 [===>..........................] - ETA: 1:47 - loss: 2.2818 - regression_loss: 1.8616 - classification_loss: 0.4202 69/500 [===>..........................] - ETA: 1:46 - loss: 2.2854 - regression_loss: 1.8644 - classification_loss: 0.4209 70/500 [===>..........................] - ETA: 1:46 - loss: 2.2858 - regression_loss: 1.8642 - classification_loss: 0.4216 71/500 [===>..........................] - ETA: 1:46 - loss: 2.2790 - regression_loss: 1.8590 - classification_loss: 0.4200 72/500 [===>..........................] - ETA: 1:46 - loss: 2.2836 - regression_loss: 1.8627 - classification_loss: 0.4210 73/500 [===>..........................] - ETA: 1:45 - loss: 2.2848 - regression_loss: 1.8648 - classification_loss: 0.4200 74/500 [===>..........................] - ETA: 1:45 - loss: 2.2682 - regression_loss: 1.8500 - classification_loss: 0.4182 75/500 [===>..........................] - ETA: 1:45 - loss: 2.2650 - regression_loss: 1.8471 - classification_loss: 0.4179 76/500 [===>..........................] - ETA: 1:44 - loss: 2.2755 - regression_loss: 1.8483 - classification_loss: 0.4272 77/500 [===>..........................] - ETA: 1:44 - loss: 2.2721 - regression_loss: 1.8451 - classification_loss: 0.4269 78/500 [===>..........................] 
- ETA: 1:44 - loss: 2.2661 - regression_loss: 1.8406 - classification_loss: 0.4255 79/500 [===>..........................] - ETA: 1:44 - loss: 2.2667 - regression_loss: 1.8418 - classification_loss: 0.4249 80/500 [===>..........................] - ETA: 1:44 - loss: 2.2735 - regression_loss: 1.8351 - classification_loss: 0.4384 81/500 [===>..........................] - ETA: 1:43 - loss: 2.2743 - regression_loss: 1.8363 - classification_loss: 0.4380 82/500 [===>..........................] - ETA: 1:43 - loss: 2.2675 - regression_loss: 1.8297 - classification_loss: 0.4378 83/500 [===>..........................] - ETA: 1:43 - loss: 2.2682 - regression_loss: 1.8309 - classification_loss: 0.4373 84/500 [====>.........................] - ETA: 1:43 - loss: 2.2685 - regression_loss: 1.8310 - classification_loss: 0.4374 85/500 [====>.........................] - ETA: 1:43 - loss: 2.2552 - regression_loss: 1.8208 - classification_loss: 0.4344 86/500 [====>.........................] - ETA: 1:42 - loss: 2.2507 - regression_loss: 1.8163 - classification_loss: 0.4344 87/500 [====>.........................] - ETA: 1:42 - loss: 2.2519 - regression_loss: 1.8174 - classification_loss: 0.4345 88/500 [====>.........................] - ETA: 1:42 - loss: 2.2387 - regression_loss: 1.8074 - classification_loss: 0.4313 89/500 [====>.........................] - ETA: 1:42 - loss: 2.2404 - regression_loss: 1.8087 - classification_loss: 0.4318 90/500 [====>.........................] - ETA: 1:41 - loss: 2.2402 - regression_loss: 1.8089 - classification_loss: 0.4313 91/500 [====>.........................] - ETA: 1:41 - loss: 2.2440 - regression_loss: 1.8113 - classification_loss: 0.4327 92/500 [====>.........................] - ETA: 1:41 - loss: 2.2500 - regression_loss: 1.8159 - classification_loss: 0.4341 93/500 [====>.........................] - ETA: 1:41 - loss: 2.2485 - regression_loss: 1.8146 - classification_loss: 0.4338 94/500 [====>.........................] 
- ETA: 1:40 - loss: 2.2466 - regression_loss: 1.8135 - classification_loss: 0.4331 95/500 [====>.........................] - ETA: 1:40 - loss: 2.2427 - regression_loss: 1.8114 - classification_loss: 0.4314 96/500 [====>.........................] - ETA: 1:40 - loss: 2.2275 - regression_loss: 1.7982 - classification_loss: 0.4292 97/500 [====>.........................] - ETA: 1:40 - loss: 2.2375 - regression_loss: 1.8075 - classification_loss: 0.4299 98/500 [====>.........................] - ETA: 1:39 - loss: 2.2350 - regression_loss: 1.8056 - classification_loss: 0.4294 99/500 [====>.........................] - ETA: 1:39 - loss: 2.2371 - regression_loss: 1.8074 - classification_loss: 0.4297 100/500 [=====>........................] - ETA: 1:39 - loss: 2.2378 - regression_loss: 1.8082 - classification_loss: 0.4296 101/500 [=====>........................] - ETA: 1:39 - loss: 2.2401 - regression_loss: 1.8102 - classification_loss: 0.4298 102/500 [=====>........................] - ETA: 1:38 - loss: 2.2347 - regression_loss: 1.8041 - classification_loss: 0.4306 103/500 [=====>........................] - ETA: 1:38 - loss: 2.2387 - regression_loss: 1.8078 - classification_loss: 0.4309 104/500 [=====>........................] - ETA: 1:38 - loss: 2.2417 - regression_loss: 1.8085 - classification_loss: 0.4332 105/500 [=====>........................] - ETA: 1:38 - loss: 2.2461 - regression_loss: 1.8130 - classification_loss: 0.4331 106/500 [=====>........................] - ETA: 1:38 - loss: 2.2433 - regression_loss: 1.8113 - classification_loss: 0.4320 107/500 [=====>........................] - ETA: 1:37 - loss: 2.2445 - regression_loss: 1.8131 - classification_loss: 0.4314 108/500 [=====>........................] - ETA: 1:37 - loss: 2.2453 - regression_loss: 1.8139 - classification_loss: 0.4314 109/500 [=====>........................] - ETA: 1:37 - loss: 2.2463 - regression_loss: 1.8151 - classification_loss: 0.4312 110/500 [=====>........................] 
- ETA: 1:37 - loss: 2.2541 - regression_loss: 1.8212 - classification_loss: 0.4328 111/500 [=====>........................] - ETA: 1:36 - loss: 2.2506 - regression_loss: 1.8189 - classification_loss: 0.4317 112/500 [=====>........................] - ETA: 1:36 - loss: 2.2509 - regression_loss: 1.8183 - classification_loss: 0.4326 113/500 [=====>........................] - ETA: 1:36 - loss: 2.2559 - regression_loss: 1.8225 - classification_loss: 0.4333 114/500 [=====>........................] - ETA: 1:36 - loss: 2.2573 - regression_loss: 1.8239 - classification_loss: 0.4334 115/500 [=====>........................] - ETA: 1:35 - loss: 2.2599 - regression_loss: 1.8260 - classification_loss: 0.4339 116/500 [=====>........................] - ETA: 1:35 - loss: 2.2573 - regression_loss: 1.8247 - classification_loss: 0.4327 117/500 [======>.......................] - ETA: 1:35 - loss: 2.2563 - regression_loss: 1.8241 - classification_loss: 0.4321 118/500 [======>.......................] - ETA: 1:35 - loss: 2.2579 - regression_loss: 1.8258 - classification_loss: 0.4321 119/500 [======>.......................] - ETA: 1:34 - loss: 2.2699 - regression_loss: 1.8344 - classification_loss: 0.4356 120/500 [======>.......................] - ETA: 1:34 - loss: 2.2672 - regression_loss: 1.8324 - classification_loss: 0.4348 121/500 [======>.......................] - ETA: 1:34 - loss: 2.2665 - regression_loss: 1.8322 - classification_loss: 0.4343 122/500 [======>.......................] - ETA: 1:34 - loss: 2.2557 - regression_loss: 1.8235 - classification_loss: 0.4322 123/500 [======>.......................] - ETA: 1:33 - loss: 2.2642 - regression_loss: 1.8268 - classification_loss: 0.4374 124/500 [======>.......................] - ETA: 1:33 - loss: 2.2683 - regression_loss: 1.8304 - classification_loss: 0.4379 125/500 [======>.......................] - ETA: 1:33 - loss: 2.2717 - regression_loss: 1.8333 - classification_loss: 0.4384 126/500 [======>.......................] 
- ETA: 1:33 - loss: 2.2744 - regression_loss: 1.8362 - classification_loss: 0.4382 127/500 [======>.......................] - ETA: 1:33 - loss: 2.2739 - regression_loss: 1.8360 - classification_loss: 0.4378 128/500 [======>.......................] - ETA: 1:32 - loss: 2.2718 - regression_loss: 1.8345 - classification_loss: 0.4373 129/500 [======>.......................] - ETA: 1:32 - loss: 2.2644 - regression_loss: 1.8290 - classification_loss: 0.4354 130/500 [======>.......................] - ETA: 1:32 - loss: 2.2616 - regression_loss: 1.8266 - classification_loss: 0.4350 131/500 [======>.......................] - ETA: 1:32 - loss: 2.2602 - regression_loss: 1.8262 - classification_loss: 0.4340 132/500 [======>.......................] - ETA: 1:31 - loss: 2.2580 - regression_loss: 1.8246 - classification_loss: 0.4334 133/500 [======>.......................] - ETA: 1:31 - loss: 2.2589 - regression_loss: 1.8250 - classification_loss: 0.4339 134/500 [=======>......................] - ETA: 1:31 - loss: 2.2584 - regression_loss: 1.8248 - classification_loss: 0.4336 135/500 [=======>......................] - ETA: 1:31 - loss: 2.2609 - regression_loss: 1.8269 - classification_loss: 0.4340 136/500 [=======>......................] - ETA: 1:30 - loss: 2.2590 - regression_loss: 1.8253 - classification_loss: 0.4337 137/500 [=======>......................] - ETA: 1:30 - loss: 2.2613 - regression_loss: 1.8265 - classification_loss: 0.4347 138/500 [=======>......................] - ETA: 1:30 - loss: 2.2600 - regression_loss: 1.8257 - classification_loss: 0.4344 139/500 [=======>......................] - ETA: 1:30 - loss: 2.2585 - regression_loss: 1.8248 - classification_loss: 0.4337 140/500 [=======>......................] - ETA: 1:29 - loss: 2.2589 - regression_loss: 1.8259 - classification_loss: 0.4330 141/500 [=======>......................] - ETA: 1:29 - loss: 2.2558 - regression_loss: 1.8236 - classification_loss: 0.4322 142/500 [=======>......................] 
- ETA: 1:29 - loss: 2.2567 - regression_loss: 1.8247 - classification_loss: 0.4320 143/500 [=======>......................] - ETA: 1:29 - loss: 2.2541 - regression_loss: 1.8230 - classification_loss: 0.4311 144/500 [=======>......................] - ETA: 1:28 - loss: 2.2513 - regression_loss: 1.8212 - classification_loss: 0.4301 145/500 [=======>......................] - ETA: 1:28 - loss: 2.2478 - regression_loss: 1.8178 - classification_loss: 0.4300 146/500 [=======>......................] - ETA: 1:28 - loss: 2.2469 - regression_loss: 1.8176 - classification_loss: 0.4294 147/500 [=======>......................] - ETA: 1:28 - loss: 2.2521 - regression_loss: 1.8221 - classification_loss: 0.4300 148/500 [=======>......................] - ETA: 1:27 - loss: 2.2535 - regression_loss: 1.8235 - classification_loss: 0.4300 149/500 [=======>......................] - ETA: 1:27 - loss: 2.2544 - regression_loss: 1.8246 - classification_loss: 0.4297 150/500 [========>.....................] - ETA: 1:27 - loss: 2.2548 - regression_loss: 1.8252 - classification_loss: 0.4295 151/500 [========>.....................] - ETA: 1:27 - loss: 2.2563 - regression_loss: 1.8266 - classification_loss: 0.4297 152/500 [========>.....................] - ETA: 1:26 - loss: 2.2552 - regression_loss: 1.8264 - classification_loss: 0.4288 153/500 [========>.....................] - ETA: 1:26 - loss: 2.2561 - regression_loss: 1.8277 - classification_loss: 0.4284 154/500 [========>.....................] - ETA: 1:26 - loss: 2.2541 - regression_loss: 1.8259 - classification_loss: 0.4282 155/500 [========>.....................] - ETA: 1:26 - loss: 2.2582 - regression_loss: 1.8291 - classification_loss: 0.4291 156/500 [========>.....................] - ETA: 1:25 - loss: 2.2581 - regression_loss: 1.8293 - classification_loss: 0.4288 157/500 [========>.....................] - ETA: 1:25 - loss: 2.2573 - regression_loss: 1.8283 - classification_loss: 0.4290 158/500 [========>.....................] 
- ETA: 1:25 - loss: 2.2621 - regression_loss: 1.8325 - classification_loss: 0.4295 159/500 [========>.....................] - ETA: 1:25 - loss: 2.2618 - regression_loss: 1.8325 - classification_loss: 0.4293 160/500 [========>.....................] - ETA: 1:24 - loss: 2.2612 - regression_loss: 1.8321 - classification_loss: 0.4291 161/500 [========>.....................] - ETA: 1:24 - loss: 2.2606 - regression_loss: 1.8319 - classification_loss: 0.4287 162/500 [========>.....................] - ETA: 1:24 - loss: 2.2622 - regression_loss: 1.8339 - classification_loss: 0.4283 163/500 [========>.....................] - ETA: 1:24 - loss: 2.2633 - regression_loss: 1.8347 - classification_loss: 0.4287 164/500 [========>.....................] - ETA: 1:23 - loss: 2.2581 - regression_loss: 1.8300 - classification_loss: 0.4281 165/500 [========>.....................] - ETA: 1:23 - loss: 2.2599 - regression_loss: 1.8320 - classification_loss: 0.4279 166/500 [========>.....................] - ETA: 1:23 - loss: 2.2618 - regression_loss: 1.8334 - classification_loss: 0.4284 167/500 [=========>....................] - ETA: 1:23 - loss: 2.2628 - regression_loss: 1.8319 - classification_loss: 0.4309 168/500 [=========>....................] - ETA: 1:22 - loss: 2.2660 - regression_loss: 1.8345 - classification_loss: 0.4315 169/500 [=========>....................] - ETA: 1:22 - loss: 2.2661 - regression_loss: 1.8349 - classification_loss: 0.4312 170/500 [=========>....................] - ETA: 1:22 - loss: 2.2649 - regression_loss: 1.8339 - classification_loss: 0.4310 171/500 [=========>....................] - ETA: 1:22 - loss: 2.2664 - regression_loss: 1.8353 - classification_loss: 0.4311 172/500 [=========>....................] - ETA: 1:21 - loss: 2.2658 - regression_loss: 1.8349 - classification_loss: 0.4309 173/500 [=========>....................] - ETA: 1:22 - loss: 2.2649 - regression_loss: 1.8345 - classification_loss: 0.4304 174/500 [=========>....................] 
- ETA: 1:22 - loss: 2.2675 - regression_loss: 1.8366 - classification_loss: 0.4308 175/500 [=========>....................] - ETA: 1:22 - loss: 2.2637 - regression_loss: 1.8335 - classification_loss: 0.4302 176/500 [=========>....................] - ETA: 1:21 - loss: 2.2646 - regression_loss: 1.8344 - classification_loss: 0.4302 177/500 [=========>....................] - ETA: 1:21 - loss: 2.2655 - regression_loss: 1.8350 - classification_loss: 0.4304 178/500 [=========>....................] - ETA: 1:21 - loss: 2.2642 - regression_loss: 1.8340 - classification_loss: 0.4303 179/500 [=========>....................] - ETA: 1:21 - loss: 2.2606 - regression_loss: 1.8313 - classification_loss: 0.4292 180/500 [=========>....................] - ETA: 1:20 - loss: 2.2597 - regression_loss: 1.8309 - classification_loss: 0.4288 181/500 [=========>....................] - ETA: 1:20 - loss: 2.2648 - regression_loss: 1.8343 - classification_loss: 0.4305 182/500 [=========>....................] - ETA: 1:20 - loss: 2.2607 - regression_loss: 1.8312 - classification_loss: 0.4295 183/500 [=========>....................] - ETA: 1:20 - loss: 2.2665 - regression_loss: 1.8359 - classification_loss: 0.4306 184/500 [==========>...................] - ETA: 1:19 - loss: 2.2657 - regression_loss: 1.8348 - classification_loss: 0.4309 185/500 [==========>...................] - ETA: 1:19 - loss: 2.2640 - regression_loss: 1.8337 - classification_loss: 0.4303 186/500 [==========>...................] - ETA: 1:19 - loss: 2.2647 - regression_loss: 1.8345 - classification_loss: 0.4302 187/500 [==========>...................] - ETA: 1:19 - loss: 2.2649 - regression_loss: 1.8340 - classification_loss: 0.4309 188/500 [==========>...................] - ETA: 1:18 - loss: 2.2666 - regression_loss: 1.8353 - classification_loss: 0.4313 189/500 [==========>...................] - ETA: 1:18 - loss: 2.2633 - regression_loss: 1.8329 - classification_loss: 0.4305 190/500 [==========>...................] 
- ETA: 1:18 - loss: 2.2598 - regression_loss: 1.8300 - classification_loss: 0.4299 191/500 [==========>...................] - ETA: 1:17 - loss: 2.2586 - regression_loss: 1.8294 - classification_loss: 0.4292 192/500 [==========>...................] - ETA: 1:17 - loss: 2.2590 - regression_loss: 1.8296 - classification_loss: 0.4295 193/500 [==========>...................] - ETA: 1:17 - loss: 2.2611 - regression_loss: 1.8311 - classification_loss: 0.4299 194/500 [==========>...................] - ETA: 1:17 - loss: 2.2599 - regression_loss: 1.8303 - classification_loss: 0.4296 195/500 [==========>...................] - ETA: 1:16 - loss: 2.2600 - regression_loss: 1.8306 - classification_loss: 0.4293 196/500 [==========>...................] - ETA: 1:16 - loss: 2.2614 - regression_loss: 1.8315 - classification_loss: 0.4298 197/500 [==========>...................] - ETA: 1:16 - loss: 2.2603 - regression_loss: 1.8308 - classification_loss: 0.4294 198/500 [==========>...................] - ETA: 1:16 - loss: 2.2601 - regression_loss: 1.8311 - classification_loss: 0.4290 199/500 [==========>...................] - ETA: 1:15 - loss: 2.2606 - regression_loss: 1.8314 - classification_loss: 0.4292 200/500 [===========>..................] - ETA: 1:15 - loss: 2.2607 - regression_loss: 1.8316 - classification_loss: 0.4291 201/500 [===========>..................] - ETA: 1:15 - loss: 2.2593 - regression_loss: 1.8307 - classification_loss: 0.4286 202/500 [===========>..................] - ETA: 1:15 - loss: 2.2625 - regression_loss: 1.8337 - classification_loss: 0.4289 203/500 [===========>..................] - ETA: 1:14 - loss: 2.2588 - regression_loss: 1.8306 - classification_loss: 0.4282 204/500 [===========>..................] - ETA: 1:14 - loss: 2.2583 - regression_loss: 1.8304 - classification_loss: 0.4278 205/500 [===========>..................] - ETA: 1:14 - loss: 2.2594 - regression_loss: 1.8312 - classification_loss: 0.4282 206/500 [===========>..................] 
- ETA: 1:14 - loss: 2.2613 - regression_loss: 1.8331 - classification_loss: 0.4281 207/500 [===========>..................] - ETA: 1:13 - loss: 2.2620 - regression_loss: 1.8333 - classification_loss: 0.4286 208/500 [===========>..................] - ETA: 1:13 - loss: 2.2631 - regression_loss: 1.8341 - classification_loss: 0.4290 209/500 [===========>..................] - ETA: 1:13 - loss: 2.2627 - regression_loss: 1.8340 - classification_loss: 0.4287 210/500 [===========>..................] - ETA: 1:13 - loss: 2.2631 - regression_loss: 1.8346 - classification_loss: 0.4285 211/500 [===========>..................] - ETA: 1:12 - loss: 2.2628 - regression_loss: 1.8341 - classification_loss: 0.4287 212/500 [===========>..................] - ETA: 1:12 - loss: 2.2645 - regression_loss: 1.8356 - classification_loss: 0.4289 213/500 [===========>..................] - ETA: 1:12 - loss: 2.2610 - regression_loss: 1.8329 - classification_loss: 0.4281 214/500 [===========>..................] - ETA: 1:12 - loss: 2.2608 - regression_loss: 1.8331 - classification_loss: 0.4277 215/500 [===========>..................] - ETA: 1:11 - loss: 2.2610 - regression_loss: 1.8337 - classification_loss: 0.4274 216/500 [===========>..................] - ETA: 1:11 - loss: 2.2596 - regression_loss: 1.8328 - classification_loss: 0.4268 217/500 [============>.................] - ETA: 1:11 - loss: 2.2604 - regression_loss: 1.8334 - classification_loss: 0.4270 218/500 [============>.................] - ETA: 1:11 - loss: 2.2591 - regression_loss: 1.8325 - classification_loss: 0.4265 219/500 [============>.................] - ETA: 1:10 - loss: 2.2597 - regression_loss: 1.8330 - classification_loss: 0.4267 220/500 [============>.................] - ETA: 1:10 - loss: 2.2554 - regression_loss: 1.8295 - classification_loss: 0.4259 221/500 [============>.................] - ETA: 1:10 - loss: 2.2548 - regression_loss: 1.8292 - classification_loss: 0.4255 222/500 [============>.................] 
- ETA: 1:10 - loss: 2.2530 - regression_loss: 1.8279 - classification_loss: 0.4250 223/500 [============>.................] - ETA: 1:09 - loss: 2.2546 - regression_loss: 1.8294 - classification_loss: 0.4253 224/500 [============>.................] - ETA: 1:09 - loss: 2.2562 - regression_loss: 1.8304 - classification_loss: 0.4258 225/500 [============>.................] - ETA: 1:09 - loss: 2.2581 - regression_loss: 1.8314 - classification_loss: 0.4268 226/500 [============>.................] - ETA: 1:09 - loss: 2.2598 - regression_loss: 1.8329 - classification_loss: 0.4269 227/500 [============>.................] - ETA: 1:08 - loss: 2.2613 - regression_loss: 1.8344 - classification_loss: 0.4268 228/500 [============>.................] - ETA: 1:08 - loss: 2.2619 - regression_loss: 1.8336 - classification_loss: 0.4283 229/500 [============>.................] - ETA: 1:08 - loss: 2.2584 - regression_loss: 1.8302 - classification_loss: 0.4282 230/500 [============>.................] - ETA: 1:08 - loss: 2.2595 - regression_loss: 1.8316 - classification_loss: 0.4279 231/500 [============>.................] - ETA: 1:07 - loss: 2.2618 - regression_loss: 1.8339 - classification_loss: 0.4278 232/500 [============>.................] - ETA: 1:07 - loss: 2.2614 - regression_loss: 1.8338 - classification_loss: 0.4277 233/500 [============>.................] - ETA: 1:07 - loss: 2.2616 - regression_loss: 1.8338 - classification_loss: 0.4278 234/500 [=============>................] - ETA: 1:07 - loss: 2.2604 - regression_loss: 1.8329 - classification_loss: 0.4275 235/500 [=============>................] - ETA: 1:06 - loss: 2.2590 - regression_loss: 1.8317 - classification_loss: 0.4274 236/500 [=============>................] - ETA: 1:06 - loss: 2.2601 - regression_loss: 1.8324 - classification_loss: 0.4277 237/500 [=============>................] - ETA: 1:06 - loss: 2.2590 - regression_loss: 1.8314 - classification_loss: 0.4276 238/500 [=============>................] 
- ETA: 1:06 - loss: 2.2581 - regression_loss: 1.8309 - classification_loss: 0.4272 239/500 [=============>................] - ETA: 1:05 - loss: 2.2569 - regression_loss: 1.8296 - classification_loss: 0.4273 240/500 [=============>................] - ETA: 1:05 - loss: 2.2521 - regression_loss: 1.8255 - classification_loss: 0.4266 241/500 [=============>................] - ETA: 1:05 - loss: 2.2512 - regression_loss: 1.8251 - classification_loss: 0.4261 242/500 [=============>................] - ETA: 1:04 - loss: 2.2493 - regression_loss: 1.8235 - classification_loss: 0.4258 243/500 [=============>................] - ETA: 1:04 - loss: 2.2506 - regression_loss: 1.8246 - classification_loss: 0.4259 244/500 [=============>................] - ETA: 1:04 - loss: 2.2457 - regression_loss: 1.8206 - classification_loss: 0.4251 245/500 [=============>................] - ETA: 1:04 - loss: 2.2465 - regression_loss: 1.8211 - classification_loss: 0.4254 246/500 [=============>................] - ETA: 1:03 - loss: 2.2477 - regression_loss: 1.8225 - classification_loss: 0.4251 247/500 [=============>................] - ETA: 1:03 - loss: 2.2470 - regression_loss: 1.8219 - classification_loss: 0.4251 248/500 [=============>................] - ETA: 1:03 - loss: 2.2479 - regression_loss: 1.8228 - classification_loss: 0.4251 249/500 [=============>................] - ETA: 1:03 - loss: 2.2460 - regression_loss: 1.8208 - classification_loss: 0.4252 250/500 [==============>...............] - ETA: 1:02 - loss: 2.2461 - regression_loss: 1.8208 - classification_loss: 0.4253 251/500 [==============>...............] - ETA: 1:02 - loss: 2.2481 - regression_loss: 1.8226 - classification_loss: 0.4255 252/500 [==============>...............] - ETA: 1:02 - loss: 2.2470 - regression_loss: 1.8216 - classification_loss: 0.4254 253/500 [==============>...............] - ETA: 1:02 - loss: 2.2500 - regression_loss: 1.8231 - classification_loss: 0.4269 254/500 [==============>...............] 
[... per-batch progress updates for epoch 22 (batches 255–499) elided; running loss ≈ 2.24–2.26 ...]
500/500 [==============================] - 126s 251ms/step - loss: 2.2519 - regression_loss: 1.8251 - classification_loss: 0.4268
1172 instances of class plum with average precision: 0.4340
mAP: 0.4340
Epoch 00022: saving model to ./training/snapshots/resnet50_pascal_22.h5
Epoch 23/150
[... per-batch progress updates for epoch 23 (batches 1–89) elided; running loss ≈ 2.18–2.32 ...]
- ETA: 1:42 - loss: 2.3152 - regression_loss: 1.8692 - classification_loss: 0.4459 90/500 [====>.........................] - ETA: 1:42 - loss: 2.3184 - regression_loss: 1.8703 - classification_loss: 0.4480 91/500 [====>.........................] - ETA: 1:41 - loss: 2.3197 - regression_loss: 1.8721 - classification_loss: 0.4476 92/500 [====>.........................] - ETA: 1:41 - loss: 2.3117 - regression_loss: 1.8656 - classification_loss: 0.4461 93/500 [====>.........................] - ETA: 1:41 - loss: 2.3042 - regression_loss: 1.8599 - classification_loss: 0.4442 94/500 [====>.........................] - ETA: 1:40 - loss: 2.3012 - regression_loss: 1.8580 - classification_loss: 0.4432 95/500 [====>.........................] - ETA: 1:40 - loss: 2.3025 - regression_loss: 1.8598 - classification_loss: 0.4427 96/500 [====>.........................] - ETA: 1:40 - loss: 2.3004 - regression_loss: 1.8580 - classification_loss: 0.4424 97/500 [====>.........................] - ETA: 1:40 - loss: 2.2990 - regression_loss: 1.8569 - classification_loss: 0.4421 98/500 [====>.........................] - ETA: 1:40 - loss: 2.3004 - regression_loss: 1.8570 - classification_loss: 0.4434 99/500 [====>.........................] - ETA: 1:39 - loss: 2.2982 - regression_loss: 1.8559 - classification_loss: 0.4423 100/500 [=====>........................] - ETA: 1:39 - loss: 2.3000 - regression_loss: 1.8579 - classification_loss: 0.4421 101/500 [=====>........................] - ETA: 1:39 - loss: 2.2966 - regression_loss: 1.8553 - classification_loss: 0.4413 102/500 [=====>........................] - ETA: 1:39 - loss: 2.2973 - regression_loss: 1.8559 - classification_loss: 0.4413 103/500 [=====>........................] - ETA: 1:38 - loss: 2.2977 - regression_loss: 1.8558 - classification_loss: 0.4419 104/500 [=====>........................] - ETA: 1:38 - loss: 2.3034 - regression_loss: 1.8611 - classification_loss: 0.4423 105/500 [=====>........................] 
- ETA: 1:38 - loss: 2.3013 - regression_loss: 1.8602 - classification_loss: 0.4411 106/500 [=====>........................] - ETA: 1:37 - loss: 2.2958 - regression_loss: 1.8550 - classification_loss: 0.4408 107/500 [=====>........................] - ETA: 1:37 - loss: 2.2980 - regression_loss: 1.8578 - classification_loss: 0.4402 108/500 [=====>........................] - ETA: 1:37 - loss: 2.2989 - regression_loss: 1.8598 - classification_loss: 0.4391 109/500 [=====>........................] - ETA: 1:37 - loss: 2.2974 - regression_loss: 1.8590 - classification_loss: 0.4384 110/500 [=====>........................] - ETA: 1:37 - loss: 2.2979 - regression_loss: 1.8599 - classification_loss: 0.4380 111/500 [=====>........................] - ETA: 1:36 - loss: 2.2974 - regression_loss: 1.8601 - classification_loss: 0.4373 112/500 [=====>........................] - ETA: 1:36 - loss: 2.3000 - regression_loss: 1.8625 - classification_loss: 0.4375 113/500 [=====>........................] - ETA: 1:36 - loss: 2.3016 - regression_loss: 1.8636 - classification_loss: 0.4379 114/500 [=====>........................] - ETA: 1:36 - loss: 2.3032 - regression_loss: 1.8655 - classification_loss: 0.4377 115/500 [=====>........................] - ETA: 1:35 - loss: 2.3036 - regression_loss: 1.8653 - classification_loss: 0.4383 116/500 [=====>........................] - ETA: 1:35 - loss: 2.3070 - regression_loss: 1.8691 - classification_loss: 0.4379 117/500 [======>.......................] - ETA: 1:35 - loss: 2.3029 - regression_loss: 1.8654 - classification_loss: 0.4375 118/500 [======>.......................] - ETA: 1:35 - loss: 2.3050 - regression_loss: 1.8672 - classification_loss: 0.4378 119/500 [======>.......................] - ETA: 1:34 - loss: 2.2960 - regression_loss: 1.8605 - classification_loss: 0.4355 120/500 [======>.......................] - ETA: 1:34 - loss: 2.2986 - regression_loss: 1.8630 - classification_loss: 0.4357 121/500 [======>.......................] 
- ETA: 1:34 - loss: 2.3021 - regression_loss: 1.8659 - classification_loss: 0.4361 122/500 [======>.......................] - ETA: 1:34 - loss: 2.2991 - regression_loss: 1.8629 - classification_loss: 0.4362 123/500 [======>.......................] - ETA: 1:33 - loss: 2.2881 - regression_loss: 1.8543 - classification_loss: 0.4338 124/500 [======>.......................] - ETA: 1:33 - loss: 2.2964 - regression_loss: 1.8596 - classification_loss: 0.4369 125/500 [======>.......................] - ETA: 1:33 - loss: 2.3025 - regression_loss: 1.8641 - classification_loss: 0.4384 126/500 [======>.......................] - ETA: 1:33 - loss: 2.3046 - regression_loss: 1.8656 - classification_loss: 0.4390 127/500 [======>.......................] - ETA: 1:32 - loss: 2.3051 - regression_loss: 1.8666 - classification_loss: 0.4384 128/500 [======>.......................] - ETA: 1:32 - loss: 2.3049 - regression_loss: 1.8668 - classification_loss: 0.4382 129/500 [======>.......................] - ETA: 1:32 - loss: 2.3064 - regression_loss: 1.8677 - classification_loss: 0.4387 130/500 [======>.......................] - ETA: 1:32 - loss: 2.3049 - regression_loss: 1.8667 - classification_loss: 0.4382 131/500 [======>.......................] - ETA: 1:31 - loss: 2.3029 - regression_loss: 1.8656 - classification_loss: 0.4373 132/500 [======>.......................] - ETA: 1:31 - loss: 2.3054 - regression_loss: 1.8681 - classification_loss: 0.4373 133/500 [======>.......................] - ETA: 1:31 - loss: 2.3052 - regression_loss: 1.8684 - classification_loss: 0.4368 134/500 [=======>......................] - ETA: 1:31 - loss: 2.3018 - regression_loss: 1.8657 - classification_loss: 0.4362 135/500 [=======>......................] - ETA: 1:30 - loss: 2.3004 - regression_loss: 1.8646 - classification_loss: 0.4357 136/500 [=======>......................] - ETA: 1:30 - loss: 2.2999 - regression_loss: 1.8646 - classification_loss: 0.4353 137/500 [=======>......................] 
- ETA: 1:30 - loss: 2.2996 - regression_loss: 1.8644 - classification_loss: 0.4352 138/500 [=======>......................] - ETA: 1:30 - loss: 2.2977 - regression_loss: 1.8631 - classification_loss: 0.4346 139/500 [=======>......................] - ETA: 1:30 - loss: 2.2991 - regression_loss: 1.8647 - classification_loss: 0.4345 140/500 [=======>......................] - ETA: 1:29 - loss: 2.2973 - regression_loss: 1.8635 - classification_loss: 0.4338 141/500 [=======>......................] - ETA: 1:29 - loss: 2.2980 - regression_loss: 1.8641 - classification_loss: 0.4340 142/500 [=======>......................] - ETA: 1:29 - loss: 2.3044 - regression_loss: 1.8700 - classification_loss: 0.4344 143/500 [=======>......................] - ETA: 1:29 - loss: 2.3045 - regression_loss: 1.8702 - classification_loss: 0.4344 144/500 [=======>......................] - ETA: 1:28 - loss: 2.3033 - regression_loss: 1.8697 - classification_loss: 0.4336 145/500 [=======>......................] - ETA: 1:28 - loss: 2.3011 - regression_loss: 1.8679 - classification_loss: 0.4332 146/500 [=======>......................] - ETA: 1:28 - loss: 2.2960 - regression_loss: 1.8636 - classification_loss: 0.4324 147/500 [=======>......................] - ETA: 1:28 - loss: 2.2970 - regression_loss: 1.8654 - classification_loss: 0.4316 148/500 [=======>......................] - ETA: 1:27 - loss: 2.2956 - regression_loss: 1.8645 - classification_loss: 0.4311 149/500 [=======>......................] - ETA: 1:27 - loss: 2.2931 - regression_loss: 1.8627 - classification_loss: 0.4304 150/500 [========>.....................] - ETA: 1:27 - loss: 2.2955 - regression_loss: 1.8643 - classification_loss: 0.4312 151/500 [========>.....................] - ETA: 1:27 - loss: 2.2965 - regression_loss: 1.8658 - classification_loss: 0.4307 152/500 [========>.....................] - ETA: 1:26 - loss: 2.2973 - regression_loss: 1.8660 - classification_loss: 0.4313 153/500 [========>.....................] 
- ETA: 1:26 - loss: 2.3004 - regression_loss: 1.8681 - classification_loss: 0.4322 154/500 [========>.....................] - ETA: 1:26 - loss: 2.2994 - regression_loss: 1.8676 - classification_loss: 0.4318 155/500 [========>.....................] - ETA: 1:26 - loss: 2.3001 - regression_loss: 1.8683 - classification_loss: 0.4318 156/500 [========>.....................] - ETA: 1:25 - loss: 2.2986 - regression_loss: 1.8673 - classification_loss: 0.4313 157/500 [========>.....................] - ETA: 1:25 - loss: 2.2920 - regression_loss: 1.8612 - classification_loss: 0.4308 158/500 [========>.....................] - ETA: 1:25 - loss: 2.2890 - regression_loss: 1.8566 - classification_loss: 0.4324 159/500 [========>.....................] - ETA: 1:25 - loss: 2.2870 - regression_loss: 1.8556 - classification_loss: 0.4314 160/500 [========>.....................] - ETA: 1:24 - loss: 2.2858 - regression_loss: 1.8547 - classification_loss: 0.4311 161/500 [========>.....................] - ETA: 1:24 - loss: 2.2789 - regression_loss: 1.8490 - classification_loss: 0.4299 162/500 [========>.....................] - ETA: 1:24 - loss: 2.2829 - regression_loss: 1.8520 - classification_loss: 0.4308 163/500 [========>.....................] - ETA: 1:24 - loss: 2.2804 - regression_loss: 1.8500 - classification_loss: 0.4304 164/500 [========>.....................] - ETA: 1:23 - loss: 2.2769 - regression_loss: 1.8469 - classification_loss: 0.4300 165/500 [========>.....................] - ETA: 1:23 - loss: 2.2782 - regression_loss: 1.8479 - classification_loss: 0.4303 166/500 [========>.....................] - ETA: 1:23 - loss: 2.2781 - regression_loss: 1.8479 - classification_loss: 0.4302 167/500 [=========>....................] - ETA: 1:23 - loss: 2.2802 - regression_loss: 1.8500 - classification_loss: 0.4302 168/500 [=========>....................] - ETA: 1:22 - loss: 2.2826 - regression_loss: 1.8524 - classification_loss: 0.4302 169/500 [=========>....................] 
- ETA: 1:22 - loss: 2.2826 - regression_loss: 1.8526 - classification_loss: 0.4299 170/500 [=========>....................] - ETA: 1:22 - loss: 2.2821 - regression_loss: 1.8525 - classification_loss: 0.4296 171/500 [=========>....................] - ETA: 1:22 - loss: 2.2808 - regression_loss: 1.8516 - classification_loss: 0.4292 172/500 [=========>....................] - ETA: 1:21 - loss: 2.2810 - regression_loss: 1.8520 - classification_loss: 0.4290 173/500 [=========>....................] - ETA: 1:21 - loss: 2.2826 - regression_loss: 1.8530 - classification_loss: 0.4296 174/500 [=========>....................] - ETA: 1:21 - loss: 2.2828 - regression_loss: 1.8537 - classification_loss: 0.4291 175/500 [=========>....................] - ETA: 1:21 - loss: 2.2770 - regression_loss: 1.8490 - classification_loss: 0.4280 176/500 [=========>....................] - ETA: 1:20 - loss: 2.2722 - regression_loss: 1.8450 - classification_loss: 0.4272 177/500 [=========>....................] - ETA: 1:20 - loss: 2.2641 - regression_loss: 1.8374 - classification_loss: 0.4267 178/500 [=========>....................] - ETA: 1:20 - loss: 2.2669 - regression_loss: 1.8393 - classification_loss: 0.4277 179/500 [=========>....................] - ETA: 1:20 - loss: 2.2656 - regression_loss: 1.8381 - classification_loss: 0.4275 180/500 [=========>....................] - ETA: 1:19 - loss: 2.2696 - regression_loss: 1.8401 - classification_loss: 0.4296 181/500 [=========>....................] - ETA: 1:19 - loss: 2.2682 - regression_loss: 1.8393 - classification_loss: 0.4289 182/500 [=========>....................] - ETA: 1:19 - loss: 2.2742 - regression_loss: 1.8435 - classification_loss: 0.4307 183/500 [=========>....................] - ETA: 1:19 - loss: 2.2766 - regression_loss: 1.8459 - classification_loss: 0.4307 184/500 [==========>...................] - ETA: 1:18 - loss: 2.2757 - regression_loss: 1.8455 - classification_loss: 0.4302 185/500 [==========>...................] 
- ETA: 1:18 - loss: 2.2760 - regression_loss: 1.8459 - classification_loss: 0.4302 186/500 [==========>...................] - ETA: 1:18 - loss: 2.2769 - regression_loss: 1.8468 - classification_loss: 0.4301 187/500 [==========>...................] - ETA: 1:18 - loss: 2.2764 - regression_loss: 1.8465 - classification_loss: 0.4299 188/500 [==========>...................] - ETA: 1:17 - loss: 2.2785 - regression_loss: 1.8482 - classification_loss: 0.4303 189/500 [==========>...................] - ETA: 1:17 - loss: 2.2798 - regression_loss: 1.8495 - classification_loss: 0.4304 190/500 [==========>...................] - ETA: 1:17 - loss: 2.2799 - regression_loss: 1.8495 - classification_loss: 0.4303 191/500 [==========>...................] - ETA: 1:17 - loss: 2.2792 - regression_loss: 1.8484 - classification_loss: 0.4308 192/500 [==========>...................] - ETA: 1:16 - loss: 2.2775 - regression_loss: 1.8468 - classification_loss: 0.4308 193/500 [==========>...................] - ETA: 1:16 - loss: 2.2784 - regression_loss: 1.8478 - classification_loss: 0.4306 194/500 [==========>...................] - ETA: 1:16 - loss: 2.2800 - regression_loss: 1.8490 - classification_loss: 0.4309 195/500 [==========>...................] - ETA: 1:16 - loss: 2.2815 - regression_loss: 1.8490 - classification_loss: 0.4325 196/500 [==========>...................] - ETA: 1:15 - loss: 2.2818 - regression_loss: 1.8495 - classification_loss: 0.4323 197/500 [==========>...................] - ETA: 1:15 - loss: 2.2796 - regression_loss: 1.8479 - classification_loss: 0.4317 198/500 [==========>...................] - ETA: 1:15 - loss: 2.2786 - regression_loss: 1.8473 - classification_loss: 0.4312 199/500 [==========>...................] - ETA: 1:15 - loss: 2.2776 - regression_loss: 1.8463 - classification_loss: 0.4313 200/500 [===========>..................] - ETA: 1:14 - loss: 2.2729 - regression_loss: 1.8423 - classification_loss: 0.4305 201/500 [===========>..................] 
- ETA: 1:14 - loss: 2.2733 - regression_loss: 1.8425 - classification_loss: 0.4309 202/500 [===========>..................] - ETA: 1:14 - loss: 2.2735 - regression_loss: 1.8422 - classification_loss: 0.4313 203/500 [===========>..................] - ETA: 1:14 - loss: 2.2734 - regression_loss: 1.8417 - classification_loss: 0.4317 204/500 [===========>..................] - ETA: 1:14 - loss: 2.2679 - regression_loss: 1.8371 - classification_loss: 0.4308 205/500 [===========>..................] - ETA: 1:13 - loss: 2.2659 - regression_loss: 1.8352 - classification_loss: 0.4307 206/500 [===========>..................] - ETA: 1:13 - loss: 2.2681 - regression_loss: 1.8368 - classification_loss: 0.4313 207/500 [===========>..................] - ETA: 1:13 - loss: 2.2656 - regression_loss: 1.8346 - classification_loss: 0.4310 208/500 [===========>..................] - ETA: 1:13 - loss: 2.2654 - regression_loss: 1.8341 - classification_loss: 0.4313 209/500 [===========>..................] - ETA: 1:12 - loss: 2.2656 - regression_loss: 1.8345 - classification_loss: 0.4311 210/500 [===========>..................] - ETA: 1:12 - loss: 2.2665 - regression_loss: 1.8357 - classification_loss: 0.4308 211/500 [===========>..................] - ETA: 1:12 - loss: 2.2657 - regression_loss: 1.8355 - classification_loss: 0.4302 212/500 [===========>..................] - ETA: 1:12 - loss: 2.2637 - regression_loss: 1.8340 - classification_loss: 0.4297 213/500 [===========>..................] - ETA: 1:11 - loss: 2.2680 - regression_loss: 1.8373 - classification_loss: 0.4307 214/500 [===========>..................] - ETA: 1:11 - loss: 2.2682 - regression_loss: 1.8377 - classification_loss: 0.4306 215/500 [===========>..................] - ETA: 1:11 - loss: 2.2725 - regression_loss: 1.8385 - classification_loss: 0.4340 216/500 [===========>..................] - ETA: 1:11 - loss: 2.2730 - regression_loss: 1.8394 - classification_loss: 0.4336 217/500 [============>.................] 
- ETA: 1:10 - loss: 2.2712 - regression_loss: 1.8381 - classification_loss: 0.4331 218/500 [============>.................] - ETA: 1:10 - loss: 2.2692 - regression_loss: 1.8372 - classification_loss: 0.4321 219/500 [============>.................] - ETA: 1:10 - loss: 2.2713 - regression_loss: 1.8392 - classification_loss: 0.4321 220/500 [============>.................] - ETA: 1:10 - loss: 2.2701 - regression_loss: 1.8384 - classification_loss: 0.4317 221/500 [============>.................] - ETA: 1:09 - loss: 2.2723 - regression_loss: 1.8399 - classification_loss: 0.4325 222/500 [============>.................] - ETA: 1:09 - loss: 2.2739 - regression_loss: 1.8410 - classification_loss: 0.4329 223/500 [============>.................] - ETA: 1:09 - loss: 2.2744 - regression_loss: 1.8416 - classification_loss: 0.4327 224/500 [============>.................] - ETA: 1:09 - loss: 2.2730 - regression_loss: 1.8410 - classification_loss: 0.4320 225/500 [============>.................] - ETA: 1:08 - loss: 2.2740 - regression_loss: 1.8416 - classification_loss: 0.4324 226/500 [============>.................] - ETA: 1:08 - loss: 2.2722 - regression_loss: 1.8403 - classification_loss: 0.4320 227/500 [============>.................] - ETA: 1:08 - loss: 2.2732 - regression_loss: 1.8407 - classification_loss: 0.4325 228/500 [============>.................] - ETA: 1:08 - loss: 2.2704 - regression_loss: 1.8383 - classification_loss: 0.4322 229/500 [============>.................] - ETA: 1:07 - loss: 2.2723 - regression_loss: 1.8397 - classification_loss: 0.4327 230/500 [============>.................] - ETA: 1:07 - loss: 2.2721 - regression_loss: 1.8395 - classification_loss: 0.4326 231/500 [============>.................] - ETA: 1:07 - loss: 2.2728 - regression_loss: 1.8403 - classification_loss: 0.4325 232/500 [============>.................] - ETA: 1:07 - loss: 2.2753 - regression_loss: 1.8422 - classification_loss: 0.4331 233/500 [============>.................] 
- ETA: 1:06 - loss: 2.2753 - regression_loss: 1.8423 - classification_loss: 0.4330 234/500 [=============>................] - ETA: 1:06 - loss: 2.2727 - regression_loss: 1.8401 - classification_loss: 0.4326 235/500 [=============>................] - ETA: 1:06 - loss: 2.2707 - regression_loss: 1.8388 - classification_loss: 0.4319 236/500 [=============>................] - ETA: 1:06 - loss: 2.2673 - regression_loss: 1.8359 - classification_loss: 0.4314 237/500 [=============>................] - ETA: 1:05 - loss: 2.2655 - regression_loss: 1.8343 - classification_loss: 0.4311 238/500 [=============>................] - ETA: 1:05 - loss: 2.2649 - regression_loss: 1.8341 - classification_loss: 0.4309 239/500 [=============>................] - ETA: 1:05 - loss: 2.2674 - regression_loss: 1.8357 - classification_loss: 0.4317 240/500 [=============>................] - ETA: 1:05 - loss: 2.2640 - regression_loss: 1.8327 - classification_loss: 0.4312 241/500 [=============>................] - ETA: 1:04 - loss: 2.2595 - regression_loss: 1.8289 - classification_loss: 0.4306 242/500 [=============>................] - ETA: 1:04 - loss: 2.2585 - regression_loss: 1.8282 - classification_loss: 0.4303 243/500 [=============>................] - ETA: 1:04 - loss: 2.2603 - regression_loss: 1.8292 - classification_loss: 0.4310 244/500 [=============>................] - ETA: 1:04 - loss: 2.2593 - regression_loss: 1.8285 - classification_loss: 0.4308 245/500 [=============>................] - ETA: 1:03 - loss: 2.2643 - regression_loss: 1.8328 - classification_loss: 0.4315 246/500 [=============>................] - ETA: 1:03 - loss: 2.2638 - regression_loss: 1.8325 - classification_loss: 0.4313 247/500 [=============>................] - ETA: 1:03 - loss: 2.2630 - regression_loss: 1.8320 - classification_loss: 0.4310 248/500 [=============>................] - ETA: 1:03 - loss: 2.2633 - regression_loss: 1.8323 - classification_loss: 0.4309 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.2644 - regression_loss: 1.8334 - classification_loss: 0.4310 250/500 [==============>...............] - ETA: 1:02 - loss: 2.2648 - regression_loss: 1.8339 - classification_loss: 0.4309 251/500 [==============>...............] - ETA: 1:02 - loss: 2.2690 - regression_loss: 1.8373 - classification_loss: 0.4317 252/500 [==============>...............] - ETA: 1:02 - loss: 2.2677 - regression_loss: 1.8363 - classification_loss: 0.4315 253/500 [==============>...............] - ETA: 1:01 - loss: 2.2677 - regression_loss: 1.8360 - classification_loss: 0.4317 254/500 [==============>...............] - ETA: 1:01 - loss: 2.2683 - regression_loss: 1.8367 - classification_loss: 0.4315 255/500 [==============>...............] - ETA: 1:01 - loss: 2.2683 - regression_loss: 1.8368 - classification_loss: 0.4315 256/500 [==============>...............] - ETA: 1:01 - loss: 2.2673 - regression_loss: 1.8362 - classification_loss: 0.4311 257/500 [==============>...............] - ETA: 1:00 - loss: 2.2672 - regression_loss: 1.8366 - classification_loss: 0.4306 258/500 [==============>...............] - ETA: 1:00 - loss: 2.2688 - regression_loss: 1.8378 - classification_loss: 0.4309 259/500 [==============>...............] - ETA: 1:00 - loss: 2.2718 - regression_loss: 1.8392 - classification_loss: 0.4326 260/500 [==============>...............] - ETA: 1:00 - loss: 2.2726 - regression_loss: 1.8397 - classification_loss: 0.4329 261/500 [==============>...............] - ETA: 59s - loss: 2.2702 - regression_loss: 1.8375 - classification_loss: 0.4326  262/500 [==============>...............] - ETA: 59s - loss: 2.2677 - regression_loss: 1.8357 - classification_loss: 0.4320 263/500 [==============>...............] - ETA: 59s - loss: 2.2678 - regression_loss: 1.8359 - classification_loss: 0.4319 264/500 [==============>...............] - ETA: 59s - loss: 2.2672 - regression_loss: 1.8354 - classification_loss: 0.4319 265/500 [==============>...............] 
- ETA: 58s - loss: 2.2668 - regression_loss: 1.8353 - classification_loss: 0.4315 266/500 [==============>...............] - ETA: 58s - loss: 2.2672 - regression_loss: 1.8357 - classification_loss: 0.4315 267/500 [===============>..............] - ETA: 58s - loss: 2.2659 - regression_loss: 1.8345 - classification_loss: 0.4314 268/500 [===============>..............] - ETA: 57s - loss: 2.2648 - regression_loss: 1.8338 - classification_loss: 0.4309 269/500 [===============>..............] - ETA: 57s - loss: 2.2678 - regression_loss: 1.8365 - classification_loss: 0.4312 270/500 [===============>..............] - ETA: 57s - loss: 2.2666 - regression_loss: 1.8357 - classification_loss: 0.4308 271/500 [===============>..............] - ETA: 57s - loss: 2.2661 - regression_loss: 1.8354 - classification_loss: 0.4307 272/500 [===============>..............] - ETA: 56s - loss: 2.2659 - regression_loss: 1.8355 - classification_loss: 0.4304 273/500 [===============>..............] - ETA: 56s - loss: 2.2626 - regression_loss: 1.8328 - classification_loss: 0.4299 274/500 [===============>..............] - ETA: 56s - loss: 2.2644 - regression_loss: 1.8344 - classification_loss: 0.4300 275/500 [===============>..............] - ETA: 56s - loss: 2.2653 - regression_loss: 1.8353 - classification_loss: 0.4300 276/500 [===============>..............] - ETA: 55s - loss: 2.2654 - regression_loss: 1.8355 - classification_loss: 0.4299 277/500 [===============>..............] - ETA: 55s - loss: 2.2674 - regression_loss: 1.8366 - classification_loss: 0.4308 278/500 [===============>..............] - ETA: 55s - loss: 2.2681 - regression_loss: 1.8372 - classification_loss: 0.4308 279/500 [===============>..............] - ETA: 55s - loss: 2.2693 - regression_loss: 1.8383 - classification_loss: 0.4310 280/500 [===============>..............] - ETA: 55s - loss: 2.2695 - regression_loss: 1.8386 - classification_loss: 0.4308 281/500 [===============>..............] 
- ETA: 54s - loss: 2.2694 - regression_loss: 1.8385 - classification_loss: 0.4309 282/500 [===============>..............] - ETA: 54s - loss: 2.2719 - regression_loss: 1.8408 - classification_loss: 0.4311 283/500 [===============>..............] - ETA: 54s - loss: 2.2743 - regression_loss: 1.8430 - classification_loss: 0.4313 284/500 [================>.............] - ETA: 54s - loss: 2.2758 - regression_loss: 1.8443 - classification_loss: 0.4315 285/500 [================>.............] - ETA: 53s - loss: 2.2756 - regression_loss: 1.8427 - classification_loss: 0.4329 286/500 [================>.............] - ETA: 53s - loss: 2.2740 - regression_loss: 1.8405 - classification_loss: 0.4335 287/500 [================>.............] - ETA: 53s - loss: 2.2749 - regression_loss: 1.8409 - classification_loss: 0.4340 288/500 [================>.............] - ETA: 53s - loss: 2.2705 - regression_loss: 1.8372 - classification_loss: 0.4332 289/500 [================>.............] - ETA: 52s - loss: 2.2673 - regression_loss: 1.8348 - classification_loss: 0.4325 290/500 [================>.............] - ETA: 52s - loss: 2.2677 - regression_loss: 1.8355 - classification_loss: 0.4322 291/500 [================>.............] - ETA: 52s - loss: 2.2667 - regression_loss: 1.8346 - classification_loss: 0.4321 292/500 [================>.............] - ETA: 52s - loss: 2.2645 - regression_loss: 1.8327 - classification_loss: 0.4318 293/500 [================>.............] - ETA: 51s - loss: 2.2637 - regression_loss: 1.8322 - classification_loss: 0.4315 294/500 [================>.............] - ETA: 51s - loss: 2.2613 - regression_loss: 1.8304 - classification_loss: 0.4309 295/500 [================>.............] - ETA: 51s - loss: 2.2631 - regression_loss: 1.8319 - classification_loss: 0.4312 296/500 [================>.............] - ETA: 51s - loss: 2.2629 - regression_loss: 1.8316 - classification_loss: 0.4313 297/500 [================>.............] 
- ETA: 50s - loss: 2.2618 - regression_loss: 1.8309 - classification_loss: 0.4309
500/500 [==============================] - 125s 250ms/step - loss: 2.2485 - regression_loss: 1.8147 - classification_loss: 0.4337
1172 instances of class plum with average precision: 0.4557
mAP: 0.4557
Epoch 00023: saving model to ./training/snapshots/resnet50_pascal_23.h5
Epoch 24/150
131/500 [======>.......................] - ETA: 1:31 - loss: 2.2484 - regression_loss: 1.8215 - classification_loss: 0.4269 132/500 [======>.......................]
- ETA: 1:31 - loss: 2.2520 - regression_loss: 1.8240 - classification_loss: 0.4280 133/500 [======>.......................] - ETA: 1:31 - loss: 2.2504 - regression_loss: 1.8229 - classification_loss: 0.4275 134/500 [=======>......................] - ETA: 1:31 - loss: 2.2443 - regression_loss: 1.8182 - classification_loss: 0.4261 135/500 [=======>......................] - ETA: 1:30 - loss: 2.2440 - regression_loss: 1.8187 - classification_loss: 0.4253 136/500 [=======>......................] - ETA: 1:30 - loss: 2.2447 - regression_loss: 1.8196 - classification_loss: 0.4251 137/500 [=======>......................] - ETA: 1:30 - loss: 2.2464 - regression_loss: 1.8208 - classification_loss: 0.4256 138/500 [=======>......................] - ETA: 1:30 - loss: 2.2427 - regression_loss: 1.8182 - classification_loss: 0.4245 139/500 [=======>......................] - ETA: 1:30 - loss: 2.2420 - regression_loss: 1.8176 - classification_loss: 0.4244 140/500 [=======>......................] - ETA: 1:29 - loss: 2.2412 - regression_loss: 1.8167 - classification_loss: 0.4245 141/500 [=======>......................] - ETA: 1:29 - loss: 2.2442 - regression_loss: 1.8194 - classification_loss: 0.4248 142/500 [=======>......................] - ETA: 1:29 - loss: 2.2478 - regression_loss: 1.8225 - classification_loss: 0.4253 143/500 [=======>......................] - ETA: 1:29 - loss: 2.2476 - regression_loss: 1.8227 - classification_loss: 0.4249 144/500 [=======>......................] - ETA: 1:28 - loss: 2.2475 - regression_loss: 1.8229 - classification_loss: 0.4246 145/500 [=======>......................] - ETA: 1:28 - loss: 2.2492 - regression_loss: 1.8246 - classification_loss: 0.4246 146/500 [=======>......................] - ETA: 1:28 - loss: 2.2481 - regression_loss: 1.8238 - classification_loss: 0.4242 147/500 [=======>......................] - ETA: 1:27 - loss: 2.2404 - regression_loss: 1.8176 - classification_loss: 0.4228 148/500 [=======>......................] 
- ETA: 1:27 - loss: 2.2430 - regression_loss: 1.8194 - classification_loss: 0.4236 149/500 [=======>......................] - ETA: 1:27 - loss: 2.2511 - regression_loss: 1.8268 - classification_loss: 0.4243 150/500 [========>.....................] - ETA: 1:27 - loss: 2.2518 - regression_loss: 1.8282 - classification_loss: 0.4236 151/500 [========>.....................] - ETA: 1:26 - loss: 2.2431 - regression_loss: 1.8209 - classification_loss: 0.4223 152/500 [========>.....................] - ETA: 1:26 - loss: 2.2432 - regression_loss: 1.8210 - classification_loss: 0.4222 153/500 [========>.....................] - ETA: 1:26 - loss: 2.2442 - regression_loss: 1.8215 - classification_loss: 0.4226 154/500 [========>.....................] - ETA: 1:26 - loss: 2.2449 - regression_loss: 1.8221 - classification_loss: 0.4228 155/500 [========>.....................] - ETA: 1:25 - loss: 2.2446 - regression_loss: 1.8223 - classification_loss: 0.4224 156/500 [========>.....................] - ETA: 1:25 - loss: 2.2500 - regression_loss: 1.8261 - classification_loss: 0.4239 157/500 [========>.....................] - ETA: 1:25 - loss: 2.2494 - regression_loss: 1.8261 - classification_loss: 0.4233 158/500 [========>.....................] - ETA: 1:25 - loss: 2.2461 - regression_loss: 1.8230 - classification_loss: 0.4231 159/500 [========>.....................] - ETA: 1:25 - loss: 2.2441 - regression_loss: 1.8215 - classification_loss: 0.4226 160/500 [========>.....................] - ETA: 1:24 - loss: 2.2471 - regression_loss: 1.8252 - classification_loss: 0.4219 161/500 [========>.....................] - ETA: 1:24 - loss: 2.2454 - regression_loss: 1.8242 - classification_loss: 0.4212 162/500 [========>.....................] - ETA: 1:24 - loss: 2.2450 - regression_loss: 1.8241 - classification_loss: 0.4209 163/500 [========>.....................] - ETA: 1:24 - loss: 2.2438 - regression_loss: 1.8230 - classification_loss: 0.4208 164/500 [========>.....................] 
- ETA: 1:23 - loss: 2.2435 - regression_loss: 1.8203 - classification_loss: 0.4232 165/500 [========>.....................] - ETA: 1:23 - loss: 2.2421 - regression_loss: 1.8198 - classification_loss: 0.4223 166/500 [========>.....................] - ETA: 1:23 - loss: 2.2425 - regression_loss: 1.8206 - classification_loss: 0.4219 167/500 [=========>....................] - ETA: 1:23 - loss: 2.2447 - regression_loss: 1.8222 - classification_loss: 0.4225 168/500 [=========>....................] - ETA: 1:22 - loss: 2.2432 - regression_loss: 1.8210 - classification_loss: 0.4222 169/500 [=========>....................] - ETA: 1:22 - loss: 2.2344 - regression_loss: 1.8140 - classification_loss: 0.4205 170/500 [=========>....................] - ETA: 1:22 - loss: 2.2300 - regression_loss: 1.8104 - classification_loss: 0.4195 171/500 [=========>....................] - ETA: 1:22 - loss: 2.2283 - regression_loss: 1.8088 - classification_loss: 0.4195 172/500 [=========>....................] - ETA: 1:21 - loss: 2.2303 - regression_loss: 1.8107 - classification_loss: 0.4196 173/500 [=========>....................] - ETA: 1:21 - loss: 2.2270 - regression_loss: 1.8076 - classification_loss: 0.4194 174/500 [=========>....................] - ETA: 1:21 - loss: 2.2334 - regression_loss: 1.8122 - classification_loss: 0.4212 175/500 [=========>....................] - ETA: 1:20 - loss: 2.2346 - regression_loss: 1.8129 - classification_loss: 0.4217 176/500 [=========>....................] - ETA: 1:20 - loss: 2.2373 - regression_loss: 1.8148 - classification_loss: 0.4225 177/500 [=========>....................] - ETA: 1:20 - loss: 2.2378 - regression_loss: 1.8153 - classification_loss: 0.4225 178/500 [=========>....................] - ETA: 1:20 - loss: 2.2381 - regression_loss: 1.8160 - classification_loss: 0.4221 179/500 [=========>....................] - ETA: 1:19 - loss: 2.2331 - regression_loss: 1.8112 - classification_loss: 0.4219 180/500 [=========>....................] 
- ETA: 1:19 - loss: 2.2348 - regression_loss: 1.8127 - classification_loss: 0.4220 181/500 [=========>....................] - ETA: 1:19 - loss: 2.2325 - regression_loss: 1.8111 - classification_loss: 0.4214 182/500 [=========>....................] - ETA: 1:19 - loss: 2.2272 - regression_loss: 1.8071 - classification_loss: 0.4201 183/500 [=========>....................] - ETA: 1:18 - loss: 2.2274 - regression_loss: 1.8075 - classification_loss: 0.4199 184/500 [==========>...................] - ETA: 1:18 - loss: 2.2269 - regression_loss: 1.8073 - classification_loss: 0.4196 185/500 [==========>...................] - ETA: 1:18 - loss: 2.2305 - regression_loss: 1.8104 - classification_loss: 0.4201 186/500 [==========>...................] - ETA: 1:18 - loss: 2.2291 - regression_loss: 1.8095 - classification_loss: 0.4196 187/500 [==========>...................] - ETA: 1:17 - loss: 2.2281 - regression_loss: 1.8087 - classification_loss: 0.4195 188/500 [==========>...................] - ETA: 1:17 - loss: 2.2248 - regression_loss: 1.8061 - classification_loss: 0.4186 189/500 [==========>...................] - ETA: 1:17 - loss: 2.2234 - regression_loss: 1.8054 - classification_loss: 0.4180 190/500 [==========>...................] - ETA: 1:17 - loss: 2.2245 - regression_loss: 1.8066 - classification_loss: 0.4179 191/500 [==========>...................] - ETA: 1:16 - loss: 2.2250 - regression_loss: 1.8069 - classification_loss: 0.4180 192/500 [==========>...................] - ETA: 1:16 - loss: 2.2236 - regression_loss: 1.8057 - classification_loss: 0.4178 193/500 [==========>...................] - ETA: 1:16 - loss: 2.2231 - regression_loss: 1.8054 - classification_loss: 0.4177 194/500 [==========>...................] - ETA: 1:16 - loss: 2.2219 - regression_loss: 1.8042 - classification_loss: 0.4178 195/500 [==========>...................] - ETA: 1:15 - loss: 2.2173 - regression_loss: 1.8007 - classification_loss: 0.4166 196/500 [==========>...................] 
- ETA: 1:15 - loss: 2.2169 - regression_loss: 1.8005 - classification_loss: 0.4163 197/500 [==========>...................] - ETA: 1:15 - loss: 2.2161 - regression_loss: 1.8000 - classification_loss: 0.4161 198/500 [==========>...................] - ETA: 1:15 - loss: 2.2169 - regression_loss: 1.8001 - classification_loss: 0.4169 199/500 [==========>...................] - ETA: 1:14 - loss: 2.2186 - regression_loss: 1.8015 - classification_loss: 0.4171 200/500 [===========>..................] - ETA: 1:14 - loss: 2.2182 - regression_loss: 1.8009 - classification_loss: 0.4173 201/500 [===========>..................] - ETA: 1:14 - loss: 2.2225 - regression_loss: 1.8046 - classification_loss: 0.4178 202/500 [===========>..................] - ETA: 1:14 - loss: 2.2236 - regression_loss: 1.8059 - classification_loss: 0.4177 203/500 [===========>..................] - ETA: 1:13 - loss: 2.2228 - regression_loss: 1.8055 - classification_loss: 0.4173 204/500 [===========>..................] - ETA: 1:13 - loss: 2.2235 - regression_loss: 1.8058 - classification_loss: 0.4177 205/500 [===========>..................] - ETA: 1:13 - loss: 2.2220 - regression_loss: 1.8047 - classification_loss: 0.4173 206/500 [===========>..................] - ETA: 1:13 - loss: 2.2225 - regression_loss: 1.8051 - classification_loss: 0.4173 207/500 [===========>..................] - ETA: 1:12 - loss: 2.2225 - regression_loss: 1.8051 - classification_loss: 0.4174 208/500 [===========>..................] - ETA: 1:12 - loss: 2.2240 - regression_loss: 1.8061 - classification_loss: 0.4179 209/500 [===========>..................] - ETA: 1:12 - loss: 2.2259 - regression_loss: 1.8065 - classification_loss: 0.4194 210/500 [===========>..................] - ETA: 1:12 - loss: 2.2258 - regression_loss: 1.8062 - classification_loss: 0.4196 211/500 [===========>..................] - ETA: 1:11 - loss: 2.2243 - regression_loss: 1.8052 - classification_loss: 0.4191 212/500 [===========>..................] 
- ETA: 1:11 - loss: 2.2277 - regression_loss: 1.8049 - classification_loss: 0.4229 213/500 [===========>..................] - ETA: 1:11 - loss: 2.2287 - regression_loss: 1.8053 - classification_loss: 0.4234 214/500 [===========>..................] - ETA: 1:11 - loss: 2.2305 - regression_loss: 1.8068 - classification_loss: 0.4237 215/500 [===========>..................] - ETA: 1:10 - loss: 2.2322 - regression_loss: 1.8086 - classification_loss: 0.4235 216/500 [===========>..................] - ETA: 1:10 - loss: 2.2313 - regression_loss: 1.8082 - classification_loss: 0.4231 217/500 [============>.................] - ETA: 1:10 - loss: 2.2325 - regression_loss: 1.8092 - classification_loss: 0.4233 218/500 [============>.................] - ETA: 1:10 - loss: 2.2320 - regression_loss: 1.8089 - classification_loss: 0.4231 219/500 [============>.................] - ETA: 1:10 - loss: 2.2331 - regression_loss: 1.8099 - classification_loss: 0.4232 220/500 [============>.................] - ETA: 1:09 - loss: 2.2337 - regression_loss: 1.8103 - classification_loss: 0.4234 221/500 [============>.................] - ETA: 1:09 - loss: 2.2349 - regression_loss: 1.8112 - classification_loss: 0.4237 222/500 [============>.................] - ETA: 1:09 - loss: 2.2369 - regression_loss: 1.8125 - classification_loss: 0.4244 223/500 [============>.................] - ETA: 1:09 - loss: 2.2387 - regression_loss: 1.8140 - classification_loss: 0.4247 224/500 [============>.................] - ETA: 1:08 - loss: 2.2391 - regression_loss: 1.8144 - classification_loss: 0.4248 225/500 [============>.................] - ETA: 1:08 - loss: 2.2384 - regression_loss: 1.8138 - classification_loss: 0.4246 226/500 [============>.................] - ETA: 1:08 - loss: 2.2378 - regression_loss: 1.8135 - classification_loss: 0.4243 227/500 [============>.................] - ETA: 1:08 - loss: 2.2333 - regression_loss: 1.8098 - classification_loss: 0.4235 228/500 [============>.................] 
- ETA: 1:07 - loss: 2.2329 - regression_loss: 1.8098 - classification_loss: 0.4231 229/500 [============>.................] - ETA: 1:07 - loss: 2.2329 - regression_loss: 1.8099 - classification_loss: 0.4230 230/500 [============>.................] - ETA: 1:07 - loss: 2.2323 - regression_loss: 1.8096 - classification_loss: 0.4228 231/500 [============>.................] - ETA: 1:07 - loss: 2.2337 - regression_loss: 1.8105 - classification_loss: 0.4232 232/500 [============>.................] - ETA: 1:06 - loss: 2.2330 - regression_loss: 1.8099 - classification_loss: 0.4230 233/500 [============>.................] - ETA: 1:06 - loss: 2.2316 - regression_loss: 1.8089 - classification_loss: 0.4227 234/500 [=============>................] - ETA: 1:06 - loss: 2.2309 - regression_loss: 1.8089 - classification_loss: 0.4221 235/500 [=============>................] - ETA: 1:06 - loss: 2.2304 - regression_loss: 1.8083 - classification_loss: 0.4221 236/500 [=============>................] - ETA: 1:05 - loss: 2.2312 - regression_loss: 1.8090 - classification_loss: 0.4222 237/500 [=============>................] - ETA: 1:05 - loss: 2.2315 - regression_loss: 1.8094 - classification_loss: 0.4221 238/500 [=============>................] - ETA: 1:05 - loss: 2.2316 - regression_loss: 1.8092 - classification_loss: 0.4224 239/500 [=============>................] - ETA: 1:05 - loss: 2.2324 - regression_loss: 1.8100 - classification_loss: 0.4224 240/500 [=============>................] - ETA: 1:04 - loss: 2.2329 - regression_loss: 1.8107 - classification_loss: 0.4222 241/500 [=============>................] - ETA: 1:04 - loss: 2.2338 - regression_loss: 1.8114 - classification_loss: 0.4224 242/500 [=============>................] - ETA: 1:04 - loss: 2.2356 - regression_loss: 1.8120 - classification_loss: 0.4236 243/500 [=============>................] - ETA: 1:04 - loss: 2.2367 - regression_loss: 1.8130 - classification_loss: 0.4237 244/500 [=============>................] 
- ETA: 1:03 - loss: 2.2367 - regression_loss: 1.8115 - classification_loss: 0.4252 245/500 [=============>................] - ETA: 1:03 - loss: 2.2372 - regression_loss: 1.8122 - classification_loss: 0.4249 246/500 [=============>................] - ETA: 1:03 - loss: 2.2389 - regression_loss: 1.8141 - classification_loss: 0.4249 247/500 [=============>................] - ETA: 1:03 - loss: 2.2421 - regression_loss: 1.8159 - classification_loss: 0.4262 248/500 [=============>................] - ETA: 1:02 - loss: 2.2396 - regression_loss: 1.8140 - classification_loss: 0.4255 249/500 [=============>................] - ETA: 1:02 - loss: 2.2389 - regression_loss: 1.8137 - classification_loss: 0.4253 250/500 [==============>...............] - ETA: 1:02 - loss: 2.2375 - regression_loss: 1.8125 - classification_loss: 0.4250 251/500 [==============>...............] - ETA: 1:02 - loss: 2.2358 - regression_loss: 1.8108 - classification_loss: 0.4250 252/500 [==============>...............] - ETA: 1:01 - loss: 2.2368 - regression_loss: 1.8117 - classification_loss: 0.4251 253/500 [==============>...............] - ETA: 1:01 - loss: 2.2372 - regression_loss: 1.8121 - classification_loss: 0.4252 254/500 [==============>...............] - ETA: 1:01 - loss: 2.2334 - regression_loss: 1.8090 - classification_loss: 0.4244 255/500 [==============>...............] - ETA: 1:01 - loss: 2.2338 - regression_loss: 1.8095 - classification_loss: 0.4244 256/500 [==============>...............] - ETA: 1:00 - loss: 2.2346 - regression_loss: 1.8102 - classification_loss: 0.4245 257/500 [==============>...............] - ETA: 1:00 - loss: 2.2355 - regression_loss: 1.8108 - classification_loss: 0.4247 258/500 [==============>...............] - ETA: 1:00 - loss: 2.2373 - regression_loss: 1.8125 - classification_loss: 0.4249 259/500 [==============>...............] - ETA: 1:00 - loss: 2.2387 - regression_loss: 1.8136 - classification_loss: 0.4250 260/500 [==============>...............] 
- ETA: 59s - loss: 2.2391 - regression_loss: 1.8131 - classification_loss: 0.4259  261/500 [==============>...............] - ETA: 59s - loss: 2.2388 - regression_loss: 1.8129 - classification_loss: 0.4259 262/500 [==============>...............] - ETA: 59s - loss: 2.2392 - regression_loss: 1.8132 - classification_loss: 0.4260 263/500 [==============>...............] - ETA: 59s - loss: 2.2354 - regression_loss: 1.8107 - classification_loss: 0.4247 264/500 [==============>...............] - ETA: 58s - loss: 2.2371 - regression_loss: 1.8121 - classification_loss: 0.4250 265/500 [==============>...............] - ETA: 58s - loss: 2.2375 - regression_loss: 1.8125 - classification_loss: 0.4250 266/500 [==============>...............] - ETA: 58s - loss: 2.2361 - regression_loss: 1.8117 - classification_loss: 0.4245 267/500 [===============>..............] - ETA: 58s - loss: 2.2360 - regression_loss: 1.8116 - classification_loss: 0.4244 268/500 [===============>..............] - ETA: 57s - loss: 2.2368 - regression_loss: 1.8123 - classification_loss: 0.4245 269/500 [===============>..............] - ETA: 57s - loss: 2.2393 - regression_loss: 1.8149 - classification_loss: 0.4245 270/500 [===============>..............] - ETA: 57s - loss: 2.2359 - regression_loss: 1.8123 - classification_loss: 0.4236 271/500 [===============>..............] - ETA: 57s - loss: 2.2369 - regression_loss: 1.8133 - classification_loss: 0.4236 272/500 [===============>..............] - ETA: 56s - loss: 2.2368 - regression_loss: 1.8133 - classification_loss: 0.4234 273/500 [===============>..............] - ETA: 56s - loss: 2.2377 - regression_loss: 1.8128 - classification_loss: 0.4249 274/500 [===============>..............] - ETA: 56s - loss: 2.2364 - regression_loss: 1.8119 - classification_loss: 0.4245 275/500 [===============>..............] - ETA: 56s - loss: 2.2352 - regression_loss: 1.8112 - classification_loss: 0.4240 276/500 [===============>..............] 
- ETA: 55s - loss: 2.2325 - regression_loss: 1.8092 - classification_loss: 0.4232 277/500 [===============>..............] - ETA: 55s - loss: 2.2339 - regression_loss: 1.8103 - classification_loss: 0.4236 278/500 [===============>..............] - ETA: 55s - loss: 2.2344 - regression_loss: 1.8105 - classification_loss: 0.4239 279/500 [===============>..............] - ETA: 55s - loss: 2.2345 - regression_loss: 1.8109 - classification_loss: 0.4236 280/500 [===============>..............] - ETA: 54s - loss: 2.2357 - regression_loss: 1.8121 - classification_loss: 0.4236 281/500 [===============>..............] - ETA: 54s - loss: 2.2366 - regression_loss: 1.8124 - classification_loss: 0.4242 282/500 [===============>..............] - ETA: 54s - loss: 2.2363 - regression_loss: 1.8121 - classification_loss: 0.4242 283/500 [===============>..............] - ETA: 54s - loss: 2.2350 - regression_loss: 1.8112 - classification_loss: 0.4238 284/500 [================>.............] - ETA: 53s - loss: 2.2354 - regression_loss: 1.8116 - classification_loss: 0.4238 285/500 [================>.............] - ETA: 53s - loss: 2.2381 - regression_loss: 1.8138 - classification_loss: 0.4243 286/500 [================>.............] - ETA: 53s - loss: 2.2388 - regression_loss: 1.8147 - classification_loss: 0.4240 287/500 [================>.............] - ETA: 53s - loss: 2.2380 - regression_loss: 1.8142 - classification_loss: 0.4238 288/500 [================>.............] - ETA: 52s - loss: 2.2389 - regression_loss: 1.8150 - classification_loss: 0.4239 289/500 [================>.............] - ETA: 52s - loss: 2.2394 - regression_loss: 1.8151 - classification_loss: 0.4243 290/500 [================>.............] - ETA: 52s - loss: 2.2385 - regression_loss: 1.8146 - classification_loss: 0.4239 291/500 [================>.............] - ETA: 52s - loss: 2.2385 - regression_loss: 1.8145 - classification_loss: 0.4240 292/500 [================>.............] 
- ETA: 51s - loss: 2.2364 - regression_loss: 1.8127 - classification_loss: 0.4236 293/500 [================>.............] - ETA: 51s - loss: 2.2359 - regression_loss: 1.8124 - classification_loss: 0.4235 294/500 [================>.............] - ETA: 51s - loss: 2.2354 - regression_loss: 1.8119 - classification_loss: 0.4235 295/500 [================>.............] - ETA: 51s - loss: 2.2346 - regression_loss: 1.8114 - classification_loss: 0.4231 296/500 [================>.............] - ETA: 50s - loss: 2.2362 - regression_loss: 1.8126 - classification_loss: 0.4236 297/500 [================>.............] - ETA: 50s - loss: 2.2343 - regression_loss: 1.8109 - classification_loss: 0.4234 298/500 [================>.............] - ETA: 50s - loss: 2.2353 - regression_loss: 1.8111 - classification_loss: 0.4242 299/500 [================>.............] - ETA: 50s - loss: 2.2369 - regression_loss: 1.8126 - classification_loss: 0.4243 300/500 [=================>............] - ETA: 49s - loss: 2.2369 - regression_loss: 1.8122 - classification_loss: 0.4247 301/500 [=================>............] - ETA: 49s - loss: 2.2372 - regression_loss: 1.8126 - classification_loss: 0.4246 302/500 [=================>............] - ETA: 49s - loss: 2.2371 - regression_loss: 1.8126 - classification_loss: 0.4244 303/500 [=================>............] - ETA: 49s - loss: 2.2367 - regression_loss: 1.8110 - classification_loss: 0.4257 304/500 [=================>............] - ETA: 48s - loss: 2.2365 - regression_loss: 1.8110 - classification_loss: 0.4255 305/500 [=================>............] - ETA: 48s - loss: 2.2363 - regression_loss: 1.8110 - classification_loss: 0.4253 306/500 [=================>............] - ETA: 48s - loss: 2.2373 - regression_loss: 1.8116 - classification_loss: 0.4257 307/500 [=================>............] - ETA: 48s - loss: 2.2368 - regression_loss: 1.8112 - classification_loss: 0.4256 308/500 [=================>............] 
- ETA: 47s - loss: 2.2374 - regression_loss: 1.8119 - classification_loss: 0.4255 309/500 [=================>............] - ETA: 47s - loss: 2.2392 - regression_loss: 1.8129 - classification_loss: 0.4263 310/500 [=================>............] - ETA: 47s - loss: 2.2391 - regression_loss: 1.8129 - classification_loss: 0.4262 311/500 [=================>............] - ETA: 47s - loss: 2.2398 - regression_loss: 1.8138 - classification_loss: 0.4260 312/500 [=================>............] - ETA: 46s - loss: 2.2389 - regression_loss: 1.8132 - classification_loss: 0.4257 313/500 [=================>............] - ETA: 46s - loss: 2.2397 - regression_loss: 1.8141 - classification_loss: 0.4256 314/500 [=================>............] - ETA: 46s - loss: 2.2411 - regression_loss: 1.8153 - classification_loss: 0.4258 315/500 [=================>............] - ETA: 46s - loss: 2.2419 - regression_loss: 1.8154 - classification_loss: 0.4265 316/500 [=================>............] - ETA: 45s - loss: 2.2425 - regression_loss: 1.8160 - classification_loss: 0.4266 317/500 [==================>...........] - ETA: 45s - loss: 2.2418 - regression_loss: 1.8152 - classification_loss: 0.4267 318/500 [==================>...........] - ETA: 45s - loss: 2.2415 - regression_loss: 1.8148 - classification_loss: 0.4267 319/500 [==================>...........] - ETA: 45s - loss: 2.2422 - regression_loss: 1.8154 - classification_loss: 0.4268 320/500 [==================>...........] - ETA: 44s - loss: 2.2391 - regression_loss: 1.8131 - classification_loss: 0.4260 321/500 [==================>...........] - ETA: 44s - loss: 2.2387 - regression_loss: 1.8129 - classification_loss: 0.4257 322/500 [==================>...........] - ETA: 44s - loss: 2.2375 - regression_loss: 1.8122 - classification_loss: 0.4253 323/500 [==================>...........] - ETA: 44s - loss: 2.2369 - regression_loss: 1.8119 - classification_loss: 0.4250 324/500 [==================>...........] 
- ETA: 43s - loss: 2.2364 - regression_loss: 1.8114 - classification_loss: 0.4250 325/500 [==================>...........] - ETA: 43s - loss: 2.2351 - regression_loss: 1.8106 - classification_loss: 0.4246 326/500 [==================>...........] - ETA: 43s - loss: 2.2385 - regression_loss: 1.8129 - classification_loss: 0.4255 327/500 [==================>...........] - ETA: 43s - loss: 2.2388 - regression_loss: 1.8132 - classification_loss: 0.4256 328/500 [==================>...........] - ETA: 42s - loss: 2.2393 - regression_loss: 1.8138 - classification_loss: 0.4255 329/500 [==================>...........] - ETA: 42s - loss: 2.2388 - regression_loss: 1.8134 - classification_loss: 0.4254 330/500 [==================>...........] - ETA: 42s - loss: 2.2390 - regression_loss: 1.8139 - classification_loss: 0.4251 331/500 [==================>...........] - ETA: 42s - loss: 2.2396 - regression_loss: 1.8146 - classification_loss: 0.4250 332/500 [==================>...........] - ETA: 41s - loss: 2.2392 - regression_loss: 1.8144 - classification_loss: 0.4248 333/500 [==================>...........] - ETA: 41s - loss: 2.2386 - regression_loss: 1.8137 - classification_loss: 0.4249 334/500 [===================>..........] - ETA: 41s - loss: 2.2385 - regression_loss: 1.8135 - classification_loss: 0.4250 335/500 [===================>..........] - ETA: 41s - loss: 2.2389 - regression_loss: 1.8142 - classification_loss: 0.4247 336/500 [===================>..........] - ETA: 40s - loss: 2.2396 - regression_loss: 1.8151 - classification_loss: 0.4246 337/500 [===================>..........] - ETA: 40s - loss: 2.2412 - regression_loss: 1.8163 - classification_loss: 0.4250 338/500 [===================>..........] - ETA: 40s - loss: 2.2417 - regression_loss: 1.8169 - classification_loss: 0.4248 339/500 [===================>..........] - ETA: 40s - loss: 2.2436 - regression_loss: 1.8183 - classification_loss: 0.4253 340/500 [===================>..........] 
[per-batch progress for steps 341-499 of epoch 24 omitted; running loss held near 2.24 (regression ~1.82, classification ~0.42) through the end of the epoch]
500/500 [==============================] - 125s 250ms/step - loss: 2.2479 - regression_loss: 1.8243 - classification_loss: 0.4236
1172 instances of class plum with average precision: 0.3965
mAP: 0.3965
Epoch 00024: saving model to ./training/snapshots/resnet50_pascal_24.h5
Epoch 25/150
[per-batch progress for steps 1-13 omitted]
[per-batch progress for steps 14-173 of epoch 25 omitted; running loss settled near 2.19 (regression ~1.77, classification ~0.42); log truncated mid-epoch]
- ETA: 1:21 - loss: 2.1911 - regression_loss: 1.7745 - classification_loss: 0.4166 175/500 [=========>....................] - ETA: 1:21 - loss: 2.1930 - regression_loss: 1.7764 - classification_loss: 0.4166 176/500 [=========>....................] - ETA: 1:21 - loss: 2.1963 - regression_loss: 1.7787 - classification_loss: 0.4175 177/500 [=========>....................] - ETA: 1:20 - loss: 2.1975 - regression_loss: 1.7799 - classification_loss: 0.4176 178/500 [=========>....................] - ETA: 1:20 - loss: 2.1997 - regression_loss: 1.7815 - classification_loss: 0.4181 179/500 [=========>....................] - ETA: 1:20 - loss: 2.1921 - regression_loss: 1.7756 - classification_loss: 0.4164 180/500 [=========>....................] - ETA: 1:19 - loss: 2.1911 - regression_loss: 1.7749 - classification_loss: 0.4162 181/500 [=========>....................] - ETA: 1:19 - loss: 2.1929 - regression_loss: 1.7763 - classification_loss: 0.4166 182/500 [=========>....................] - ETA: 1:19 - loss: 2.1927 - regression_loss: 1.7764 - classification_loss: 0.4163 183/500 [=========>....................] - ETA: 1:19 - loss: 2.1949 - regression_loss: 1.7782 - classification_loss: 0.4167 184/500 [==========>...................] - ETA: 1:19 - loss: 2.1975 - regression_loss: 1.7796 - classification_loss: 0.4179 185/500 [==========>...................] - ETA: 1:18 - loss: 2.1963 - regression_loss: 1.7788 - classification_loss: 0.4175 186/500 [==========>...................] - ETA: 1:18 - loss: 2.1958 - regression_loss: 1.7761 - classification_loss: 0.4197 187/500 [==========>...................] - ETA: 1:18 - loss: 2.1961 - regression_loss: 1.7766 - classification_loss: 0.4195 188/500 [==========>...................] - ETA: 1:18 - loss: 2.1982 - regression_loss: 1.7774 - classification_loss: 0.4207 189/500 [==========>...................] - ETA: 1:17 - loss: 2.2027 - regression_loss: 1.7796 - classification_loss: 0.4230 190/500 [==========>...................] 
- ETA: 1:17 - loss: 2.1992 - regression_loss: 1.7774 - classification_loss: 0.4218 191/500 [==========>...................] - ETA: 1:17 - loss: 2.1975 - regression_loss: 1.7757 - classification_loss: 0.4218 192/500 [==========>...................] - ETA: 1:17 - loss: 2.1963 - regression_loss: 1.7751 - classification_loss: 0.4213 193/500 [==========>...................] - ETA: 1:16 - loss: 2.1977 - regression_loss: 1.7763 - classification_loss: 0.4215 194/500 [==========>...................] - ETA: 1:16 - loss: 2.1980 - regression_loss: 1.7768 - classification_loss: 0.4212 195/500 [==========>...................] - ETA: 1:16 - loss: 2.1984 - regression_loss: 1.7769 - classification_loss: 0.4214 196/500 [==========>...................] - ETA: 1:16 - loss: 2.1973 - regression_loss: 1.7763 - classification_loss: 0.4210 197/500 [==========>...................] - ETA: 1:15 - loss: 2.1964 - regression_loss: 1.7758 - classification_loss: 0.4206 198/500 [==========>...................] - ETA: 1:15 - loss: 2.1968 - regression_loss: 1.7763 - classification_loss: 0.4205 199/500 [==========>...................] - ETA: 1:15 - loss: 2.1953 - regression_loss: 1.7747 - classification_loss: 0.4206 200/500 [===========>..................] - ETA: 1:15 - loss: 2.1898 - regression_loss: 1.7703 - classification_loss: 0.4195 201/500 [===========>..................] - ETA: 1:14 - loss: 2.1929 - regression_loss: 1.7730 - classification_loss: 0.4199 202/500 [===========>..................] - ETA: 1:14 - loss: 2.1941 - regression_loss: 1.7746 - classification_loss: 0.4195 203/500 [===========>..................] - ETA: 1:14 - loss: 2.1971 - regression_loss: 1.7772 - classification_loss: 0.4199 204/500 [===========>..................] - ETA: 1:14 - loss: 2.1976 - regression_loss: 1.7777 - classification_loss: 0.4199 205/500 [===========>..................] - ETA: 1:13 - loss: 2.1922 - regression_loss: 1.7734 - classification_loss: 0.4188 206/500 [===========>..................] 
- ETA: 1:13 - loss: 2.1918 - regression_loss: 1.7729 - classification_loss: 0.4189 207/500 [===========>..................] - ETA: 1:13 - loss: 2.1906 - regression_loss: 1.7721 - classification_loss: 0.4184 208/500 [===========>..................] - ETA: 1:13 - loss: 2.1890 - regression_loss: 1.7710 - classification_loss: 0.4180 209/500 [===========>..................] - ETA: 1:12 - loss: 2.1881 - regression_loss: 1.7706 - classification_loss: 0.4175 210/500 [===========>..................] - ETA: 1:12 - loss: 2.1950 - regression_loss: 1.7725 - classification_loss: 0.4226 211/500 [===========>..................] - ETA: 1:12 - loss: 2.1981 - regression_loss: 1.7745 - classification_loss: 0.4236 212/500 [===========>..................] - ETA: 1:12 - loss: 2.1982 - regression_loss: 1.7748 - classification_loss: 0.4234 213/500 [===========>..................] - ETA: 1:11 - loss: 2.1993 - regression_loss: 1.7762 - classification_loss: 0.4232 214/500 [===========>..................] - ETA: 1:11 - loss: 2.1981 - regression_loss: 1.7751 - classification_loss: 0.4230 215/500 [===========>..................] - ETA: 1:11 - loss: 2.1980 - regression_loss: 1.7751 - classification_loss: 0.4229 216/500 [===========>..................] - ETA: 1:11 - loss: 2.1980 - regression_loss: 1.7752 - classification_loss: 0.4227 217/500 [============>.................] - ETA: 1:10 - loss: 2.1976 - regression_loss: 1.7749 - classification_loss: 0.4227 218/500 [============>.................] - ETA: 1:10 - loss: 2.1994 - regression_loss: 1.7765 - classification_loss: 0.4229 219/500 [============>.................] - ETA: 1:10 - loss: 2.1957 - regression_loss: 1.7737 - classification_loss: 0.4220 220/500 [============>.................] - ETA: 1:10 - loss: 2.1957 - regression_loss: 1.7741 - classification_loss: 0.4216 221/500 [============>.................] - ETA: 1:09 - loss: 2.1953 - regression_loss: 1.7741 - classification_loss: 0.4212 222/500 [============>.................] 
- ETA: 1:09 - loss: 2.1909 - regression_loss: 1.7705 - classification_loss: 0.4204 223/500 [============>.................] - ETA: 1:09 - loss: 2.1922 - regression_loss: 1.7720 - classification_loss: 0.4202 224/500 [============>.................] - ETA: 1:09 - loss: 2.1933 - regression_loss: 1.7727 - classification_loss: 0.4206 225/500 [============>.................] - ETA: 1:08 - loss: 2.1936 - regression_loss: 1.7729 - classification_loss: 0.4207 226/500 [============>.................] - ETA: 1:08 - loss: 2.1930 - regression_loss: 1.7727 - classification_loss: 0.4203 227/500 [============>.................] - ETA: 1:08 - loss: 2.1964 - regression_loss: 1.7753 - classification_loss: 0.4211 228/500 [============>.................] - ETA: 1:08 - loss: 2.1955 - regression_loss: 1.7749 - classification_loss: 0.4206 229/500 [============>.................] - ETA: 1:07 - loss: 2.1958 - regression_loss: 1.7750 - classification_loss: 0.4208 230/500 [============>.................] - ETA: 1:07 - loss: 2.1966 - regression_loss: 1.7757 - classification_loss: 0.4209 231/500 [============>.................] - ETA: 1:07 - loss: 2.1952 - regression_loss: 1.7747 - classification_loss: 0.4204 232/500 [============>.................] - ETA: 1:07 - loss: 2.1939 - regression_loss: 1.7739 - classification_loss: 0.4200 233/500 [============>.................] - ETA: 1:06 - loss: 2.1931 - regression_loss: 1.7732 - classification_loss: 0.4199 234/500 [=============>................] - ETA: 1:06 - loss: 2.1933 - regression_loss: 1.7735 - classification_loss: 0.4197 235/500 [=============>................] - ETA: 1:06 - loss: 2.1939 - regression_loss: 1.7741 - classification_loss: 0.4198 236/500 [=============>................] - ETA: 1:06 - loss: 2.1943 - regression_loss: 1.7743 - classification_loss: 0.4199 237/500 [=============>................] - ETA: 1:05 - loss: 2.1967 - regression_loss: 1.7761 - classification_loss: 0.4206 238/500 [=============>................] 
- ETA: 1:05 - loss: 2.1975 - regression_loss: 1.7766 - classification_loss: 0.4210 239/500 [=============>................] - ETA: 1:05 - loss: 2.1977 - regression_loss: 1.7770 - classification_loss: 0.4207 240/500 [=============>................] - ETA: 1:05 - loss: 2.2007 - regression_loss: 1.7795 - classification_loss: 0.4212 241/500 [=============>................] - ETA: 1:04 - loss: 2.2018 - regression_loss: 1.7804 - classification_loss: 0.4214 242/500 [=============>................] - ETA: 1:04 - loss: 2.2012 - regression_loss: 1.7800 - classification_loss: 0.4212 243/500 [=============>................] - ETA: 1:04 - loss: 2.2034 - regression_loss: 1.7812 - classification_loss: 0.4222 244/500 [=============>................] - ETA: 1:04 - loss: 2.2020 - regression_loss: 1.7803 - classification_loss: 0.4217 245/500 [=============>................] - ETA: 1:03 - loss: 2.2052 - regression_loss: 1.7833 - classification_loss: 0.4219 246/500 [=============>................] - ETA: 1:03 - loss: 2.2067 - regression_loss: 1.7843 - classification_loss: 0.4224 247/500 [=============>................] - ETA: 1:03 - loss: 2.2066 - regression_loss: 1.7846 - classification_loss: 0.4220 248/500 [=============>................] - ETA: 1:03 - loss: 2.2058 - regression_loss: 1.7835 - classification_loss: 0.4224 249/500 [=============>................] - ETA: 1:02 - loss: 2.2058 - regression_loss: 1.7833 - classification_loss: 0.4225 250/500 [==============>...............] - ETA: 1:02 - loss: 2.2063 - regression_loss: 1.7833 - classification_loss: 0.4230 251/500 [==============>...............] - ETA: 1:02 - loss: 2.2037 - regression_loss: 1.7814 - classification_loss: 0.4223 252/500 [==============>...............] - ETA: 1:02 - loss: 2.2020 - regression_loss: 1.7804 - classification_loss: 0.4216 253/500 [==============>...............] - ETA: 1:01 - loss: 2.2021 - regression_loss: 1.7804 - classification_loss: 0.4217 254/500 [==============>...............] 
- ETA: 1:01 - loss: 2.2035 - regression_loss: 1.7817 - classification_loss: 0.4218 255/500 [==============>...............] - ETA: 1:01 - loss: 2.2033 - regression_loss: 1.7817 - classification_loss: 0.4215 256/500 [==============>...............] - ETA: 1:01 - loss: 2.2033 - regression_loss: 1.7817 - classification_loss: 0.4215 257/500 [==============>...............] - ETA: 1:00 - loss: 2.2033 - regression_loss: 1.7820 - classification_loss: 0.4214 258/500 [==============>...............] - ETA: 1:00 - loss: 2.2037 - regression_loss: 1.7825 - classification_loss: 0.4212 259/500 [==============>...............] - ETA: 1:00 - loss: 2.2060 - regression_loss: 1.7824 - classification_loss: 0.4235 260/500 [==============>...............] - ETA: 1:00 - loss: 2.2044 - regression_loss: 1.7811 - classification_loss: 0.4232 261/500 [==============>...............] - ETA: 59s - loss: 2.2042 - regression_loss: 1.7803 - classification_loss: 0.4239  262/500 [==============>...............] - ETA: 59s - loss: 2.2055 - regression_loss: 1.7815 - classification_loss: 0.4240 263/500 [==============>...............] - ETA: 59s - loss: 2.2079 - regression_loss: 1.7830 - classification_loss: 0.4249 264/500 [==============>...............] - ETA: 59s - loss: 2.2087 - regression_loss: 1.7841 - classification_loss: 0.4247 265/500 [==============>...............] - ETA: 58s - loss: 2.2090 - regression_loss: 1.7840 - classification_loss: 0.4250 266/500 [==============>...............] - ETA: 58s - loss: 2.2115 - regression_loss: 1.7866 - classification_loss: 0.4250 267/500 [===============>..............] - ETA: 58s - loss: 2.2126 - regression_loss: 1.7879 - classification_loss: 0.4247 268/500 [===============>..............] - ETA: 58s - loss: 2.2137 - regression_loss: 1.7887 - classification_loss: 0.4249 269/500 [===============>..............] - ETA: 57s - loss: 2.2140 - regression_loss: 1.7892 - classification_loss: 0.4248 270/500 [===============>..............] 
- ETA: 57s - loss: 2.2147 - regression_loss: 1.7900 - classification_loss: 0.4247 271/500 [===============>..............] - ETA: 57s - loss: 2.2154 - regression_loss: 1.7908 - classification_loss: 0.4247 272/500 [===============>..............] - ETA: 57s - loss: 2.2146 - regression_loss: 1.7903 - classification_loss: 0.4243 273/500 [===============>..............] - ETA: 56s - loss: 2.2149 - regression_loss: 1.7906 - classification_loss: 0.4243 274/500 [===============>..............] - ETA: 56s - loss: 2.2187 - regression_loss: 1.7934 - classification_loss: 0.4254 275/500 [===============>..............] - ETA: 56s - loss: 2.2173 - regression_loss: 1.7920 - classification_loss: 0.4253 276/500 [===============>..............] - ETA: 56s - loss: 2.2153 - regression_loss: 1.7903 - classification_loss: 0.4250 277/500 [===============>..............] - ETA: 55s - loss: 2.2143 - regression_loss: 1.7898 - classification_loss: 0.4245 278/500 [===============>..............] - ETA: 55s - loss: 2.2145 - regression_loss: 1.7901 - classification_loss: 0.4244 279/500 [===============>..............] - ETA: 55s - loss: 2.2141 - regression_loss: 1.7899 - classification_loss: 0.4242 280/500 [===============>..............] - ETA: 55s - loss: 2.2113 - regression_loss: 1.7877 - classification_loss: 0.4236 281/500 [===============>..............] - ETA: 54s - loss: 2.2089 - regression_loss: 1.7854 - classification_loss: 0.4235 282/500 [===============>..............] - ETA: 54s - loss: 2.2081 - regression_loss: 1.7849 - classification_loss: 0.4232 283/500 [===============>..............] - ETA: 54s - loss: 2.2076 - regression_loss: 1.7847 - classification_loss: 0.4230 284/500 [================>.............] - ETA: 54s - loss: 2.2081 - regression_loss: 1.7851 - classification_loss: 0.4230 285/500 [================>.............] - ETA: 53s - loss: 2.2073 - regression_loss: 1.7846 - classification_loss: 0.4227 286/500 [================>.............] 
- ETA: 53s - loss: 2.2074 - regression_loss: 1.7849 - classification_loss: 0.4225 287/500 [================>.............] - ETA: 53s - loss: 2.2079 - regression_loss: 1.7852 - classification_loss: 0.4226 288/500 [================>.............] - ETA: 53s - loss: 2.2057 - regression_loss: 1.7834 - classification_loss: 0.4223 289/500 [================>.............] - ETA: 52s - loss: 2.2055 - regression_loss: 1.7833 - classification_loss: 0.4222 290/500 [================>.............] - ETA: 52s - loss: 2.2051 - regression_loss: 1.7832 - classification_loss: 0.4219 291/500 [================>.............] - ETA: 52s - loss: 2.2021 - regression_loss: 1.7811 - classification_loss: 0.4210 292/500 [================>.............] - ETA: 52s - loss: 2.1999 - regression_loss: 1.7795 - classification_loss: 0.4204 293/500 [================>.............] - ETA: 51s - loss: 2.1997 - regression_loss: 1.7794 - classification_loss: 0.4203 294/500 [================>.............] - ETA: 51s - loss: 2.1995 - regression_loss: 1.7791 - classification_loss: 0.4204 295/500 [================>.............] - ETA: 51s - loss: 2.1990 - regression_loss: 1.7785 - classification_loss: 0.4205 296/500 [================>.............] - ETA: 51s - loss: 2.1987 - regression_loss: 1.7785 - classification_loss: 0.4202 297/500 [================>.............] - ETA: 50s - loss: 2.2015 - regression_loss: 1.7810 - classification_loss: 0.4206 298/500 [================>.............] - ETA: 50s - loss: 2.2021 - regression_loss: 1.7815 - classification_loss: 0.4206 299/500 [================>.............] - ETA: 50s - loss: 2.2025 - regression_loss: 1.7819 - classification_loss: 0.4207 300/500 [=================>............] - ETA: 50s - loss: 2.2037 - regression_loss: 1.7828 - classification_loss: 0.4208 301/500 [=================>............] - ETA: 49s - loss: 2.2045 - regression_loss: 1.7837 - classification_loss: 0.4209 302/500 [=================>............] 
- ETA: 49s - loss: 2.2040 - regression_loss: 1.7832 - classification_loss: 0.4207 303/500 [=================>............] - ETA: 49s - loss: 2.2036 - regression_loss: 1.7829 - classification_loss: 0.4207 304/500 [=================>............] - ETA: 49s - loss: 2.2106 - regression_loss: 1.7884 - classification_loss: 0.4222 305/500 [=================>............] - ETA: 48s - loss: 2.2104 - regression_loss: 1.7887 - classification_loss: 0.4217 306/500 [=================>............] - ETA: 48s - loss: 2.2125 - regression_loss: 1.7891 - classification_loss: 0.4233 307/500 [=================>............] - ETA: 48s - loss: 2.2113 - regression_loss: 1.7881 - classification_loss: 0.4232 308/500 [=================>............] - ETA: 48s - loss: 2.2100 - regression_loss: 1.7873 - classification_loss: 0.4227 309/500 [=================>............] - ETA: 47s - loss: 2.2098 - regression_loss: 1.7873 - classification_loss: 0.4225 310/500 [=================>............] - ETA: 47s - loss: 2.2107 - regression_loss: 1.7880 - classification_loss: 0.4227 311/500 [=================>............] - ETA: 47s - loss: 2.2106 - regression_loss: 1.7880 - classification_loss: 0.4226 312/500 [=================>............] - ETA: 47s - loss: 2.2111 - regression_loss: 1.7886 - classification_loss: 0.4225 313/500 [=================>............] - ETA: 46s - loss: 2.2099 - regression_loss: 1.7881 - classification_loss: 0.4218 314/500 [=================>............] - ETA: 46s - loss: 2.2093 - regression_loss: 1.7880 - classification_loss: 0.4214 315/500 [=================>............] - ETA: 46s - loss: 2.2083 - regression_loss: 1.7873 - classification_loss: 0.4210 316/500 [=================>............] - ETA: 46s - loss: 2.2088 - regression_loss: 1.7878 - classification_loss: 0.4210 317/500 [==================>...........] - ETA: 45s - loss: 2.2103 - regression_loss: 1.7886 - classification_loss: 0.4217 318/500 [==================>...........] 
- ETA: 45s - loss: 2.2132 - regression_loss: 1.7908 - classification_loss: 0.4223 319/500 [==================>...........] - ETA: 45s - loss: 2.2127 - regression_loss: 1.7905 - classification_loss: 0.4222 320/500 [==================>...........] - ETA: 45s - loss: 2.2129 - regression_loss: 1.7908 - classification_loss: 0.4221 321/500 [==================>...........] - ETA: 44s - loss: 2.2132 - regression_loss: 1.7909 - classification_loss: 0.4223 322/500 [==================>...........] - ETA: 44s - loss: 2.2133 - regression_loss: 1.7912 - classification_loss: 0.4221 323/500 [==================>...........] - ETA: 44s - loss: 2.2132 - regression_loss: 1.7914 - classification_loss: 0.4218 324/500 [==================>...........] - ETA: 44s - loss: 2.2130 - regression_loss: 1.7913 - classification_loss: 0.4217 325/500 [==================>...........] - ETA: 43s - loss: 2.2120 - regression_loss: 1.7903 - classification_loss: 0.4217 326/500 [==================>...........] - ETA: 43s - loss: 2.2133 - regression_loss: 1.7915 - classification_loss: 0.4218 327/500 [==================>...........] - ETA: 43s - loss: 2.2138 - regression_loss: 1.7917 - classification_loss: 0.4220 328/500 [==================>...........] - ETA: 43s - loss: 2.2130 - regression_loss: 1.7912 - classification_loss: 0.4218 329/500 [==================>...........] - ETA: 42s - loss: 2.2120 - regression_loss: 1.7905 - classification_loss: 0.4214 330/500 [==================>...........] - ETA: 42s - loss: 2.2115 - regression_loss: 1.7903 - classification_loss: 0.4212 331/500 [==================>...........] - ETA: 42s - loss: 2.2127 - regression_loss: 1.7915 - classification_loss: 0.4212 332/500 [==================>...........] - ETA: 42s - loss: 2.2137 - regression_loss: 1.7920 - classification_loss: 0.4217 333/500 [==================>...........] - ETA: 41s - loss: 2.2137 - regression_loss: 1.7922 - classification_loss: 0.4215 334/500 [===================>..........] 
- ETA: 41s - loss: 2.2149 - regression_loss: 1.7932 - classification_loss: 0.4216 335/500 [===================>..........] - ETA: 41s - loss: 2.2137 - regression_loss: 1.7925 - classification_loss: 0.4212 336/500 [===================>..........] - ETA: 41s - loss: 2.2136 - regression_loss: 1.7925 - classification_loss: 0.4211 337/500 [===================>..........] - ETA: 40s - loss: 2.2149 - regression_loss: 1.7938 - classification_loss: 0.4211 338/500 [===================>..........] - ETA: 40s - loss: 2.2147 - regression_loss: 1.7938 - classification_loss: 0.4209 339/500 [===================>..........] - ETA: 40s - loss: 2.2149 - regression_loss: 1.7941 - classification_loss: 0.4208 340/500 [===================>..........] - ETA: 40s - loss: 2.2148 - regression_loss: 1.7942 - classification_loss: 0.4207 341/500 [===================>..........] - ETA: 39s - loss: 2.2135 - regression_loss: 1.7933 - classification_loss: 0.4202 342/500 [===================>..........] - ETA: 39s - loss: 2.2133 - regression_loss: 1.7933 - classification_loss: 0.4200 343/500 [===================>..........] - ETA: 39s - loss: 2.2141 - regression_loss: 1.7936 - classification_loss: 0.4205 344/500 [===================>..........] - ETA: 39s - loss: 2.2170 - regression_loss: 1.7962 - classification_loss: 0.4208 345/500 [===================>..........] - ETA: 38s - loss: 2.2141 - regression_loss: 1.7938 - classification_loss: 0.4203 346/500 [===================>..........] - ETA: 38s - loss: 2.2144 - regression_loss: 1.7941 - classification_loss: 0.4203 347/500 [===================>..........] - ETA: 38s - loss: 2.2138 - regression_loss: 1.7935 - classification_loss: 0.4203 348/500 [===================>..........] - ETA: 38s - loss: 2.2135 - regression_loss: 1.7930 - classification_loss: 0.4206 349/500 [===================>..........] - ETA: 37s - loss: 2.2142 - regression_loss: 1.7937 - classification_loss: 0.4204 350/500 [====================>.........] 
- ETA: 37s - loss: 2.2164 - regression_loss: 1.7955 - classification_loss: 0.4209 351/500 [====================>.........] - ETA: 37s - loss: 2.2163 - regression_loss: 1.7956 - classification_loss: 0.4207 352/500 [====================>.........] - ETA: 37s - loss: 2.2159 - regression_loss: 1.7954 - classification_loss: 0.4205 353/500 [====================>.........] - ETA: 36s - loss: 2.2154 - regression_loss: 1.7952 - classification_loss: 0.4202 354/500 [====================>.........] - ETA: 36s - loss: 2.2159 - regression_loss: 1.7959 - classification_loss: 0.4200 355/500 [====================>.........] - ETA: 36s - loss: 2.2173 - regression_loss: 1.7973 - classification_loss: 0.4200 356/500 [====================>.........] - ETA: 36s - loss: 2.2191 - regression_loss: 1.7979 - classification_loss: 0.4212 357/500 [====================>.........] - ETA: 35s - loss: 2.2197 - regression_loss: 1.7984 - classification_loss: 0.4213 358/500 [====================>.........] - ETA: 35s - loss: 2.2197 - regression_loss: 1.7984 - classification_loss: 0.4213 359/500 [====================>.........] - ETA: 35s - loss: 2.2191 - regression_loss: 1.7978 - classification_loss: 0.4213 360/500 [====================>.........] - ETA: 35s - loss: 2.2227 - regression_loss: 1.7964 - classification_loss: 0.4263 361/500 [====================>.........] - ETA: 34s - loss: 2.2232 - regression_loss: 1.7971 - classification_loss: 0.4261 362/500 [====================>.........] - ETA: 34s - loss: 2.2239 - regression_loss: 1.7973 - classification_loss: 0.4266 363/500 [====================>.........] - ETA: 34s - loss: 2.2231 - regression_loss: 1.7966 - classification_loss: 0.4265 364/500 [====================>.........] - ETA: 34s - loss: 2.2239 - regression_loss: 1.7973 - classification_loss: 0.4267 365/500 [====================>.........] - ETA: 33s - loss: 2.2256 - regression_loss: 1.7990 - classification_loss: 0.4265 366/500 [====================>.........] 
- ETA: 33s - loss: 2.2249 - regression_loss: 1.7984 - classification_loss: 0.4265 367/500 [=====================>........] - ETA: 33s - loss: 2.2251 - regression_loss: 1.7985 - classification_loss: 0.4265 368/500 [=====================>........] - ETA: 33s - loss: 2.2229 - regression_loss: 1.7969 - classification_loss: 0.4261 369/500 [=====================>........] - ETA: 32s - loss: 2.2228 - regression_loss: 1.7970 - classification_loss: 0.4258 370/500 [=====================>........] - ETA: 32s - loss: 2.2227 - regression_loss: 1.7970 - classification_loss: 0.4257 371/500 [=====================>........] - ETA: 32s - loss: 2.2223 - regression_loss: 1.7966 - classification_loss: 0.4257 372/500 [=====================>........] - ETA: 32s - loss: 2.2196 - regression_loss: 1.7945 - classification_loss: 0.4250 373/500 [=====================>........] - ETA: 31s - loss: 2.2189 - regression_loss: 1.7942 - classification_loss: 0.4247 374/500 [=====================>........] - ETA: 31s - loss: 2.2191 - regression_loss: 1.7943 - classification_loss: 0.4248 375/500 [=====================>........] - ETA: 31s - loss: 2.2159 - regression_loss: 1.7919 - classification_loss: 0.4240 376/500 [=====================>........] - ETA: 31s - loss: 2.2154 - regression_loss: 1.7915 - classification_loss: 0.4239 377/500 [=====================>........] - ETA: 30s - loss: 2.2151 - regression_loss: 1.7912 - classification_loss: 0.4240 378/500 [=====================>........] - ETA: 30s - loss: 2.2145 - regression_loss: 1.7908 - classification_loss: 0.4237 379/500 [=====================>........] - ETA: 30s - loss: 2.2142 - regression_loss: 1.7908 - classification_loss: 0.4235 380/500 [=====================>........] - ETA: 30s - loss: 2.2129 - regression_loss: 1.7898 - classification_loss: 0.4231 381/500 [=====================>........] - ETA: 29s - loss: 2.2132 - regression_loss: 1.7899 - classification_loss: 0.4233 382/500 [=====================>........] 
[Epoch 25: per-batch progress-bar redraws (steps 383-499/500) omitted; running loss hovered near 2.20 before the epoch summary below]
500/500 [==============================] - 125s 250ms/step - loss: 2.2087 - regression_loss: 1.7890 - classification_loss: 0.4198
1172 instances of class plum with average precision: 0.4006
mAP: 0.4006
Epoch 00025: saving model to ./training/snapshots/resnet50_pascal_25.h5
Epoch 26/150
[Epoch 26: per-batch progress-bar redraws (steps 1-217/500) omitted; running loss fluctuating around 2.20, classification_loss around 0.42]
- ETA: 1:13 - loss: 2.1990 - regression_loss: 1.7786 - classification_loss: 0.4205 202/500 [===========>..................] - ETA: 1:12 - loss: 2.2013 - regression_loss: 1.7816 - classification_loss: 0.4197 203/500 [===========>..................] - ETA: 1:12 - loss: 2.2028 - regression_loss: 1.7827 - classification_loss: 0.4200 204/500 [===========>..................] - ETA: 1:12 - loss: 2.2038 - regression_loss: 1.7838 - classification_loss: 0.4201 205/500 [===========>..................] - ETA: 1:12 - loss: 2.2072 - regression_loss: 1.7867 - classification_loss: 0.4205 206/500 [===========>..................] - ETA: 1:11 - loss: 2.2084 - regression_loss: 1.7875 - classification_loss: 0.4209 207/500 [===========>..................] - ETA: 1:11 - loss: 2.2080 - regression_loss: 1.7874 - classification_loss: 0.4207 208/500 [===========>..................] - ETA: 1:11 - loss: 2.2048 - regression_loss: 1.7846 - classification_loss: 0.4202 209/500 [===========>..................] - ETA: 1:11 - loss: 2.2078 - regression_loss: 1.7873 - classification_loss: 0.4205 210/500 [===========>..................] - ETA: 1:10 - loss: 2.2136 - regression_loss: 1.7893 - classification_loss: 0.4243 211/500 [===========>..................] - ETA: 1:10 - loss: 2.2138 - regression_loss: 1.7898 - classification_loss: 0.4240 212/500 [===========>..................] - ETA: 1:10 - loss: 2.2175 - regression_loss: 1.7918 - classification_loss: 0.4258 213/500 [===========>..................] - ETA: 1:10 - loss: 2.2179 - regression_loss: 1.7915 - classification_loss: 0.4265 214/500 [===========>..................] - ETA: 1:09 - loss: 2.2175 - regression_loss: 1.7913 - classification_loss: 0.4263 215/500 [===========>..................] - ETA: 1:09 - loss: 2.2171 - regression_loss: 1.7911 - classification_loss: 0.4260 216/500 [===========>..................] - ETA: 1:09 - loss: 2.2151 - regression_loss: 1.7897 - classification_loss: 0.4255 217/500 [============>.................] 
- ETA: 1:09 - loss: 2.2139 - regression_loss: 1.7888 - classification_loss: 0.4251 218/500 [============>.................] - ETA: 1:09 - loss: 2.2132 - regression_loss: 1.7882 - classification_loss: 0.4251 219/500 [============>.................] - ETA: 1:08 - loss: 2.2121 - regression_loss: 1.7875 - classification_loss: 0.4247 220/500 [============>.................] - ETA: 1:08 - loss: 2.2121 - regression_loss: 1.7874 - classification_loss: 0.4247 221/500 [============>.................] - ETA: 1:08 - loss: 2.2143 - regression_loss: 1.7900 - classification_loss: 0.4243 222/500 [============>.................] - ETA: 1:08 - loss: 2.2156 - regression_loss: 1.7909 - classification_loss: 0.4247 223/500 [============>.................] - ETA: 1:07 - loss: 2.2166 - regression_loss: 1.7902 - classification_loss: 0.4263 224/500 [============>.................] - ETA: 1:07 - loss: 2.2169 - regression_loss: 1.7911 - classification_loss: 0.4259 225/500 [============>.................] - ETA: 1:07 - loss: 2.2198 - regression_loss: 1.7933 - classification_loss: 0.4265 226/500 [============>.................] - ETA: 1:07 - loss: 2.2186 - regression_loss: 1.7927 - classification_loss: 0.4259 227/500 [============>.................] - ETA: 1:06 - loss: 2.2181 - regression_loss: 1.7922 - classification_loss: 0.4259 228/500 [============>.................] - ETA: 1:06 - loss: 2.2180 - regression_loss: 1.7921 - classification_loss: 0.4258 229/500 [============>.................] - ETA: 1:06 - loss: 2.2180 - regression_loss: 1.7921 - classification_loss: 0.4259 230/500 [============>.................] - ETA: 1:06 - loss: 2.2184 - regression_loss: 1.7923 - classification_loss: 0.4261 231/500 [============>.................] - ETA: 1:05 - loss: 2.2190 - regression_loss: 1.7930 - classification_loss: 0.4260 232/500 [============>.................] - ETA: 1:05 - loss: 2.2161 - regression_loss: 1.7906 - classification_loss: 0.4255 233/500 [============>.................] 
- ETA: 1:05 - loss: 2.2160 - regression_loss: 1.7899 - classification_loss: 0.4261 234/500 [=============>................] - ETA: 1:05 - loss: 2.2129 - regression_loss: 1.7870 - classification_loss: 0.4259 235/500 [=============>................] - ETA: 1:05 - loss: 2.2117 - regression_loss: 1.7861 - classification_loss: 0.4256 236/500 [=============>................] - ETA: 1:04 - loss: 2.2103 - regression_loss: 1.7853 - classification_loss: 0.4250 237/500 [=============>................] - ETA: 1:04 - loss: 2.2121 - regression_loss: 1.7866 - classification_loss: 0.4255 238/500 [=============>................] - ETA: 1:04 - loss: 2.2132 - regression_loss: 1.7876 - classification_loss: 0.4256 239/500 [=============>................] - ETA: 1:04 - loss: 2.2131 - regression_loss: 1.7875 - classification_loss: 0.4256 240/500 [=============>................] - ETA: 1:03 - loss: 2.2148 - regression_loss: 1.7890 - classification_loss: 0.4257 241/500 [=============>................] - ETA: 1:03 - loss: 2.2155 - regression_loss: 1.7900 - classification_loss: 0.4255 242/500 [=============>................] - ETA: 1:03 - loss: 2.2131 - regression_loss: 1.7882 - classification_loss: 0.4249 243/500 [=============>................] - ETA: 1:03 - loss: 2.2125 - regression_loss: 1.7879 - classification_loss: 0.4246 244/500 [=============>................] - ETA: 1:02 - loss: 2.2135 - regression_loss: 1.7886 - classification_loss: 0.4249 245/500 [=============>................] - ETA: 1:02 - loss: 2.2089 - regression_loss: 1.7848 - classification_loss: 0.4240 246/500 [=============>................] - ETA: 1:02 - loss: 2.2103 - regression_loss: 1.7863 - classification_loss: 0.4240 247/500 [=============>................] - ETA: 1:02 - loss: 2.2124 - regression_loss: 1.7883 - classification_loss: 0.4241 248/500 [=============>................] - ETA: 1:01 - loss: 2.2176 - regression_loss: 1.7929 - classification_loss: 0.4247 249/500 [=============>................] 
- ETA: 1:01 - loss: 2.2178 - regression_loss: 1.7933 - classification_loss: 0.4245 250/500 [==============>...............] - ETA: 1:01 - loss: 2.2177 - regression_loss: 1.7935 - classification_loss: 0.4242 251/500 [==============>...............] - ETA: 1:01 - loss: 2.2177 - regression_loss: 1.7926 - classification_loss: 0.4252 252/500 [==============>...............] - ETA: 1:00 - loss: 2.2204 - regression_loss: 1.7951 - classification_loss: 0.4253 253/500 [==============>...............] - ETA: 1:00 - loss: 2.2200 - regression_loss: 1.7949 - classification_loss: 0.4252 254/500 [==============>...............] - ETA: 1:00 - loss: 2.2209 - regression_loss: 1.7952 - classification_loss: 0.4257 255/500 [==============>...............] - ETA: 1:00 - loss: 2.2219 - regression_loss: 1.7962 - classification_loss: 0.4257 256/500 [==============>...............] - ETA: 1:00 - loss: 2.2219 - regression_loss: 1.7962 - classification_loss: 0.4257 257/500 [==============>...............] - ETA: 59s - loss: 2.2205 - regression_loss: 1.7952 - classification_loss: 0.4253  258/500 [==============>...............] - ETA: 59s - loss: 2.2204 - regression_loss: 1.7951 - classification_loss: 0.4253 259/500 [==============>...............] - ETA: 59s - loss: 2.2198 - regression_loss: 1.7950 - classification_loss: 0.4249 260/500 [==============>...............] - ETA: 59s - loss: 2.2207 - regression_loss: 1.7959 - classification_loss: 0.4248 261/500 [==============>...............] - ETA: 58s - loss: 2.2227 - regression_loss: 1.7974 - classification_loss: 0.4253 262/500 [==============>...............] - ETA: 58s - loss: 2.2227 - regression_loss: 1.7976 - classification_loss: 0.4251 263/500 [==============>...............] - ETA: 58s - loss: 2.2215 - regression_loss: 1.7968 - classification_loss: 0.4246 264/500 [==============>...............] - ETA: 58s - loss: 2.2226 - regression_loss: 1.7976 - classification_loss: 0.4250 265/500 [==============>...............] 
- ETA: 57s - loss: 2.2198 - regression_loss: 1.7956 - classification_loss: 0.4242 266/500 [==============>...............] - ETA: 57s - loss: 2.2170 - regression_loss: 1.7936 - classification_loss: 0.4234 267/500 [===============>..............] - ETA: 57s - loss: 2.2168 - regression_loss: 1.7935 - classification_loss: 0.4233 268/500 [===============>..............] - ETA: 57s - loss: 2.2165 - regression_loss: 1.7933 - classification_loss: 0.4232 269/500 [===============>..............] - ETA: 56s - loss: 2.2156 - regression_loss: 1.7927 - classification_loss: 0.4230 270/500 [===============>..............] - ETA: 56s - loss: 2.2146 - regression_loss: 1.7920 - classification_loss: 0.4226 271/500 [===============>..............] - ETA: 56s - loss: 2.2162 - regression_loss: 1.7933 - classification_loss: 0.4229 272/500 [===============>..............] - ETA: 56s - loss: 2.2191 - regression_loss: 1.7957 - classification_loss: 0.4234 273/500 [===============>..............] - ETA: 55s - loss: 2.2197 - regression_loss: 1.7962 - classification_loss: 0.4235 274/500 [===============>..............] - ETA: 55s - loss: 2.2223 - regression_loss: 1.7984 - classification_loss: 0.4239 275/500 [===============>..............] - ETA: 55s - loss: 2.2216 - regression_loss: 1.7980 - classification_loss: 0.4235 276/500 [===============>..............] - ETA: 55s - loss: 2.2205 - regression_loss: 1.7974 - classification_loss: 0.4231 277/500 [===============>..............] - ETA: 54s - loss: 2.2223 - regression_loss: 1.7973 - classification_loss: 0.4250 278/500 [===============>..............] - ETA: 54s - loss: 2.2220 - regression_loss: 1.7974 - classification_loss: 0.4246 279/500 [===============>..............] - ETA: 54s - loss: 2.2252 - regression_loss: 1.8003 - classification_loss: 0.4249 280/500 [===============>..............] - ETA: 54s - loss: 2.2215 - regression_loss: 1.7974 - classification_loss: 0.4241 281/500 [===============>..............] 
- ETA: 53s - loss: 2.2202 - regression_loss: 1.7963 - classification_loss: 0.4239 282/500 [===============>..............] - ETA: 53s - loss: 2.2204 - regression_loss: 1.7966 - classification_loss: 0.4238 283/500 [===============>..............] - ETA: 53s - loss: 2.2201 - regression_loss: 1.7967 - classification_loss: 0.4234 284/500 [================>.............] - ETA: 53s - loss: 2.2183 - regression_loss: 1.7953 - classification_loss: 0.4230 285/500 [================>.............] - ETA: 53s - loss: 2.2181 - regression_loss: 1.7954 - classification_loss: 0.4227 286/500 [================>.............] - ETA: 52s - loss: 2.2180 - regression_loss: 1.7950 - classification_loss: 0.4230 287/500 [================>.............] - ETA: 52s - loss: 2.2175 - regression_loss: 1.7947 - classification_loss: 0.4228 288/500 [================>.............] - ETA: 52s - loss: 2.2185 - regression_loss: 1.7952 - classification_loss: 0.4233 289/500 [================>.............] - ETA: 52s - loss: 2.2172 - regression_loss: 1.7943 - classification_loss: 0.4229 290/500 [================>.............] - ETA: 51s - loss: 2.2172 - regression_loss: 1.7939 - classification_loss: 0.4233 291/500 [================>.............] - ETA: 51s - loss: 2.2174 - regression_loss: 1.7943 - classification_loss: 0.4231 292/500 [================>.............] - ETA: 51s - loss: 2.2160 - regression_loss: 1.7931 - classification_loss: 0.4230 293/500 [================>.............] - ETA: 51s - loss: 2.2144 - regression_loss: 1.7919 - classification_loss: 0.4225 294/500 [================>.............] - ETA: 50s - loss: 2.2109 - regression_loss: 1.7891 - classification_loss: 0.4218 295/500 [================>.............] - ETA: 50s - loss: 2.2118 - regression_loss: 1.7902 - classification_loss: 0.4217 296/500 [================>.............] - ETA: 50s - loss: 2.2151 - regression_loss: 1.7924 - classification_loss: 0.4226 297/500 [================>.............] 
- ETA: 50s - loss: 2.2148 - regression_loss: 1.7925 - classification_loss: 0.4223 298/500 [================>.............] - ETA: 49s - loss: 2.2152 - regression_loss: 1.7930 - classification_loss: 0.4223 299/500 [================>.............] - ETA: 49s - loss: 2.2180 - regression_loss: 1.7937 - classification_loss: 0.4243 300/500 [=================>............] - ETA: 49s - loss: 2.2182 - regression_loss: 1.7938 - classification_loss: 0.4244 301/500 [=================>............] - ETA: 49s - loss: 2.2165 - regression_loss: 1.7927 - classification_loss: 0.4238 302/500 [=================>............] - ETA: 48s - loss: 2.2149 - regression_loss: 1.7916 - classification_loss: 0.4233 303/500 [=================>............] - ETA: 48s - loss: 2.2155 - regression_loss: 1.7921 - classification_loss: 0.4235 304/500 [=================>............] - ETA: 48s - loss: 2.2145 - regression_loss: 1.7908 - classification_loss: 0.4237 305/500 [=================>............] - ETA: 48s - loss: 2.2144 - regression_loss: 1.7904 - classification_loss: 0.4239 306/500 [=================>............] - ETA: 47s - loss: 2.2169 - regression_loss: 1.7922 - classification_loss: 0.4247 307/500 [=================>............] - ETA: 47s - loss: 2.2171 - regression_loss: 1.7924 - classification_loss: 0.4247 308/500 [=================>............] - ETA: 47s - loss: 2.2180 - regression_loss: 1.7929 - classification_loss: 0.4251 309/500 [=================>............] - ETA: 47s - loss: 2.2175 - regression_loss: 1.7927 - classification_loss: 0.4248 310/500 [=================>............] - ETA: 46s - loss: 2.2164 - regression_loss: 1.7921 - classification_loss: 0.4244 311/500 [=================>............] - ETA: 46s - loss: 2.2161 - regression_loss: 1.7920 - classification_loss: 0.4241 312/500 [=================>............] - ETA: 46s - loss: 2.2161 - regression_loss: 1.7921 - classification_loss: 0.4240 313/500 [=================>............] 
- ETA: 46s - loss: 2.2167 - regression_loss: 1.7926 - classification_loss: 0.4241 314/500 [=================>............] - ETA: 45s - loss: 2.2159 - regression_loss: 1.7921 - classification_loss: 0.4238 315/500 [=================>............] - ETA: 45s - loss: 2.2118 - regression_loss: 1.7889 - classification_loss: 0.4229 316/500 [=================>............] - ETA: 45s - loss: 2.2133 - regression_loss: 1.7899 - classification_loss: 0.4235 317/500 [==================>...........] - ETA: 45s - loss: 2.2147 - regression_loss: 1.7912 - classification_loss: 0.4235 318/500 [==================>...........] - ETA: 44s - loss: 2.2151 - regression_loss: 1.7915 - classification_loss: 0.4236 319/500 [==================>...........] - ETA: 44s - loss: 2.2152 - regression_loss: 1.7915 - classification_loss: 0.4236 320/500 [==================>...........] - ETA: 44s - loss: 2.2147 - regression_loss: 1.7912 - classification_loss: 0.4235 321/500 [==================>...........] - ETA: 44s - loss: 2.2195 - regression_loss: 1.7951 - classification_loss: 0.4244 322/500 [==================>...........] - ETA: 44s - loss: 2.2180 - regression_loss: 1.7940 - classification_loss: 0.4239 323/500 [==================>...........] - ETA: 43s - loss: 2.2183 - regression_loss: 1.7945 - classification_loss: 0.4239 324/500 [==================>...........] - ETA: 43s - loss: 2.2182 - regression_loss: 1.7945 - classification_loss: 0.4238 325/500 [==================>...........] - ETA: 43s - loss: 2.2178 - regression_loss: 1.7941 - classification_loss: 0.4236 326/500 [==================>...........] - ETA: 43s - loss: 2.2165 - regression_loss: 1.7932 - classification_loss: 0.4233 327/500 [==================>...........] - ETA: 42s - loss: 2.2152 - regression_loss: 1.7924 - classification_loss: 0.4229 328/500 [==================>...........] - ETA: 42s - loss: 2.2158 - regression_loss: 1.7927 - classification_loss: 0.4231 329/500 [==================>...........] 
- ETA: 42s - loss: 2.2181 - regression_loss: 1.7946 - classification_loss: 0.4236 330/500 [==================>...........] - ETA: 42s - loss: 2.2191 - regression_loss: 1.7948 - classification_loss: 0.4243 331/500 [==================>...........] - ETA: 41s - loss: 2.2176 - regression_loss: 1.7931 - classification_loss: 0.4245 332/500 [==================>...........] - ETA: 41s - loss: 2.2181 - regression_loss: 1.7934 - classification_loss: 0.4246 333/500 [==================>...........] - ETA: 41s - loss: 2.2180 - regression_loss: 1.7936 - classification_loss: 0.4244 334/500 [===================>..........] - ETA: 41s - loss: 2.2178 - regression_loss: 1.7936 - classification_loss: 0.4242 335/500 [===================>..........] - ETA: 40s - loss: 2.2178 - regression_loss: 1.7938 - classification_loss: 0.4240 336/500 [===================>..........] - ETA: 40s - loss: 2.2190 - regression_loss: 1.7947 - classification_loss: 0.4243 337/500 [===================>..........] - ETA: 40s - loss: 2.2191 - regression_loss: 1.7949 - classification_loss: 0.4242 338/500 [===================>..........] - ETA: 40s - loss: 2.2187 - regression_loss: 1.7946 - classification_loss: 0.4241 339/500 [===================>..........] - ETA: 39s - loss: 2.2192 - regression_loss: 1.7949 - classification_loss: 0.4243 340/500 [===================>..........] - ETA: 39s - loss: 2.2205 - regression_loss: 1.7956 - classification_loss: 0.4249 341/500 [===================>..........] - ETA: 39s - loss: 2.2218 - regression_loss: 1.7965 - classification_loss: 0.4253 342/500 [===================>..........] - ETA: 39s - loss: 2.2232 - regression_loss: 1.7972 - classification_loss: 0.4260 343/500 [===================>..........] - ETA: 38s - loss: 2.2235 - regression_loss: 1.7977 - classification_loss: 0.4258 344/500 [===================>..........] - ETA: 38s - loss: 2.2226 - regression_loss: 1.7972 - classification_loss: 0.4254 345/500 [===================>..........] 
- ETA: 38s - loss: 2.2216 - regression_loss: 1.7964 - classification_loss: 0.4252 346/500 [===================>..........] - ETA: 38s - loss: 2.2227 - regression_loss: 1.7974 - classification_loss: 0.4252 347/500 [===================>..........] - ETA: 37s - loss: 2.2220 - regression_loss: 1.7963 - classification_loss: 0.4257 348/500 [===================>..........] - ETA: 37s - loss: 2.2214 - regression_loss: 1.7960 - classification_loss: 0.4254 349/500 [===================>..........] - ETA: 37s - loss: 2.2206 - regression_loss: 1.7956 - classification_loss: 0.4251 350/500 [====================>.........] - ETA: 37s - loss: 2.2178 - regression_loss: 1.7930 - classification_loss: 0.4247 351/500 [====================>.........] - ETA: 36s - loss: 2.2197 - regression_loss: 1.7942 - classification_loss: 0.4255 352/500 [====================>.........] - ETA: 36s - loss: 2.2192 - regression_loss: 1.7938 - classification_loss: 0.4254 353/500 [====================>.........] - ETA: 36s - loss: 2.2193 - regression_loss: 1.7939 - classification_loss: 0.4254 354/500 [====================>.........] - ETA: 36s - loss: 2.2195 - regression_loss: 1.7942 - classification_loss: 0.4253 355/500 [====================>.........] - ETA: 35s - loss: 2.2181 - regression_loss: 1.7929 - classification_loss: 0.4252 356/500 [====================>.........] - ETA: 35s - loss: 2.2198 - regression_loss: 1.7942 - classification_loss: 0.4256 357/500 [====================>.........] - ETA: 35s - loss: 2.2211 - regression_loss: 1.7954 - classification_loss: 0.4257 358/500 [====================>.........] - ETA: 35s - loss: 2.2221 - regression_loss: 1.7966 - classification_loss: 0.4254 359/500 [====================>.........] - ETA: 34s - loss: 2.2210 - regression_loss: 1.7958 - classification_loss: 0.4252 360/500 [====================>.........] - ETA: 34s - loss: 2.2205 - regression_loss: 1.7955 - classification_loss: 0.4250 361/500 [====================>.........] 
- ETA: 34s - loss: 2.2208 - regression_loss: 1.7960 - classification_loss: 0.4248 362/500 [====================>.........] - ETA: 34s - loss: 2.2191 - regression_loss: 1.7947 - classification_loss: 0.4244 363/500 [====================>.........] - ETA: 33s - loss: 2.2185 - regression_loss: 1.7942 - classification_loss: 0.4244 364/500 [====================>.........] - ETA: 33s - loss: 2.2188 - regression_loss: 1.7945 - classification_loss: 0.4243 365/500 [====================>.........] - ETA: 33s - loss: 2.2202 - regression_loss: 1.7956 - classification_loss: 0.4246 366/500 [====================>.........] - ETA: 33s - loss: 2.2190 - regression_loss: 1.7947 - classification_loss: 0.4243 367/500 [=====================>........] - ETA: 32s - loss: 2.2190 - regression_loss: 1.7948 - classification_loss: 0.4242 368/500 [=====================>........] - ETA: 32s - loss: 2.2192 - regression_loss: 1.7952 - classification_loss: 0.4240 369/500 [=====================>........] - ETA: 32s - loss: 2.2190 - regression_loss: 1.7952 - classification_loss: 0.4238 370/500 [=====================>........] - ETA: 32s - loss: 2.2204 - regression_loss: 1.7964 - classification_loss: 0.4240 371/500 [=====================>........] - ETA: 31s - loss: 2.2199 - regression_loss: 1.7962 - classification_loss: 0.4237 372/500 [=====================>........] - ETA: 31s - loss: 2.2198 - regression_loss: 1.7964 - classification_loss: 0.4235 373/500 [=====================>........] - ETA: 31s - loss: 2.2197 - regression_loss: 1.7964 - classification_loss: 0.4233 374/500 [=====================>........] - ETA: 31s - loss: 2.2203 - regression_loss: 1.7968 - classification_loss: 0.4235 375/500 [=====================>........] - ETA: 30s - loss: 2.2196 - regression_loss: 1.7963 - classification_loss: 0.4232 376/500 [=====================>........] - ETA: 30s - loss: 2.2193 - regression_loss: 1.7963 - classification_loss: 0.4231 377/500 [=====================>........] 
- ETA: 30s - loss: 2.2189 - regression_loss: 1.7962 - classification_loss: 0.4227 378/500 [=====================>........] - ETA: 30s - loss: 2.2174 - regression_loss: 1.7951 - classification_loss: 0.4223 379/500 [=====================>........] - ETA: 29s - loss: 2.2181 - regression_loss: 1.7958 - classification_loss: 0.4224 380/500 [=====================>........] - ETA: 29s - loss: 2.2186 - regression_loss: 1.7964 - classification_loss: 0.4222 381/500 [=====================>........] - ETA: 29s - loss: 2.2178 - regression_loss: 1.7959 - classification_loss: 0.4219 382/500 [=====================>........] - ETA: 29s - loss: 2.2167 - regression_loss: 1.7946 - classification_loss: 0.4221 383/500 [=====================>........] - ETA: 28s - loss: 2.2177 - regression_loss: 1.7954 - classification_loss: 0.4224 384/500 [======================>.......] - ETA: 28s - loss: 2.2217 - regression_loss: 1.7982 - classification_loss: 0.4235 385/500 [======================>.......] - ETA: 28s - loss: 2.2217 - regression_loss: 1.7984 - classification_loss: 0.4233 386/500 [======================>.......] - ETA: 28s - loss: 2.2197 - regression_loss: 1.7968 - classification_loss: 0.4229 387/500 [======================>.......] - ETA: 27s - loss: 2.2213 - regression_loss: 1.7983 - classification_loss: 0.4230 388/500 [======================>.......] - ETA: 27s - loss: 2.2215 - regression_loss: 1.7985 - classification_loss: 0.4230 389/500 [======================>.......] - ETA: 27s - loss: 2.2216 - regression_loss: 1.7988 - classification_loss: 0.4228 390/500 [======================>.......] - ETA: 27s - loss: 2.2205 - regression_loss: 1.7981 - classification_loss: 0.4224 391/500 [======================>.......] - ETA: 26s - loss: 2.2210 - regression_loss: 1.7987 - classification_loss: 0.4223 392/500 [======================>.......] - ETA: 26s - loss: 2.2208 - regression_loss: 1.7987 - classification_loss: 0.4221 393/500 [======================>.......] 
- ETA: 26s - loss: 2.2195 - regression_loss: 1.7977 - classification_loss: 0.4218 394/500 [======================>.......] - ETA: 26s - loss: 2.2192 - regression_loss: 1.7977 - classification_loss: 0.4215 395/500 [======================>.......] - ETA: 25s - loss: 2.2201 - regression_loss: 1.7985 - classification_loss: 0.4216 396/500 [======================>.......] - ETA: 25s - loss: 2.2213 - regression_loss: 1.7994 - classification_loss: 0.4218 397/500 [======================>.......] - ETA: 25s - loss: 2.2198 - regression_loss: 1.7983 - classification_loss: 0.4215 398/500 [======================>.......] - ETA: 25s - loss: 2.2202 - regression_loss: 1.7986 - classification_loss: 0.4216 399/500 [======================>.......] - ETA: 24s - loss: 2.2200 - regression_loss: 1.7985 - classification_loss: 0.4214 400/500 [=======================>......] - ETA: 24s - loss: 2.2189 - regression_loss: 1.7977 - classification_loss: 0.4212 401/500 [=======================>......] - ETA: 24s - loss: 2.2186 - regression_loss: 1.7974 - classification_loss: 0.4211 402/500 [=======================>......] - ETA: 24s - loss: 2.2181 - regression_loss: 1.7973 - classification_loss: 0.4208 403/500 [=======================>......] - ETA: 24s - loss: 2.2169 - regression_loss: 1.7963 - classification_loss: 0.4206 404/500 [=======================>......] - ETA: 23s - loss: 2.2173 - regression_loss: 1.7962 - classification_loss: 0.4211 405/500 [=======================>......] - ETA: 23s - loss: 2.2173 - regression_loss: 1.7964 - classification_loss: 0.4209 406/500 [=======================>......] - ETA: 23s - loss: 2.2172 - regression_loss: 1.7964 - classification_loss: 0.4209 407/500 [=======================>......] - ETA: 23s - loss: 2.2172 - regression_loss: 1.7963 - classification_loss: 0.4208 408/500 [=======================>......] - ETA: 22s - loss: 2.2176 - regression_loss: 1.7969 - classification_loss: 0.4207 409/500 [=======================>......] 
- ETA: 22s - loss: 2.2152 - regression_loss: 1.7946 - classification_loss: 0.4206 410/500 [=======================>......] - ETA: 22s - loss: 2.2159 - regression_loss: 1.7951 - classification_loss: 0.4208 411/500 [=======================>......] - ETA: 22s - loss: 2.2153 - regression_loss: 1.7948 - classification_loss: 0.4204 412/500 [=======================>......] - ETA: 21s - loss: 2.2142 - regression_loss: 1.7939 - classification_loss: 0.4203 413/500 [=======================>......] - ETA: 21s - loss: 2.2137 - regression_loss: 1.7937 - classification_loss: 0.4199 414/500 [=======================>......] - ETA: 21s - loss: 2.2125 - regression_loss: 1.7926 - classification_loss: 0.4199 415/500 [=======================>......] - ETA: 21s - loss: 2.2117 - regression_loss: 1.7920 - classification_loss: 0.4198 416/500 [=======================>......] - ETA: 20s - loss: 2.2115 - regression_loss: 1.7918 - classification_loss: 0.4196 417/500 [========================>.....] - ETA: 20s - loss: 2.2115 - regression_loss: 1.7919 - classification_loss: 0.4196 418/500 [========================>.....] - ETA: 20s - loss: 2.2128 - regression_loss: 1.7933 - classification_loss: 0.4194 419/500 [========================>.....] - ETA: 20s - loss: 2.2138 - regression_loss: 1.7942 - classification_loss: 0.4196 420/500 [========================>.....] - ETA: 19s - loss: 2.2133 - regression_loss: 1.7937 - classification_loss: 0.4196 421/500 [========================>.....] - ETA: 19s - loss: 2.2140 - regression_loss: 1.7944 - classification_loss: 0.4196 422/500 [========================>.....] - ETA: 19s - loss: 2.2113 - regression_loss: 1.7922 - classification_loss: 0.4191 423/500 [========================>.....] - ETA: 19s - loss: 2.2102 - regression_loss: 1.7913 - classification_loss: 0.4189 424/500 [========================>.....] - ETA: 18s - loss: 2.2109 - regression_loss: 1.7919 - classification_loss: 0.4190 425/500 [========================>.....] 
- ETA: 18s - loss: 2.2101 - regression_loss: 1.7914 - classification_loss: 0.4187 [... per-batch progress updates for batches 426-499 elided ...] 500/500 [==============================] - 124s 248ms/step - loss: 2.2036 - regression_loss: 1.7864 - classification_loss: 0.4172
1172 instances of class plum with average precision: 0.4526
mAP: 0.4526
Epoch 00026: saving model to ./training/snapshots/resnet50_pascal_26.h5
Epoch 27/150
1/500 [..............................] - ETA: 1:55 - loss: 2.2293 - regression_loss: 1.9084 - classification_loss: 0.3209 [... per-batch progress updates for batches 2-3 elided ...] 4/500 [..............................]
- ETA: 2:03 - loss: 2.0186 - regression_loss: 1.6669 - classification_loss: 0.3517 [... per-batch progress updates for batches 5-259 elided ...] 260/500 [==============>...............]
- ETA: 59s - loss: 2.2569 - regression_loss: 1.8120 - classification_loss: 0.4449  261/500 [==============>...............] - ETA: 59s - loss: 2.2629 - regression_loss: 1.8120 - classification_loss: 0.4509 262/500 [==============>...............] - ETA: 59s - loss: 2.2623 - regression_loss: 1.8116 - classification_loss: 0.4507 263/500 [==============>...............] - ETA: 59s - loss: 2.2622 - regression_loss: 1.8114 - classification_loss: 0.4508 264/500 [==============>...............] - ETA: 58s - loss: 2.2621 - regression_loss: 1.8118 - classification_loss: 0.4504 265/500 [==============>...............] - ETA: 58s - loss: 2.2614 - regression_loss: 1.8113 - classification_loss: 0.4501 266/500 [==============>...............] - ETA: 58s - loss: 2.2590 - regression_loss: 1.8095 - classification_loss: 0.4496 267/500 [===============>..............] - ETA: 58s - loss: 2.2583 - regression_loss: 1.8092 - classification_loss: 0.4491 268/500 [===============>..............] - ETA: 57s - loss: 2.2576 - regression_loss: 1.8088 - classification_loss: 0.4489 269/500 [===============>..............] - ETA: 57s - loss: 2.2609 - regression_loss: 1.8113 - classification_loss: 0.4496 270/500 [===============>..............] - ETA: 57s - loss: 2.2618 - regression_loss: 1.8121 - classification_loss: 0.4498 271/500 [===============>..............] - ETA: 57s - loss: 2.2626 - regression_loss: 1.8128 - classification_loss: 0.4499 272/500 [===============>..............] - ETA: 56s - loss: 2.2627 - regression_loss: 1.8131 - classification_loss: 0.4496 273/500 [===============>..............] - ETA: 56s - loss: 2.2573 - regression_loss: 1.8085 - classification_loss: 0.4489 274/500 [===============>..............] - ETA: 56s - loss: 2.2538 - regression_loss: 1.8060 - classification_loss: 0.4478 275/500 [===============>..............] - ETA: 56s - loss: 2.2535 - regression_loss: 1.8057 - classification_loss: 0.4478 276/500 [===============>..............] 
- ETA: 55s - loss: 2.2531 - regression_loss: 1.8056 - classification_loss: 0.4475 277/500 [===============>..............] - ETA: 55s - loss: 2.2515 - regression_loss: 1.8045 - classification_loss: 0.4470 278/500 [===============>..............] - ETA: 55s - loss: 2.2491 - regression_loss: 1.8024 - classification_loss: 0.4467 279/500 [===============>..............] - ETA: 55s - loss: 2.2495 - regression_loss: 1.8028 - classification_loss: 0.4467 280/500 [===============>..............] - ETA: 54s - loss: 2.2525 - regression_loss: 1.8056 - classification_loss: 0.4470 281/500 [===============>..............] - ETA: 54s - loss: 2.2511 - regression_loss: 1.8046 - classification_loss: 0.4465 282/500 [===============>..............] - ETA: 54s - loss: 2.2489 - regression_loss: 1.8031 - classification_loss: 0.4458 283/500 [===============>..............] - ETA: 54s - loss: 2.2495 - regression_loss: 1.8037 - classification_loss: 0.4458 284/500 [================>.............] - ETA: 53s - loss: 2.2494 - regression_loss: 1.8034 - classification_loss: 0.4459 285/500 [================>.............] - ETA: 53s - loss: 2.2510 - regression_loss: 1.8047 - classification_loss: 0.4463 286/500 [================>.............] - ETA: 53s - loss: 2.2524 - regression_loss: 1.8062 - classification_loss: 0.4462 287/500 [================>.............] - ETA: 53s - loss: 2.2504 - regression_loss: 1.8048 - classification_loss: 0.4456 288/500 [================>.............] - ETA: 52s - loss: 2.2503 - regression_loss: 1.8049 - classification_loss: 0.4454 289/500 [================>.............] - ETA: 52s - loss: 2.2513 - regression_loss: 1.8058 - classification_loss: 0.4455 290/500 [================>.............] - ETA: 52s - loss: 2.2513 - regression_loss: 1.8061 - classification_loss: 0.4452 291/500 [================>.............] - ETA: 52s - loss: 2.2519 - regression_loss: 1.8067 - classification_loss: 0.4452 292/500 [================>.............] 
- ETA: 51s - loss: 2.2531 - regression_loss: 1.8078 - classification_loss: 0.4453 293/500 [================>.............] - ETA: 51s - loss: 2.2535 - regression_loss: 1.8081 - classification_loss: 0.4454 294/500 [================>.............] - ETA: 51s - loss: 2.2531 - regression_loss: 1.8079 - classification_loss: 0.4452 295/500 [================>.............] - ETA: 51s - loss: 2.2530 - regression_loss: 1.8082 - classification_loss: 0.4449 296/500 [================>.............] - ETA: 50s - loss: 2.2542 - regression_loss: 1.8093 - classification_loss: 0.4449 297/500 [================>.............] - ETA: 50s - loss: 2.2543 - regression_loss: 1.8097 - classification_loss: 0.4447 298/500 [================>.............] - ETA: 50s - loss: 2.2513 - regression_loss: 1.8074 - classification_loss: 0.4439 299/500 [================>.............] - ETA: 50s - loss: 2.2528 - regression_loss: 1.8091 - classification_loss: 0.4438 300/500 [=================>............] - ETA: 49s - loss: 2.2517 - regression_loss: 1.8085 - classification_loss: 0.4432 301/500 [=================>............] - ETA: 49s - loss: 2.2514 - regression_loss: 1.8084 - classification_loss: 0.4430 302/500 [=================>............] - ETA: 49s - loss: 2.2519 - regression_loss: 1.8085 - classification_loss: 0.4435 303/500 [=================>............] - ETA: 49s - loss: 2.2528 - regression_loss: 1.8093 - classification_loss: 0.4435 304/500 [=================>............] - ETA: 48s - loss: 2.2524 - regression_loss: 1.8093 - classification_loss: 0.4431 305/500 [=================>............] - ETA: 48s - loss: 2.2530 - regression_loss: 1.8099 - classification_loss: 0.4431 306/500 [=================>............] - ETA: 48s - loss: 2.2519 - regression_loss: 1.8093 - classification_loss: 0.4426 307/500 [=================>............] - ETA: 48s - loss: 2.2507 - regression_loss: 1.8084 - classification_loss: 0.4423 308/500 [=================>............] 
- ETA: 47s - loss: 2.2512 - regression_loss: 1.8089 - classification_loss: 0.4423 309/500 [=================>............] - ETA: 47s - loss: 2.2500 - regression_loss: 1.8081 - classification_loss: 0.4419 310/500 [=================>............] - ETA: 47s - loss: 2.2488 - regression_loss: 1.8072 - classification_loss: 0.4416 311/500 [=================>............] - ETA: 47s - loss: 2.2484 - regression_loss: 1.8069 - classification_loss: 0.4414 312/500 [=================>............] - ETA: 46s - loss: 2.2517 - regression_loss: 1.8095 - classification_loss: 0.4422 313/500 [=================>............] - ETA: 46s - loss: 2.2518 - regression_loss: 1.8096 - classification_loss: 0.4423 314/500 [=================>............] - ETA: 46s - loss: 2.2532 - regression_loss: 1.8108 - classification_loss: 0.4424 315/500 [=================>............] - ETA: 46s - loss: 2.2531 - regression_loss: 1.8108 - classification_loss: 0.4423 316/500 [=================>............] - ETA: 45s - loss: 2.2532 - regression_loss: 1.8101 - classification_loss: 0.4432 317/500 [==================>...........] - ETA: 45s - loss: 2.2514 - regression_loss: 1.8087 - classification_loss: 0.4428 318/500 [==================>...........] - ETA: 45s - loss: 2.2509 - regression_loss: 1.8068 - classification_loss: 0.4441 319/500 [==================>...........] - ETA: 45s - loss: 2.2535 - regression_loss: 1.8093 - classification_loss: 0.4442 320/500 [==================>...........] - ETA: 44s - loss: 2.2540 - regression_loss: 1.8099 - classification_loss: 0.4441 321/500 [==================>...........] - ETA: 44s - loss: 2.2538 - regression_loss: 1.8098 - classification_loss: 0.4440 322/500 [==================>...........] - ETA: 44s - loss: 2.2542 - regression_loss: 1.8102 - classification_loss: 0.4440 323/500 [==================>...........] - ETA: 44s - loss: 2.2546 - regression_loss: 1.8106 - classification_loss: 0.4440 324/500 [==================>...........] 
- ETA: 43s - loss: 2.2527 - regression_loss: 1.8091 - classification_loss: 0.4435 325/500 [==================>...........] - ETA: 43s - loss: 2.2524 - regression_loss: 1.8092 - classification_loss: 0.4432 326/500 [==================>...........] - ETA: 43s - loss: 2.2517 - regression_loss: 1.8088 - classification_loss: 0.4429 327/500 [==================>...........] - ETA: 43s - loss: 2.2506 - regression_loss: 1.8082 - classification_loss: 0.4424 328/500 [==================>...........] - ETA: 42s - loss: 2.2519 - regression_loss: 1.8092 - classification_loss: 0.4427 329/500 [==================>...........] - ETA: 42s - loss: 2.2489 - regression_loss: 1.8070 - classification_loss: 0.4419 330/500 [==================>...........] - ETA: 42s - loss: 2.2492 - regression_loss: 1.8075 - classification_loss: 0.4417 331/500 [==================>...........] - ETA: 42s - loss: 2.2496 - regression_loss: 1.8081 - classification_loss: 0.4415 332/500 [==================>...........] - ETA: 41s - loss: 2.2519 - regression_loss: 1.8103 - classification_loss: 0.4416 333/500 [==================>...........] - ETA: 41s - loss: 2.2529 - regression_loss: 1.8112 - classification_loss: 0.4416 334/500 [===================>..........] - ETA: 41s - loss: 2.2515 - regression_loss: 1.8103 - classification_loss: 0.4411 335/500 [===================>..........] - ETA: 41s - loss: 2.2509 - regression_loss: 1.8102 - classification_loss: 0.4407 336/500 [===================>..........] - ETA: 40s - loss: 2.2515 - regression_loss: 1.8105 - classification_loss: 0.4409 337/500 [===================>..........] - ETA: 40s - loss: 2.2501 - regression_loss: 1.8095 - classification_loss: 0.4406 338/500 [===================>..........] - ETA: 40s - loss: 2.2510 - regression_loss: 1.8102 - classification_loss: 0.4408 339/500 [===================>..........] - ETA: 40s - loss: 2.2505 - regression_loss: 1.8097 - classification_loss: 0.4408 340/500 [===================>..........] 
- ETA: 40s - loss: 2.2504 - regression_loss: 1.8096 - classification_loss: 0.4408 341/500 [===================>..........] - ETA: 39s - loss: 2.2506 - regression_loss: 1.8102 - classification_loss: 0.4404 342/500 [===================>..........] - ETA: 39s - loss: 2.2513 - regression_loss: 1.8108 - classification_loss: 0.4405 343/500 [===================>..........] - ETA: 39s - loss: 2.2492 - regression_loss: 1.8092 - classification_loss: 0.4400 344/500 [===================>..........] - ETA: 39s - loss: 2.2467 - regression_loss: 1.8072 - classification_loss: 0.4395 345/500 [===================>..........] - ETA: 38s - loss: 2.2477 - regression_loss: 1.8082 - classification_loss: 0.4395 346/500 [===================>..........] - ETA: 38s - loss: 2.2465 - regression_loss: 1.8075 - classification_loss: 0.4390 347/500 [===================>..........] - ETA: 38s - loss: 2.2475 - regression_loss: 1.8084 - classification_loss: 0.4391 348/500 [===================>..........] - ETA: 38s - loss: 2.2463 - regression_loss: 1.8076 - classification_loss: 0.4387 349/500 [===================>..........] - ETA: 37s - loss: 2.2472 - regression_loss: 1.8082 - classification_loss: 0.4391 350/500 [====================>.........] - ETA: 37s - loss: 2.2457 - regression_loss: 1.8068 - classification_loss: 0.4388 351/500 [====================>.........] - ETA: 37s - loss: 2.2452 - regression_loss: 1.8066 - classification_loss: 0.4386 352/500 [====================>.........] - ETA: 37s - loss: 2.2441 - regression_loss: 1.8058 - classification_loss: 0.4383 353/500 [====================>.........] - ETA: 36s - loss: 2.2446 - regression_loss: 1.8061 - classification_loss: 0.4385 354/500 [====================>.........] - ETA: 36s - loss: 2.2438 - regression_loss: 1.8054 - classification_loss: 0.4384 355/500 [====================>.........] - ETA: 36s - loss: 2.2436 - regression_loss: 1.8053 - classification_loss: 0.4383 356/500 [====================>.........] 
- ETA: 36s - loss: 2.2428 - regression_loss: 1.8045 - classification_loss: 0.4383 357/500 [====================>.........] - ETA: 35s - loss: 2.2420 - regression_loss: 1.8041 - classification_loss: 0.4380 358/500 [====================>.........] - ETA: 35s - loss: 2.2425 - regression_loss: 1.8046 - classification_loss: 0.4379 359/500 [====================>.........] - ETA: 35s - loss: 2.2431 - regression_loss: 1.8052 - classification_loss: 0.4378 360/500 [====================>.........] - ETA: 35s - loss: 2.2425 - regression_loss: 1.8049 - classification_loss: 0.4376 361/500 [====================>.........] - ETA: 34s - loss: 2.2405 - regression_loss: 1.8035 - classification_loss: 0.4370 362/500 [====================>.........] - ETA: 34s - loss: 2.2389 - regression_loss: 1.8021 - classification_loss: 0.4368 363/500 [====================>.........] - ETA: 34s - loss: 2.2382 - regression_loss: 1.8020 - classification_loss: 0.4362 364/500 [====================>.........] - ETA: 34s - loss: 2.2386 - regression_loss: 1.8026 - classification_loss: 0.4361 365/500 [====================>.........] - ETA: 33s - loss: 2.2374 - regression_loss: 1.8016 - classification_loss: 0.4358 366/500 [====================>.........] - ETA: 33s - loss: 2.2338 - regression_loss: 1.7987 - classification_loss: 0.4350 367/500 [=====================>........] - ETA: 33s - loss: 2.2334 - regression_loss: 1.7986 - classification_loss: 0.4348 368/500 [=====================>........] - ETA: 33s - loss: 2.2332 - regression_loss: 1.7984 - classification_loss: 0.4347 369/500 [=====================>........] - ETA: 32s - loss: 2.2313 - regression_loss: 1.7969 - classification_loss: 0.4343 370/500 [=====================>........] - ETA: 32s - loss: 2.2300 - regression_loss: 1.7957 - classification_loss: 0.4342 371/500 [=====================>........] - ETA: 32s - loss: 2.2294 - regression_loss: 1.7953 - classification_loss: 0.4341 372/500 [=====================>........] 
- ETA: 32s - loss: 2.2284 - regression_loss: 1.7946 - classification_loss: 0.4338 373/500 [=====================>........] - ETA: 31s - loss: 2.2289 - regression_loss: 1.7954 - classification_loss: 0.4336 374/500 [=====================>........] - ETA: 31s - loss: 2.2281 - regression_loss: 1.7948 - classification_loss: 0.4333 375/500 [=====================>........] - ETA: 31s - loss: 2.2283 - regression_loss: 1.7951 - classification_loss: 0.4332 376/500 [=====================>........] - ETA: 30s - loss: 2.2269 - regression_loss: 1.7940 - classification_loss: 0.4329 377/500 [=====================>........] - ETA: 30s - loss: 2.2260 - regression_loss: 1.7934 - classification_loss: 0.4326 378/500 [=====================>........] - ETA: 30s - loss: 2.2263 - regression_loss: 1.7937 - classification_loss: 0.4326 379/500 [=====================>........] - ETA: 30s - loss: 2.2265 - regression_loss: 1.7940 - classification_loss: 0.4325 380/500 [=====================>........] - ETA: 29s - loss: 2.2263 - regression_loss: 1.7939 - classification_loss: 0.4323 381/500 [=====================>........] - ETA: 29s - loss: 2.2268 - regression_loss: 1.7943 - classification_loss: 0.4325 382/500 [=====================>........] - ETA: 29s - loss: 2.2267 - regression_loss: 1.7943 - classification_loss: 0.4324 383/500 [=====================>........] - ETA: 29s - loss: 2.2272 - regression_loss: 1.7950 - classification_loss: 0.4322 384/500 [======================>.......] - ETA: 28s - loss: 2.2246 - regression_loss: 1.7928 - classification_loss: 0.4317 385/500 [======================>.......] - ETA: 28s - loss: 2.2241 - regression_loss: 1.7924 - classification_loss: 0.4317 386/500 [======================>.......] - ETA: 28s - loss: 2.2229 - regression_loss: 1.7915 - classification_loss: 0.4314 387/500 [======================>.......] - ETA: 28s - loss: 2.2238 - regression_loss: 1.7922 - classification_loss: 0.4316 388/500 [======================>.......] 
- ETA: 27s - loss: 2.2240 - regression_loss: 1.7925 - classification_loss: 0.4315 389/500 [======================>.......] - ETA: 27s - loss: 2.2225 - regression_loss: 1.7915 - classification_loss: 0.4310 390/500 [======================>.......] - ETA: 27s - loss: 2.2206 - regression_loss: 1.7900 - classification_loss: 0.4306 391/500 [======================>.......] - ETA: 27s - loss: 2.2202 - regression_loss: 1.7898 - classification_loss: 0.4304 392/500 [======================>.......] - ETA: 26s - loss: 2.2196 - regression_loss: 1.7893 - classification_loss: 0.4303 393/500 [======================>.......] - ETA: 26s - loss: 2.2208 - regression_loss: 1.7900 - classification_loss: 0.4308 394/500 [======================>.......] - ETA: 26s - loss: 2.2189 - regression_loss: 1.7886 - classification_loss: 0.4303 395/500 [======================>.......] - ETA: 26s - loss: 2.2198 - regression_loss: 1.7894 - classification_loss: 0.4304 396/500 [======================>.......] - ETA: 25s - loss: 2.2199 - regression_loss: 1.7896 - classification_loss: 0.4302 397/500 [======================>.......] - ETA: 25s - loss: 2.2198 - regression_loss: 1.7898 - classification_loss: 0.4300 398/500 [======================>.......] - ETA: 25s - loss: 2.2201 - regression_loss: 1.7897 - classification_loss: 0.4305 399/500 [======================>.......] - ETA: 25s - loss: 2.2192 - regression_loss: 1.7891 - classification_loss: 0.4301 400/500 [=======================>......] - ETA: 24s - loss: 2.2185 - regression_loss: 1.7887 - classification_loss: 0.4297 401/500 [=======================>......] - ETA: 24s - loss: 2.2183 - regression_loss: 1.7881 - classification_loss: 0.4302 402/500 [=======================>......] - ETA: 24s - loss: 2.2156 - regression_loss: 1.7859 - classification_loss: 0.4297 403/500 [=======================>......] - ETA: 24s - loss: 2.2163 - regression_loss: 1.7862 - classification_loss: 0.4301 404/500 [=======================>......] 
- ETA: 23s - loss: 2.2162 - regression_loss: 1.7862 - classification_loss: 0.4301 405/500 [=======================>......] - ETA: 23s - loss: 2.2166 - regression_loss: 1.7863 - classification_loss: 0.4302 406/500 [=======================>......] - ETA: 23s - loss: 2.2152 - regression_loss: 1.7854 - classification_loss: 0.4298 407/500 [=======================>......] - ETA: 23s - loss: 2.2145 - regression_loss: 1.7847 - classification_loss: 0.4298 408/500 [=======================>......] - ETA: 22s - loss: 2.2139 - regression_loss: 1.7837 - classification_loss: 0.4302 409/500 [=======================>......] - ETA: 22s - loss: 2.2142 - regression_loss: 1.7842 - classification_loss: 0.4300 410/500 [=======================>......] - ETA: 22s - loss: 2.2127 - regression_loss: 1.7829 - classification_loss: 0.4297 411/500 [=======================>......] - ETA: 22s - loss: 2.2116 - regression_loss: 1.7822 - classification_loss: 0.4294 412/500 [=======================>......] - ETA: 21s - loss: 2.2120 - regression_loss: 1.7826 - classification_loss: 0.4295 413/500 [=======================>......] - ETA: 21s - loss: 2.2117 - regression_loss: 1.7822 - classification_loss: 0.4295 414/500 [=======================>......] - ETA: 21s - loss: 2.2124 - regression_loss: 1.7831 - classification_loss: 0.4293 415/500 [=======================>......] - ETA: 21s - loss: 2.2105 - regression_loss: 1.7815 - classification_loss: 0.4290 416/500 [=======================>......] - ETA: 20s - loss: 2.2108 - regression_loss: 1.7818 - classification_loss: 0.4290 417/500 [========================>.....] - ETA: 20s - loss: 2.2123 - regression_loss: 1.7825 - classification_loss: 0.4298 418/500 [========================>.....] - ETA: 20s - loss: 2.2126 - regression_loss: 1.7825 - classification_loss: 0.4300 419/500 [========================>.....] - ETA: 20s - loss: 2.2122 - regression_loss: 1.7825 - classification_loss: 0.4297 420/500 [========================>.....] 
- ETA: 19s - loss: 2.2127 - regression_loss: 1.7832 - classification_loss: 0.4295 421/500 [========================>.....] - ETA: 19s - loss: 2.2128 - regression_loss: 1.7833 - classification_loss: 0.4295 422/500 [========================>.....] - ETA: 19s - loss: 2.2137 - regression_loss: 1.7834 - classification_loss: 0.4303 423/500 [========================>.....] - ETA: 19s - loss: 2.2133 - regression_loss: 1.7832 - classification_loss: 0.4301 424/500 [========================>.....] - ETA: 18s - loss: 2.2138 - regression_loss: 1.7836 - classification_loss: 0.4301 425/500 [========================>.....] - ETA: 18s - loss: 2.2133 - regression_loss: 1.7834 - classification_loss: 0.4299 426/500 [========================>.....] - ETA: 18s - loss: 2.2136 - regression_loss: 1.7837 - classification_loss: 0.4299 427/500 [========================>.....] - ETA: 18s - loss: 2.2128 - regression_loss: 1.7832 - classification_loss: 0.4296 428/500 [========================>.....] - ETA: 17s - loss: 2.2125 - regression_loss: 1.7827 - classification_loss: 0.4298 429/500 [========================>.....] - ETA: 17s - loss: 2.2121 - regression_loss: 1.7826 - classification_loss: 0.4295 430/500 [========================>.....] - ETA: 17s - loss: 2.2115 - regression_loss: 1.7823 - classification_loss: 0.4293 431/500 [========================>.....] - ETA: 17s - loss: 2.2121 - regression_loss: 1.7827 - classification_loss: 0.4294 432/500 [========================>.....] - ETA: 16s - loss: 2.2119 - regression_loss: 1.7825 - classification_loss: 0.4294 433/500 [========================>.....] - ETA: 16s - loss: 2.2139 - regression_loss: 1.7844 - classification_loss: 0.4295 434/500 [=========================>....] - ETA: 16s - loss: 2.2142 - regression_loss: 1.7837 - classification_loss: 0.4305 435/500 [=========================>....] - ETA: 16s - loss: 2.2139 - regression_loss: 1.7834 - classification_loss: 0.4305 436/500 [=========================>....] 
- ETA: 15s - loss: 2.2143 - regression_loss: 1.7839 - classification_loss: 0.4304 437/500 [=========================>....] - ETA: 15s - loss: 2.2151 - regression_loss: 1.7846 - classification_loss: 0.4305 438/500 [=========================>....] - ETA: 15s - loss: 2.2126 - regression_loss: 1.7828 - classification_loss: 0.4298 439/500 [=========================>....] - ETA: 15s - loss: 2.2123 - regression_loss: 1.7828 - classification_loss: 0.4295 440/500 [=========================>....] - ETA: 14s - loss: 2.2099 - regression_loss: 1.7809 - classification_loss: 0.4290 441/500 [=========================>....] - ETA: 14s - loss: 2.2103 - regression_loss: 1.7814 - classification_loss: 0.4289 442/500 [=========================>....] - ETA: 14s - loss: 2.2170 - regression_loss: 1.7844 - classification_loss: 0.4326 443/500 [=========================>....] - ETA: 14s - loss: 2.2196 - regression_loss: 1.7866 - classification_loss: 0.4330 444/500 [=========================>....] - ETA: 13s - loss: 2.2201 - regression_loss: 1.7874 - classification_loss: 0.4327 445/500 [=========================>....] - ETA: 13s - loss: 2.2212 - regression_loss: 1.7883 - classification_loss: 0.4329 446/500 [=========================>....] - ETA: 13s - loss: 2.2210 - regression_loss: 1.7883 - classification_loss: 0.4328 447/500 [=========================>....] - ETA: 13s - loss: 2.2208 - regression_loss: 1.7881 - classification_loss: 0.4327 448/500 [=========================>....] - ETA: 12s - loss: 2.2202 - regression_loss: 1.7878 - classification_loss: 0.4324 449/500 [=========================>....] - ETA: 12s - loss: 2.2215 - regression_loss: 1.7891 - classification_loss: 0.4324 450/500 [==========================>...] - ETA: 12s - loss: 2.2202 - regression_loss: 1.7882 - classification_loss: 0.4320 451/500 [==========================>...] - ETA: 12s - loss: 2.2195 - regression_loss: 1.7875 - classification_loss: 0.4320 452/500 [==========================>...] 
- ETA: 11s - loss: 2.2194 - regression_loss: 1.7876 - classification_loss: 0.4319 453/500 [==========================>...] - ETA: 11s - loss: 2.2198 - regression_loss: 1.7881 - classification_loss: 0.4317 454/500 [==========================>...] - ETA: 11s - loss: 2.2206 - regression_loss: 1.7886 - classification_loss: 0.4320 455/500 [==========================>...] - ETA: 11s - loss: 2.2196 - regression_loss: 1.7880 - classification_loss: 0.4316 456/500 [==========================>...] - ETA: 10s - loss: 2.2209 - regression_loss: 1.7890 - classification_loss: 0.4319 457/500 [==========================>...] - ETA: 10s - loss: 2.2210 - regression_loss: 1.7891 - classification_loss: 0.4319 458/500 [==========================>...] - ETA: 10s - loss: 2.2223 - regression_loss: 1.7905 - classification_loss: 0.4318 459/500 [==========================>...] - ETA: 10s - loss: 2.2219 - regression_loss: 1.7902 - classification_loss: 0.4317 460/500 [==========================>...] - ETA: 9s - loss: 2.2227 - regression_loss: 1.7910 - classification_loss: 0.4317  461/500 [==========================>...] - ETA: 9s - loss: 2.2211 - regression_loss: 1.7899 - classification_loss: 0.4312 462/500 [==========================>...] - ETA: 9s - loss: 2.2221 - regression_loss: 1.7905 - classification_loss: 0.4315 463/500 [==========================>...] - ETA: 9s - loss: 2.2229 - regression_loss: 1.7914 - classification_loss: 0.4315 464/500 [==========================>...] - ETA: 8s - loss: 2.2239 - regression_loss: 1.7923 - classification_loss: 0.4315 465/500 [==========================>...] - ETA: 8s - loss: 2.2227 - regression_loss: 1.7915 - classification_loss: 0.4312 466/500 [==========================>...] - ETA: 8s - loss: 2.2230 - regression_loss: 1.7918 - classification_loss: 0.4312 467/500 [===========================>..] - ETA: 8s - loss: 2.2218 - regression_loss: 1.7905 - classification_loss: 0.4313 468/500 [===========================>..] 
- ETA: 7s - loss: 2.2194 - regression_loss: 1.7886 - classification_loss: 0.4308 469/500 [===========================>..]
[... per-batch progress updates for batches 470-499 of epoch 27 elided; losses decrease smoothly from ~2.220 to ~2.213 ...]
500/500 [==============================] - 125s 250ms/step - loss: 2.2130 - regression_loss: 1.7857 - classification_loss: 0.4273
1172 instances of class plum with average precision: 0.3796
mAP: 0.3796
Epoch 00027: saving model to ./training/snapshots/resnet50_pascal_27.h5
Epoch 28/150
1/500 [..............................] - ETA: 2:02 - loss: 2.3560 - regression_loss: 1.7512 - classification_loss: 0.6048
[... per-batch progress updates for batches 2-301 of epoch 28 elided; running loss settles around ~2.19 (regression ~1.77, classification ~0.42) ...]
302/500 [=================>............]
- ETA: 49s - loss: 2.1956 - regression_loss: 1.7707 - classification_loss: 0.4249 303/500 [=================>............] - ETA: 49s - loss: 2.1956 - regression_loss: 1.7705 - classification_loss: 0.4251 304/500 [=================>............] - ETA: 49s - loss: 2.1952 - regression_loss: 1.7703 - classification_loss: 0.4250 305/500 [=================>............] - ETA: 48s - loss: 2.1984 - regression_loss: 1.7728 - classification_loss: 0.4256 306/500 [=================>............] - ETA: 48s - loss: 2.1971 - regression_loss: 1.7723 - classification_loss: 0.4248 307/500 [=================>............] - ETA: 48s - loss: 2.1974 - regression_loss: 1.7725 - classification_loss: 0.4249 308/500 [=================>............] - ETA: 48s - loss: 2.1979 - regression_loss: 1.7731 - classification_loss: 0.4248 309/500 [=================>............] - ETA: 47s - loss: 2.1975 - regression_loss: 1.7729 - classification_loss: 0.4246 310/500 [=================>............] - ETA: 47s - loss: 2.1984 - regression_loss: 1.7734 - classification_loss: 0.4250 311/500 [=================>............] - ETA: 47s - loss: 2.1956 - regression_loss: 1.7711 - classification_loss: 0.4245 312/500 [=================>............] - ETA: 47s - loss: 2.1975 - regression_loss: 1.7725 - classification_loss: 0.4250 313/500 [=================>............] - ETA: 46s - loss: 2.1972 - regression_loss: 1.7725 - classification_loss: 0.4247 314/500 [=================>............] - ETA: 46s - loss: 2.1956 - regression_loss: 1.7712 - classification_loss: 0.4243 315/500 [=================>............] - ETA: 46s - loss: 2.1955 - regression_loss: 1.7714 - classification_loss: 0.4242 316/500 [=================>............] - ETA: 46s - loss: 2.1975 - regression_loss: 1.7730 - classification_loss: 0.4245 317/500 [==================>...........] - ETA: 45s - loss: 2.1979 - regression_loss: 1.7731 - classification_loss: 0.4248 318/500 [==================>...........] 
- ETA: 45s - loss: 2.1989 - regression_loss: 1.7740 - classification_loss: 0.4249 319/500 [==================>...........] - ETA: 45s - loss: 2.1990 - regression_loss: 1.7743 - classification_loss: 0.4247 320/500 [==================>...........] - ETA: 45s - loss: 2.1985 - regression_loss: 1.7741 - classification_loss: 0.4245 321/500 [==================>...........] - ETA: 44s - loss: 2.1983 - regression_loss: 1.7740 - classification_loss: 0.4242 322/500 [==================>...........] - ETA: 44s - loss: 2.1953 - regression_loss: 1.7716 - classification_loss: 0.4237 323/500 [==================>...........] - ETA: 44s - loss: 2.1964 - regression_loss: 1.7729 - classification_loss: 0.4236 324/500 [==================>...........] - ETA: 44s - loss: 2.1959 - regression_loss: 1.7725 - classification_loss: 0.4234 325/500 [==================>...........] - ETA: 43s - loss: 2.1978 - regression_loss: 1.7738 - classification_loss: 0.4239 326/500 [==================>...........] - ETA: 43s - loss: 2.1989 - regression_loss: 1.7750 - classification_loss: 0.4239 327/500 [==================>...........] - ETA: 43s - loss: 2.2004 - regression_loss: 1.7759 - classification_loss: 0.4245 328/500 [==================>...........] - ETA: 43s - loss: 2.2014 - regression_loss: 1.7767 - classification_loss: 0.4247 329/500 [==================>...........] - ETA: 42s - loss: 2.2024 - regression_loss: 1.7775 - classification_loss: 0.4249 330/500 [==================>...........] - ETA: 42s - loss: 2.2010 - regression_loss: 1.7764 - classification_loss: 0.4246 331/500 [==================>...........] - ETA: 42s - loss: 2.2005 - regression_loss: 1.7760 - classification_loss: 0.4245 332/500 [==================>...........] - ETA: 42s - loss: 2.2011 - regression_loss: 1.7768 - classification_loss: 0.4244 333/500 [==================>...........] - ETA: 41s - loss: 2.2002 - regression_loss: 1.7762 - classification_loss: 0.4240 334/500 [===================>..........] 
- ETA: 41s - loss: 2.2007 - regression_loss: 1.7767 - classification_loss: 0.4240 335/500 [===================>..........] - ETA: 41s - loss: 2.2035 - regression_loss: 1.7781 - classification_loss: 0.4255 336/500 [===================>..........] - ETA: 41s - loss: 2.2036 - regression_loss: 1.7783 - classification_loss: 0.4253 337/500 [===================>..........] - ETA: 40s - loss: 2.2051 - regression_loss: 1.7795 - classification_loss: 0.4256 338/500 [===================>..........] - ETA: 40s - loss: 2.2052 - regression_loss: 1.7795 - classification_loss: 0.4258 339/500 [===================>..........] - ETA: 40s - loss: 2.2050 - regression_loss: 1.7795 - classification_loss: 0.4255 340/500 [===================>..........] - ETA: 40s - loss: 2.2048 - regression_loss: 1.7793 - classification_loss: 0.4255 341/500 [===================>..........] - ETA: 39s - loss: 2.2065 - regression_loss: 1.7811 - classification_loss: 0.4254 342/500 [===================>..........] - ETA: 39s - loss: 2.2079 - regression_loss: 1.7822 - classification_loss: 0.4257 343/500 [===================>..........] - ETA: 39s - loss: 2.2082 - regression_loss: 1.7826 - classification_loss: 0.4256 344/500 [===================>..........] - ETA: 39s - loss: 2.2075 - regression_loss: 1.7823 - classification_loss: 0.4252 345/500 [===================>..........] - ETA: 38s - loss: 2.2089 - regression_loss: 1.7836 - classification_loss: 0.4253 346/500 [===================>..........] - ETA: 38s - loss: 2.2101 - regression_loss: 1.7844 - classification_loss: 0.4257 347/500 [===================>..........] - ETA: 38s - loss: 2.2105 - regression_loss: 1.7844 - classification_loss: 0.4261 348/500 [===================>..........] - ETA: 38s - loss: 2.2079 - regression_loss: 1.7822 - classification_loss: 0.4257 349/500 [===================>..........] - ETA: 37s - loss: 2.2066 - regression_loss: 1.7813 - classification_loss: 0.4253 350/500 [====================>.........] 
- ETA: 37s - loss: 2.2070 - regression_loss: 1.7817 - classification_loss: 0.4253 351/500 [====================>.........] - ETA: 37s - loss: 2.2072 - regression_loss: 1.7819 - classification_loss: 0.4253 352/500 [====================>.........] - ETA: 37s - loss: 2.2082 - regression_loss: 1.7826 - classification_loss: 0.4256 353/500 [====================>.........] - ETA: 36s - loss: 2.2071 - regression_loss: 1.7818 - classification_loss: 0.4253 354/500 [====================>.........] - ETA: 36s - loss: 2.2067 - regression_loss: 1.7809 - classification_loss: 0.4258 355/500 [====================>.........] - ETA: 36s - loss: 2.2078 - regression_loss: 1.7818 - classification_loss: 0.4259 356/500 [====================>.........] - ETA: 36s - loss: 2.2061 - regression_loss: 1.7804 - classification_loss: 0.4257 357/500 [====================>.........] - ETA: 35s - loss: 2.2058 - regression_loss: 1.7801 - classification_loss: 0.4256 358/500 [====================>.........] - ETA: 35s - loss: 2.2056 - regression_loss: 1.7801 - classification_loss: 0.4255 359/500 [====================>.........] - ETA: 35s - loss: 2.2064 - regression_loss: 1.7809 - classification_loss: 0.4255 360/500 [====================>.........] - ETA: 35s - loss: 2.2064 - regression_loss: 1.7812 - classification_loss: 0.4252 361/500 [====================>.........] - ETA: 34s - loss: 2.2060 - regression_loss: 1.7811 - classification_loss: 0.4249 362/500 [====================>.........] - ETA: 34s - loss: 2.2057 - regression_loss: 1.7809 - classification_loss: 0.4247 363/500 [====================>.........] - ETA: 34s - loss: 2.2067 - regression_loss: 1.7820 - classification_loss: 0.4247 364/500 [====================>.........] - ETA: 34s - loss: 2.2069 - regression_loss: 1.7823 - classification_loss: 0.4246 365/500 [====================>.........] - ETA: 33s - loss: 2.2072 - regression_loss: 1.7828 - classification_loss: 0.4244 366/500 [====================>.........] 
- ETA: 33s - loss: 2.2070 - regression_loss: 1.7828 - classification_loss: 0.4243 367/500 [=====================>........] - ETA: 33s - loss: 2.2081 - regression_loss: 1.7838 - classification_loss: 0.4243 368/500 [=====================>........] - ETA: 33s - loss: 2.2074 - regression_loss: 1.7835 - classification_loss: 0.4239 369/500 [=====================>........] - ETA: 32s - loss: 2.2075 - regression_loss: 1.7838 - classification_loss: 0.4238 370/500 [=====================>........] - ETA: 32s - loss: 2.2070 - regression_loss: 1.7833 - classification_loss: 0.4237 371/500 [=====================>........] - ETA: 32s - loss: 2.2072 - regression_loss: 1.7835 - classification_loss: 0.4237 372/500 [=====================>........] - ETA: 32s - loss: 2.2080 - regression_loss: 1.7844 - classification_loss: 0.4236 373/500 [=====================>........] - ETA: 31s - loss: 2.2077 - regression_loss: 1.7842 - classification_loss: 0.4235 374/500 [=====================>........] - ETA: 31s - loss: 2.2073 - regression_loss: 1.7840 - classification_loss: 0.4233 375/500 [=====================>........] - ETA: 31s - loss: 2.2069 - regression_loss: 1.7838 - classification_loss: 0.4231 376/500 [=====================>........] - ETA: 31s - loss: 2.2070 - regression_loss: 1.7840 - classification_loss: 0.4230 377/500 [=====================>........] - ETA: 30s - loss: 2.2072 - regression_loss: 1.7844 - classification_loss: 0.4228 378/500 [=====================>........] - ETA: 30s - loss: 2.2050 - regression_loss: 1.7829 - classification_loss: 0.4221 379/500 [=====================>........] - ETA: 30s - loss: 2.2050 - regression_loss: 1.7831 - classification_loss: 0.4220 380/500 [=====================>........] - ETA: 30s - loss: 2.2076 - regression_loss: 1.7849 - classification_loss: 0.4228 381/500 [=====================>........] - ETA: 29s - loss: 2.2047 - regression_loss: 1.7827 - classification_loss: 0.4220 382/500 [=====================>........] 
- ETA: 29s - loss: 2.2031 - regression_loss: 1.7815 - classification_loss: 0.4216 383/500 [=====================>........] - ETA: 29s - loss: 2.2031 - regression_loss: 1.7817 - classification_loss: 0.4215 384/500 [======================>.......] - ETA: 29s - loss: 2.2037 - regression_loss: 1.7822 - classification_loss: 0.4216 385/500 [======================>.......] - ETA: 28s - loss: 2.2027 - regression_loss: 1.7815 - classification_loss: 0.4212 386/500 [======================>.......] - ETA: 28s - loss: 2.2018 - regression_loss: 1.7806 - classification_loss: 0.4211 387/500 [======================>.......] - ETA: 28s - loss: 2.2014 - regression_loss: 1.7798 - classification_loss: 0.4215 388/500 [======================>.......] - ETA: 28s - loss: 2.2011 - regression_loss: 1.7802 - classification_loss: 0.4209 389/500 [======================>.......] - ETA: 27s - loss: 2.2022 - regression_loss: 1.7810 - classification_loss: 0.4212 390/500 [======================>.......] - ETA: 27s - loss: 2.2003 - regression_loss: 1.7795 - classification_loss: 0.4208 391/500 [======================>.......] - ETA: 27s - loss: 2.2017 - regression_loss: 1.7807 - classification_loss: 0.4210 392/500 [======================>.......] - ETA: 27s - loss: 2.2021 - regression_loss: 1.7811 - classification_loss: 0.4211 393/500 [======================>.......] - ETA: 26s - loss: 2.2008 - regression_loss: 1.7798 - classification_loss: 0.4209 394/500 [======================>.......] - ETA: 26s - loss: 2.2004 - regression_loss: 1.7797 - classification_loss: 0.4208 395/500 [======================>.......] - ETA: 26s - loss: 2.2005 - regression_loss: 1.7798 - classification_loss: 0.4207 396/500 [======================>.......] - ETA: 26s - loss: 2.1996 - regression_loss: 1.7791 - classification_loss: 0.4205 397/500 [======================>.......] - ETA: 25s - loss: 2.1996 - regression_loss: 1.7791 - classification_loss: 0.4205 398/500 [======================>.......] 
- ETA: 25s - loss: 2.1998 - regression_loss: 1.7793 - classification_loss: 0.4205 399/500 [======================>.......] - ETA: 25s - loss: 2.2002 - regression_loss: 1.7795 - classification_loss: 0.4207 400/500 [=======================>......] - ETA: 25s - loss: 2.1992 - regression_loss: 1.7791 - classification_loss: 0.4202 401/500 [=======================>......] - ETA: 24s - loss: 2.1990 - regression_loss: 1.7792 - classification_loss: 0.4198 402/500 [=======================>......] - ETA: 24s - loss: 2.1994 - regression_loss: 1.7796 - classification_loss: 0.4199 403/500 [=======================>......] - ETA: 24s - loss: 2.1986 - regression_loss: 1.7790 - classification_loss: 0.4196 404/500 [=======================>......] - ETA: 24s - loss: 2.1992 - regression_loss: 1.7795 - classification_loss: 0.4197 405/500 [=======================>......] - ETA: 23s - loss: 2.1982 - regression_loss: 1.7788 - classification_loss: 0.4194 406/500 [=======================>......] - ETA: 23s - loss: 2.1989 - regression_loss: 1.7792 - classification_loss: 0.4197 407/500 [=======================>......] - ETA: 23s - loss: 2.1986 - regression_loss: 1.7792 - classification_loss: 0.4194 408/500 [=======================>......] - ETA: 23s - loss: 2.1976 - regression_loss: 1.7785 - classification_loss: 0.4191 409/500 [=======================>......] - ETA: 22s - loss: 2.1973 - regression_loss: 1.7784 - classification_loss: 0.4190 410/500 [=======================>......] - ETA: 22s - loss: 2.1970 - regression_loss: 1.7774 - classification_loss: 0.4197 411/500 [=======================>......] - ETA: 22s - loss: 2.1952 - regression_loss: 1.7760 - classification_loss: 0.4192 412/500 [=======================>......] - ETA: 22s - loss: 2.1975 - regression_loss: 1.7781 - classification_loss: 0.4194 413/500 [=======================>......] - ETA: 21s - loss: 2.1983 - regression_loss: 1.7786 - classification_loss: 0.4197 414/500 [=======================>......] 
- ETA: 21s - loss: 2.1968 - regression_loss: 1.7770 - classification_loss: 0.4197 415/500 [=======================>......] - ETA: 21s - loss: 2.1962 - regression_loss: 1.7766 - classification_loss: 0.4195 416/500 [=======================>......] - ETA: 21s - loss: 2.1961 - regression_loss: 1.7766 - classification_loss: 0.4195 417/500 [========================>.....] - ETA: 20s - loss: 2.1969 - regression_loss: 1.7775 - classification_loss: 0.4195 418/500 [========================>.....] - ETA: 20s - loss: 2.1968 - regression_loss: 1.7775 - classification_loss: 0.4193 419/500 [========================>.....] - ETA: 20s - loss: 2.1962 - regression_loss: 1.7771 - classification_loss: 0.4190 420/500 [========================>.....] - ETA: 20s - loss: 2.1963 - regression_loss: 1.7776 - classification_loss: 0.4187 421/500 [========================>.....] - ETA: 19s - loss: 2.1952 - regression_loss: 1.7769 - classification_loss: 0.4184 422/500 [========================>.....] - ETA: 19s - loss: 2.1921 - regression_loss: 1.7744 - classification_loss: 0.4176 423/500 [========================>.....] - ETA: 19s - loss: 2.1922 - regression_loss: 1.7748 - classification_loss: 0.4174 424/500 [========================>.....] - ETA: 19s - loss: 2.1918 - regression_loss: 1.7743 - classification_loss: 0.4175 425/500 [========================>.....] - ETA: 18s - loss: 2.1922 - regression_loss: 1.7745 - classification_loss: 0.4177 426/500 [========================>.....] - ETA: 18s - loss: 2.1921 - regression_loss: 1.7745 - classification_loss: 0.4176 427/500 [========================>.....] - ETA: 18s - loss: 2.1919 - regression_loss: 1.7745 - classification_loss: 0.4174 428/500 [========================>.....] - ETA: 18s - loss: 2.1912 - regression_loss: 1.7739 - classification_loss: 0.4173 429/500 [========================>.....] - ETA: 17s - loss: 2.1920 - regression_loss: 1.7734 - classification_loss: 0.4186 430/500 [========================>.....] 
- ETA: 17s - loss: 2.1898 - regression_loss: 1.7715 - classification_loss: 0.4183 431/500 [========================>.....] - ETA: 17s - loss: 2.1885 - regression_loss: 1.7706 - classification_loss: 0.4179 432/500 [========================>.....] - ETA: 17s - loss: 2.1884 - regression_loss: 1.7705 - classification_loss: 0.4179 433/500 [========================>.....] - ETA: 16s - loss: 2.1887 - regression_loss: 1.7709 - classification_loss: 0.4178 434/500 [=========================>....] - ETA: 16s - loss: 2.1889 - regression_loss: 1.7711 - classification_loss: 0.4177 435/500 [=========================>....] - ETA: 16s - loss: 2.1881 - regression_loss: 1.7701 - classification_loss: 0.4181 436/500 [=========================>....] - ETA: 16s - loss: 2.1882 - regression_loss: 1.7699 - classification_loss: 0.4183 437/500 [=========================>....] - ETA: 15s - loss: 2.1861 - regression_loss: 1.7681 - classification_loss: 0.4181 438/500 [=========================>....] - ETA: 15s - loss: 2.1870 - regression_loss: 1.7682 - classification_loss: 0.4187 439/500 [=========================>....] - ETA: 15s - loss: 2.1892 - regression_loss: 1.7701 - classification_loss: 0.4191 440/500 [=========================>....] - ETA: 15s - loss: 2.1897 - regression_loss: 1.7704 - classification_loss: 0.4193 441/500 [=========================>....] - ETA: 14s - loss: 2.1909 - regression_loss: 1.7713 - classification_loss: 0.4196 442/500 [=========================>....] - ETA: 14s - loss: 2.1917 - regression_loss: 1.7719 - classification_loss: 0.4198 443/500 [=========================>....] - ETA: 14s - loss: 2.1925 - regression_loss: 1.7726 - classification_loss: 0.4199 444/500 [=========================>....] - ETA: 14s - loss: 2.1945 - regression_loss: 1.7741 - classification_loss: 0.4205 445/500 [=========================>....] - ETA: 13s - loss: 2.1962 - regression_loss: 1.7754 - classification_loss: 0.4208 446/500 [=========================>....] 
- ETA: 13s - loss: 2.1994 - regression_loss: 1.7756 - classification_loss: 0.4239 447/500 [=========================>....] - ETA: 13s - loss: 2.1993 - regression_loss: 1.7755 - classification_loss: 0.4238 448/500 [=========================>....] - ETA: 13s - loss: 2.1986 - regression_loss: 1.7751 - classification_loss: 0.4235 449/500 [=========================>....] - ETA: 12s - loss: 2.1980 - regression_loss: 1.7747 - classification_loss: 0.4232 450/500 [==========================>...] - ETA: 12s - loss: 2.1973 - regression_loss: 1.7743 - classification_loss: 0.4230 451/500 [==========================>...] - ETA: 12s - loss: 2.1955 - regression_loss: 1.7729 - classification_loss: 0.4227 452/500 [==========================>...] - ETA: 12s - loss: 2.1946 - regression_loss: 1.7721 - classification_loss: 0.4225 453/500 [==========================>...] - ETA: 11s - loss: 2.1948 - regression_loss: 1.7724 - classification_loss: 0.4224 454/500 [==========================>...] - ETA: 11s - loss: 2.1949 - regression_loss: 1.7727 - classification_loss: 0.4222 455/500 [==========================>...] - ETA: 11s - loss: 2.1948 - regression_loss: 1.7726 - classification_loss: 0.4222 456/500 [==========================>...] - ETA: 11s - loss: 2.1951 - regression_loss: 1.7730 - classification_loss: 0.4221 457/500 [==========================>...] - ETA: 10s - loss: 2.1969 - regression_loss: 1.7743 - classification_loss: 0.4225 458/500 [==========================>...] - ETA: 10s - loss: 2.1963 - regression_loss: 1.7740 - classification_loss: 0.4223 459/500 [==========================>...] - ETA: 10s - loss: 2.1966 - regression_loss: 1.7741 - classification_loss: 0.4224 460/500 [==========================>...] - ETA: 10s - loss: 2.1951 - regression_loss: 1.7730 - classification_loss: 0.4221 461/500 [==========================>...] - ETA: 9s - loss: 2.1951 - regression_loss: 1.7732 - classification_loss: 0.4219  462/500 [==========================>...] 
- ETA: 9s - loss: 2.1937 - regression_loss: 1.7720 - classification_loss: 0.4217 463/500 [==========================>...] - ETA: 9s - loss: 2.1934 - regression_loss: 1.7719 - classification_loss: 0.4215 464/500 [==========================>...] - ETA: 9s - loss: 2.1922 - regression_loss: 1.7710 - classification_loss: 0.4212 465/500 [==========================>...] - ETA: 8s - loss: 2.1955 - regression_loss: 1.7736 - classification_loss: 0.4219 466/500 [==========================>...] - ETA: 8s - loss: 2.1954 - regression_loss: 1.7736 - classification_loss: 0.4218 467/500 [===========================>..] - ETA: 8s - loss: 2.1963 - regression_loss: 1.7739 - classification_loss: 0.4224 468/500 [===========================>..] - ETA: 8s - loss: 2.1966 - regression_loss: 1.7743 - classification_loss: 0.4224 469/500 [===========================>..] - ETA: 7s - loss: 2.1972 - regression_loss: 1.7747 - classification_loss: 0.4225 470/500 [===========================>..] - ETA: 7s - loss: 2.1972 - regression_loss: 1.7748 - classification_loss: 0.4224 471/500 [===========================>..] - ETA: 7s - loss: 2.1957 - regression_loss: 1.7735 - classification_loss: 0.4221 472/500 [===========================>..] - ETA: 7s - loss: 2.1947 - regression_loss: 1.7727 - classification_loss: 0.4220 473/500 [===========================>..] - ETA: 6s - loss: 2.1943 - regression_loss: 1.7724 - classification_loss: 0.4219 474/500 [===========================>..] - ETA: 6s - loss: 2.1942 - regression_loss: 1.7723 - classification_loss: 0.4218 475/500 [===========================>..] - ETA: 6s - loss: 2.1936 - regression_loss: 1.7719 - classification_loss: 0.4217 476/500 [===========================>..] - ETA: 6s - loss: 2.1932 - regression_loss: 1.7717 - classification_loss: 0.4215 477/500 [===========================>..] - ETA: 5s - loss: 2.1932 - regression_loss: 1.7719 - classification_loss: 0.4213 478/500 [===========================>..] 
- ETA: 5s - loss: 2.1945 - regression_loss: 1.7711 - classification_loss: 0.4234 479/500 [===========================>..] - ETA: 5s - loss: 2.1928 - regression_loss: 1.7698 - classification_loss: 0.4230 480/500 [===========================>..] - ETA: 5s - loss: 2.1925 - regression_loss: 1.7695 - classification_loss: 0.4230 481/500 [===========================>..] - ETA: 4s - loss: 2.1919 - regression_loss: 1.7690 - classification_loss: 0.4229 482/500 [===========================>..] - ETA: 4s - loss: 2.1908 - regression_loss: 1.7684 - classification_loss: 0.4224 483/500 [===========================>..] - ETA: 4s - loss: 2.1910 - regression_loss: 1.7686 - classification_loss: 0.4224 484/500 [============================>.] - ETA: 4s - loss: 2.1903 - regression_loss: 1.7681 - classification_loss: 0.4222 485/500 [============================>.] - ETA: 3s - loss: 2.1933 - regression_loss: 1.7703 - classification_loss: 0.4230 486/500 [============================>.] - ETA: 3s - loss: 2.1910 - regression_loss: 1.7684 - classification_loss: 0.4226 487/500 [============================>.] - ETA: 3s - loss: 2.1903 - regression_loss: 1.7682 - classification_loss: 0.4221 488/500 [============================>.] - ETA: 3s - loss: 2.1885 - regression_loss: 1.7667 - classification_loss: 0.4218 489/500 [============================>.] - ETA: 2s - loss: 2.1885 - regression_loss: 1.7668 - classification_loss: 0.4217 490/500 [============================>.] - ETA: 2s - loss: 2.1886 - regression_loss: 1.7672 - classification_loss: 0.4215 491/500 [============================>.] - ETA: 2s - loss: 2.1880 - regression_loss: 1.7669 - classification_loss: 0.4211 492/500 [============================>.] - ETA: 2s - loss: 2.1878 - regression_loss: 1.7667 - classification_loss: 0.4211 493/500 [============================>.] - ETA: 1s - loss: 2.1872 - regression_loss: 1.7663 - classification_loss: 0.4209 494/500 [============================>.] 
- ETA: 1s - loss: 2.1864 - regression_loss: 1.7657 - classification_loss: 0.4207 495/500 [============================>.] - ETA: 1s - loss: 2.1867 - regression_loss: 1.7660 - classification_loss: 0.4207 496/500 [============================>.] - ETA: 1s - loss: 2.1879 - regression_loss: 1.7672 - classification_loss: 0.4207 497/500 [============================>.] - ETA: 0s - loss: 2.1881 - regression_loss: 1.7674 - classification_loss: 0.4207 498/500 [============================>.] - ETA: 0s - loss: 2.1883 - regression_loss: 1.7676 - classification_loss: 0.4207 499/500 [============================>.] - ETA: 0s - loss: 2.1888 - regression_loss: 1.7676 - classification_loss: 0.4211 500/500 [==============================] - 125s 250ms/step - loss: 2.1868 - regression_loss: 1.7661 - classification_loss: 0.4207 1172 instances of class plum with average precision: 0.4932 mAP: 0.4932 Epoch 00028: saving model to ./training/snapshots/resnet50_pascal_28.h5 Epoch 29/150 1/500 [..............................] - ETA: 1:59 - loss: 2.5103 - regression_loss: 2.0474 - classification_loss: 0.4629 2/500 [..............................] - ETA: 2:01 - loss: 2.3240 - regression_loss: 1.9042 - classification_loss: 0.4198 3/500 [..............................] - ETA: 2:02 - loss: 2.3981 - regression_loss: 1.9943 - classification_loss: 0.4038 4/500 [..............................] - ETA: 2:05 - loss: 2.7467 - regression_loss: 2.0444 - classification_loss: 0.7023 5/500 [..............................] - ETA: 2:06 - loss: 2.6328 - regression_loss: 1.9961 - classification_loss: 0.6368 6/500 [..............................] - ETA: 2:06 - loss: 2.5271 - regression_loss: 1.9340 - classification_loss: 0.5931 7/500 [..............................] - ETA: 2:06 - loss: 2.4147 - regression_loss: 1.8669 - classification_loss: 0.5478 8/500 [..............................] - ETA: 2:05 - loss: 2.3915 - regression_loss: 1.8627 - classification_loss: 0.5287 9/500 [..............................] 
- ETA: 2:05 - loss: 2.4083 - regression_loss: 1.8834 - classification_loss: 0.5249
 10/500 [..............................] - ETA: 2:05 - loss: 2.3368 - regression_loss: 1.8379 - classification_loss: 0.4989
 50/500 [==>...........................] - ETA: 1:53 - loss: 2.1852 - regression_loss: 1.7655 - classification_loss: 0.4197
100/500 [=====>........................] - ETA: 1:40 - loss: 2.1923 - regression_loss: 1.7827 - classification_loss: 0.4096
150/500 [========>.....................] - ETA: 1:27 - loss: 2.1706 - regression_loss: 1.7620 - classification_loss: 0.4087
200/500 [===========>..................] - ETA: 1:15 - loss: 2.1507 - regression_loss: 1.7482 - classification_loss: 0.4025
250/500 [==============>...............] - ETA: 1:02 - loss: 2.1749 - regression_loss: 1.7665 - classification_loss: 0.4084
300/500 [=================>............] - ETA: 50s - loss: 2.1607 - regression_loss: 1.7551 - classification_loss: 0.4055
344/500 [===================>..........] - ETA: 39s - loss: 2.1505 - regression_loss: 1.7485 - classification_loss: 0.4020
345/500 [===================>..........]
- ETA: 38s - loss: 2.1496 - regression_loss: 1.7479 - classification_loss: 0.4017 346/500 [===================>..........] - ETA: 38s - loss: 2.1499 - regression_loss: 1.7482 - classification_loss: 0.4018 347/500 [===================>..........] - ETA: 38s - loss: 2.1485 - regression_loss: 1.7471 - classification_loss: 0.4014 348/500 [===================>..........] - ETA: 38s - loss: 2.1488 - regression_loss: 1.7474 - classification_loss: 0.4015 349/500 [===================>..........] - ETA: 37s - loss: 2.1486 - regression_loss: 1.7473 - classification_loss: 0.4013 350/500 [====================>.........] - ETA: 37s - loss: 2.1474 - regression_loss: 1.7465 - classification_loss: 0.4010 351/500 [====================>.........] - ETA: 37s - loss: 2.1471 - regression_loss: 1.7463 - classification_loss: 0.4008 352/500 [====================>.........] - ETA: 36s - loss: 2.1476 - regression_loss: 1.7469 - classification_loss: 0.4006 353/500 [====================>.........] - ETA: 36s - loss: 2.1495 - regression_loss: 1.7490 - classification_loss: 0.4005 354/500 [====================>.........] - ETA: 36s - loss: 2.1495 - regression_loss: 1.7491 - classification_loss: 0.4004 355/500 [====================>.........] - ETA: 36s - loss: 2.1502 - regression_loss: 1.7498 - classification_loss: 0.4004 356/500 [====================>.........] - ETA: 35s - loss: 2.1513 - regression_loss: 1.7502 - classification_loss: 0.4011 357/500 [====================>.........] - ETA: 35s - loss: 2.1515 - regression_loss: 1.7503 - classification_loss: 0.4012 358/500 [====================>.........] - ETA: 35s - loss: 2.1510 - regression_loss: 1.7501 - classification_loss: 0.4009 359/500 [====================>.........] - ETA: 35s - loss: 2.1521 - regression_loss: 1.7510 - classification_loss: 0.4011 360/500 [====================>.........] - ETA: 34s - loss: 2.1533 - regression_loss: 1.7519 - classification_loss: 0.4013 361/500 [====================>.........] 
- ETA: 34s - loss: 2.1551 - regression_loss: 1.7536 - classification_loss: 0.4015 362/500 [====================>.........] - ETA: 34s - loss: 2.1552 - regression_loss: 1.7523 - classification_loss: 0.4028 363/500 [====================>.........] - ETA: 34s - loss: 2.1572 - regression_loss: 1.7538 - classification_loss: 0.4035 364/500 [====================>.........] - ETA: 34s - loss: 2.1567 - regression_loss: 1.7533 - classification_loss: 0.4034 365/500 [====================>.........] - ETA: 33s - loss: 2.1569 - regression_loss: 1.7534 - classification_loss: 0.4035 366/500 [====================>.........] - ETA: 33s - loss: 2.1559 - regression_loss: 1.7527 - classification_loss: 0.4032 367/500 [=====================>........] - ETA: 33s - loss: 2.1567 - regression_loss: 1.7531 - classification_loss: 0.4036 368/500 [=====================>........] - ETA: 33s - loss: 2.1554 - regression_loss: 1.7518 - classification_loss: 0.4035 369/500 [=====================>........] - ETA: 32s - loss: 2.1530 - regression_loss: 1.7500 - classification_loss: 0.4029 370/500 [=====================>........] - ETA: 32s - loss: 2.1541 - regression_loss: 1.7507 - classification_loss: 0.4034 371/500 [=====================>........] - ETA: 32s - loss: 2.1535 - regression_loss: 1.7502 - classification_loss: 0.4033 372/500 [=====================>........] - ETA: 32s - loss: 2.1539 - regression_loss: 1.7507 - classification_loss: 0.4032 373/500 [=====================>........] - ETA: 31s - loss: 2.1532 - regression_loss: 1.7503 - classification_loss: 0.4029 374/500 [=====================>........] - ETA: 31s - loss: 2.1535 - regression_loss: 1.7504 - classification_loss: 0.4031 375/500 [=====================>........] - ETA: 31s - loss: 2.1518 - regression_loss: 1.7494 - classification_loss: 0.4025 376/500 [=====================>........] - ETA: 31s - loss: 2.1507 - regression_loss: 1.7484 - classification_loss: 0.4023 377/500 [=====================>........] 
- ETA: 30s - loss: 2.1487 - regression_loss: 1.7469 - classification_loss: 0.4018 378/500 [=====================>........] - ETA: 30s - loss: 2.1513 - regression_loss: 1.7490 - classification_loss: 0.4023 379/500 [=====================>........] - ETA: 30s - loss: 2.1523 - regression_loss: 1.7500 - classification_loss: 0.4023 380/500 [=====================>........] - ETA: 30s - loss: 2.1527 - regression_loss: 1.7504 - classification_loss: 0.4023 381/500 [=====================>........] - ETA: 29s - loss: 2.1536 - regression_loss: 1.7506 - classification_loss: 0.4030 382/500 [=====================>........] - ETA: 29s - loss: 2.1530 - regression_loss: 1.7501 - classification_loss: 0.4028 383/500 [=====================>........] - ETA: 29s - loss: 2.1526 - regression_loss: 1.7498 - classification_loss: 0.4028 384/500 [======================>.......] - ETA: 29s - loss: 2.1497 - regression_loss: 1.7475 - classification_loss: 0.4022 385/500 [======================>.......] - ETA: 28s - loss: 2.1494 - regression_loss: 1.7472 - classification_loss: 0.4022 386/500 [======================>.......] - ETA: 28s - loss: 2.1490 - regression_loss: 1.7470 - classification_loss: 0.4020 387/500 [======================>.......] - ETA: 28s - loss: 2.1482 - regression_loss: 1.7464 - classification_loss: 0.4018 388/500 [======================>.......] - ETA: 28s - loss: 2.1481 - regression_loss: 1.7464 - classification_loss: 0.4017 389/500 [======================>.......] - ETA: 27s - loss: 2.1497 - regression_loss: 1.7476 - classification_loss: 0.4021 390/500 [======================>.......] - ETA: 27s - loss: 2.1488 - regression_loss: 1.7470 - classification_loss: 0.4019 391/500 [======================>.......] - ETA: 27s - loss: 2.1498 - regression_loss: 1.7480 - classification_loss: 0.4018 392/500 [======================>.......] - ETA: 27s - loss: 2.1486 - regression_loss: 1.7471 - classification_loss: 0.4014 393/500 [======================>.......] 
- ETA: 26s - loss: 2.1480 - regression_loss: 1.7468 - classification_loss: 0.4012 394/500 [======================>.......] - ETA: 26s - loss: 2.1472 - regression_loss: 1.7461 - classification_loss: 0.4011 395/500 [======================>.......] - ETA: 26s - loss: 2.1479 - regression_loss: 1.7469 - classification_loss: 0.4010 396/500 [======================>.......] - ETA: 26s - loss: 2.1476 - regression_loss: 1.7468 - classification_loss: 0.4008 397/500 [======================>.......] - ETA: 25s - loss: 2.1488 - regression_loss: 1.7475 - classification_loss: 0.4013 398/500 [======================>.......] - ETA: 25s - loss: 2.1496 - regression_loss: 1.7482 - classification_loss: 0.4014 399/500 [======================>.......] - ETA: 25s - loss: 2.1494 - regression_loss: 1.7481 - classification_loss: 0.4014 400/500 [=======================>......] - ETA: 25s - loss: 2.1500 - regression_loss: 1.7487 - classification_loss: 0.4013 401/500 [=======================>......] - ETA: 24s - loss: 2.1507 - regression_loss: 1.7493 - classification_loss: 0.4014 402/500 [=======================>......] - ETA: 24s - loss: 2.1499 - regression_loss: 1.7487 - classification_loss: 0.4012 403/500 [=======================>......] - ETA: 24s - loss: 2.1499 - regression_loss: 1.7489 - classification_loss: 0.4011 404/500 [=======================>......] - ETA: 24s - loss: 2.1498 - regression_loss: 1.7488 - classification_loss: 0.4010 405/500 [=======================>......] - ETA: 23s - loss: 2.1489 - regression_loss: 1.7479 - classification_loss: 0.4010 406/500 [=======================>......] - ETA: 23s - loss: 2.1460 - regression_loss: 1.7449 - classification_loss: 0.4011 407/500 [=======================>......] - ETA: 23s - loss: 2.1481 - regression_loss: 1.7463 - classification_loss: 0.4017 408/500 [=======================>......] - ETA: 23s - loss: 2.1499 - regression_loss: 1.7475 - classification_loss: 0.4024 409/500 [=======================>......] 
- ETA: 22s - loss: 2.1509 - regression_loss: 1.7481 - classification_loss: 0.4028 410/500 [=======================>......] - ETA: 22s - loss: 2.1512 - regression_loss: 1.7485 - classification_loss: 0.4027 411/500 [=======================>......] - ETA: 22s - loss: 2.1505 - regression_loss: 1.7480 - classification_loss: 0.4026 412/500 [=======================>......] - ETA: 22s - loss: 2.1504 - regression_loss: 1.7480 - classification_loss: 0.4025 413/500 [=======================>......] - ETA: 21s - loss: 2.1505 - regression_loss: 1.7482 - classification_loss: 0.4023 414/500 [=======================>......] - ETA: 21s - loss: 2.1527 - regression_loss: 1.7500 - classification_loss: 0.4027 415/500 [=======================>......] - ETA: 21s - loss: 2.1536 - regression_loss: 1.7505 - classification_loss: 0.4031 416/500 [=======================>......] - ETA: 21s - loss: 2.1549 - regression_loss: 1.7515 - classification_loss: 0.4034 417/500 [========================>.....] - ETA: 20s - loss: 2.1557 - regression_loss: 1.7520 - classification_loss: 0.4038 418/500 [========================>.....] - ETA: 20s - loss: 2.1565 - regression_loss: 1.7523 - classification_loss: 0.4042 419/500 [========================>.....] - ETA: 20s - loss: 2.1568 - regression_loss: 1.7525 - classification_loss: 0.4042 420/500 [========================>.....] - ETA: 20s - loss: 2.1574 - regression_loss: 1.7534 - classification_loss: 0.4039 421/500 [========================>.....] - ETA: 19s - loss: 2.1569 - regression_loss: 1.7531 - classification_loss: 0.4038 422/500 [========================>.....] - ETA: 19s - loss: 2.1573 - regression_loss: 1.7534 - classification_loss: 0.4038 423/500 [========================>.....] - ETA: 19s - loss: 2.1575 - regression_loss: 1.7534 - classification_loss: 0.4041 424/500 [========================>.....] - ETA: 19s - loss: 2.1556 - regression_loss: 1.7519 - classification_loss: 0.4037 425/500 [========================>.....] 
- ETA: 18s - loss: 2.1553 - regression_loss: 1.7518 - classification_loss: 0.4036 426/500 [========================>.....] - ETA: 18s - loss: 2.1560 - regression_loss: 1.7523 - classification_loss: 0.4037 427/500 [========================>.....] - ETA: 18s - loss: 2.1578 - regression_loss: 1.7538 - classification_loss: 0.4040 428/500 [========================>.....] - ETA: 18s - loss: 2.1577 - regression_loss: 1.7539 - classification_loss: 0.4038 429/500 [========================>.....] - ETA: 17s - loss: 2.1574 - regression_loss: 1.7538 - classification_loss: 0.4036 430/500 [========================>.....] - ETA: 17s - loss: 2.1568 - regression_loss: 1.7534 - classification_loss: 0.4035 431/500 [========================>.....] - ETA: 17s - loss: 2.1583 - regression_loss: 1.7547 - classification_loss: 0.4036 432/500 [========================>.....] - ETA: 17s - loss: 2.1584 - regression_loss: 1.7547 - classification_loss: 0.4037 433/500 [========================>.....] - ETA: 16s - loss: 2.1588 - regression_loss: 1.7550 - classification_loss: 0.4037 434/500 [=========================>....] - ETA: 16s - loss: 2.1586 - regression_loss: 1.7550 - classification_loss: 0.4036 435/500 [=========================>....] - ETA: 16s - loss: 2.1591 - regression_loss: 1.7556 - classification_loss: 0.4036 436/500 [=========================>....] - ETA: 15s - loss: 2.1612 - regression_loss: 1.7572 - classification_loss: 0.4040 437/500 [=========================>....] - ETA: 15s - loss: 2.1597 - regression_loss: 1.7561 - classification_loss: 0.4036 438/500 [=========================>....] - ETA: 15s - loss: 2.1621 - regression_loss: 1.7568 - classification_loss: 0.4053 439/500 [=========================>....] - ETA: 15s - loss: 2.1619 - regression_loss: 1.7567 - classification_loss: 0.4052 440/500 [=========================>....] - ETA: 14s - loss: 2.1608 - regression_loss: 1.7560 - classification_loss: 0.4048 441/500 [=========================>....] 
- ETA: 14s - loss: 2.1605 - regression_loss: 1.7558 - classification_loss: 0.4047 442/500 [=========================>....] - ETA: 14s - loss: 2.1593 - regression_loss: 1.7549 - classification_loss: 0.4044 443/500 [=========================>....] - ETA: 14s - loss: 2.1591 - regression_loss: 1.7547 - classification_loss: 0.4043 444/500 [=========================>....] - ETA: 13s - loss: 2.1604 - regression_loss: 1.7555 - classification_loss: 0.4049 445/500 [=========================>....] - ETA: 13s - loss: 2.1599 - regression_loss: 1.7553 - classification_loss: 0.4047 446/500 [=========================>....] - ETA: 13s - loss: 2.1600 - regression_loss: 1.7555 - classification_loss: 0.4046 447/500 [=========================>....] - ETA: 13s - loss: 2.1597 - regression_loss: 1.7553 - classification_loss: 0.4044 448/500 [=========================>....] - ETA: 12s - loss: 2.1583 - regression_loss: 1.7541 - classification_loss: 0.4042 449/500 [=========================>....] - ETA: 12s - loss: 2.1592 - regression_loss: 1.7552 - classification_loss: 0.4041 450/500 [==========================>...] - ETA: 12s - loss: 2.1603 - regression_loss: 1.7564 - classification_loss: 0.4039 451/500 [==========================>...] - ETA: 12s - loss: 2.1609 - regression_loss: 1.7570 - classification_loss: 0.4039 452/500 [==========================>...] - ETA: 11s - loss: 2.1604 - regression_loss: 1.7567 - classification_loss: 0.4037 453/500 [==========================>...] - ETA: 11s - loss: 2.1603 - regression_loss: 1.7566 - classification_loss: 0.4037 454/500 [==========================>...] - ETA: 11s - loss: 2.1610 - regression_loss: 1.7574 - classification_loss: 0.4036 455/500 [==========================>...] - ETA: 11s - loss: 2.1608 - regression_loss: 1.7574 - classification_loss: 0.4034 456/500 [==========================>...] - ETA: 10s - loss: 2.1613 - regression_loss: 1.7579 - classification_loss: 0.4034 457/500 [==========================>...] 
- ETA: 10s - loss: 2.1601 - regression_loss: 1.7570 - classification_loss: 0.4031 458/500 [==========================>...] - ETA: 10s - loss: 2.1600 - regression_loss: 1.7571 - classification_loss: 0.4030 459/500 [==========================>...] - ETA: 10s - loss: 2.1609 - regression_loss: 1.7579 - classification_loss: 0.4029 460/500 [==========================>...] - ETA: 10s - loss: 2.1615 - regression_loss: 1.7584 - classification_loss: 0.4031 461/500 [==========================>...] - ETA: 9s - loss: 2.1610 - regression_loss: 1.7581 - classification_loss: 0.4030  462/500 [==========================>...] - ETA: 9s - loss: 2.1604 - regression_loss: 1.7576 - classification_loss: 0.4029 463/500 [==========================>...] - ETA: 9s - loss: 2.1606 - regression_loss: 1.7578 - classification_loss: 0.4028 464/500 [==========================>...] - ETA: 9s - loss: 2.1590 - regression_loss: 1.7568 - classification_loss: 0.4022 465/500 [==========================>...] - ETA: 8s - loss: 2.1580 - regression_loss: 1.7562 - classification_loss: 0.4018 466/500 [==========================>...] - ETA: 8s - loss: 2.1575 - regression_loss: 1.7560 - classification_loss: 0.4015 467/500 [===========================>..] - ETA: 8s - loss: 2.1594 - regression_loss: 1.7576 - classification_loss: 0.4018 468/500 [===========================>..] - ETA: 8s - loss: 2.1589 - regression_loss: 1.7572 - classification_loss: 0.4017 469/500 [===========================>..] - ETA: 7s - loss: 2.1589 - regression_loss: 1.7573 - classification_loss: 0.4016 470/500 [===========================>..] - ETA: 7s - loss: 2.1595 - regression_loss: 1.7579 - classification_loss: 0.4016 471/500 [===========================>..] - ETA: 7s - loss: 2.1591 - regression_loss: 1.7577 - classification_loss: 0.4014 472/500 [===========================>..] - ETA: 7s - loss: 2.1587 - regression_loss: 1.7575 - classification_loss: 0.4012 473/500 [===========================>..] 
- ETA: 6s - loss: 2.1568 - regression_loss: 1.7560 - classification_loss: 0.4007 474/500 [===========================>..] - ETA: 6s - loss: 2.1571 - regression_loss: 1.7562 - classification_loss: 0.4009 475/500 [===========================>..] - ETA: 6s - loss: 2.1565 - regression_loss: 1.7558 - classification_loss: 0.4007 476/500 [===========================>..] - ETA: 6s - loss: 2.1566 - regression_loss: 1.7552 - classification_loss: 0.4014 477/500 [===========================>..] - ETA: 5s - loss: 2.1575 - regression_loss: 1.7559 - classification_loss: 0.4016 478/500 [===========================>..] - ETA: 5s - loss: 2.1584 - regression_loss: 1.7566 - classification_loss: 0.4018 479/500 [===========================>..] - ETA: 5s - loss: 2.1589 - regression_loss: 1.7572 - classification_loss: 0.4018 480/500 [===========================>..] - ETA: 5s - loss: 2.1603 - regression_loss: 1.7580 - classification_loss: 0.4023 481/500 [===========================>..] - ETA: 4s - loss: 2.1601 - regression_loss: 1.7579 - classification_loss: 0.4022 482/500 [===========================>..] - ETA: 4s - loss: 2.1602 - regression_loss: 1.7581 - classification_loss: 0.4022 483/500 [===========================>..] - ETA: 4s - loss: 2.1597 - regression_loss: 1.7573 - classification_loss: 0.4024 484/500 [============================>.] - ETA: 4s - loss: 2.1586 - regression_loss: 1.7564 - classification_loss: 0.4021 485/500 [============================>.] - ETA: 3s - loss: 2.1585 - regression_loss: 1.7564 - classification_loss: 0.4021 486/500 [============================>.] - ETA: 3s - loss: 2.1578 - regression_loss: 1.7559 - classification_loss: 0.4018 487/500 [============================>.] - ETA: 3s - loss: 2.1562 - regression_loss: 1.7547 - classification_loss: 0.4015 488/500 [============================>.] - ETA: 3s - loss: 2.1572 - regression_loss: 1.7555 - classification_loss: 0.4017 489/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 2.1549 - regression_loss: 1.7543 - classification_loss: 0.4006
1172 instances of class plum with average precision: 0.4953
mAP: 0.4953
Epoch 00029: saving model to ./training/snapshots/resnet50_pascal_29.h5
Epoch 30/150
- ETA: 1:35 - loss: 2.1588 - regression_loss: 1.7440 - classification_loss: 0.4149 117/500 [======>.......................] - ETA: 1:35 - loss: 2.1541 - regression_loss: 1.7398 - classification_loss: 0.4143 118/500 [======>.......................] - ETA: 1:35 - loss: 2.1501 - regression_loss: 1.7369 - classification_loss: 0.4132 119/500 [======>.......................] - ETA: 1:35 - loss: 2.1433 - regression_loss: 1.7322 - classification_loss: 0.4110 120/500 [======>.......................] - ETA: 1:34 - loss: 2.1481 - regression_loss: 1.7347 - classification_loss: 0.4134 121/500 [======>.......................] - ETA: 1:34 - loss: 2.1467 - regression_loss: 1.7338 - classification_loss: 0.4129 122/500 [======>.......................] - ETA: 1:34 - loss: 2.1498 - regression_loss: 1.7365 - classification_loss: 0.4133 123/500 [======>.......................] - ETA: 1:34 - loss: 2.1489 - regression_loss: 1.7355 - classification_loss: 0.4134 124/500 [======>.......................] - ETA: 1:33 - loss: 2.1511 - regression_loss: 1.7368 - classification_loss: 0.4143 125/500 [======>.......................] - ETA: 1:33 - loss: 2.1505 - regression_loss: 1.7365 - classification_loss: 0.4140 126/500 [======>.......................] - ETA: 1:33 - loss: 2.1517 - regression_loss: 1.7377 - classification_loss: 0.4140 127/500 [======>.......................] - ETA: 1:32 - loss: 2.1532 - regression_loss: 1.7390 - classification_loss: 0.4141 128/500 [======>.......................] - ETA: 1:32 - loss: 2.1531 - regression_loss: 1.7387 - classification_loss: 0.4143 129/500 [======>.......................] - ETA: 1:32 - loss: 2.1529 - regression_loss: 1.7386 - classification_loss: 0.4143 130/500 [======>.......................] - ETA: 1:32 - loss: 2.1572 - regression_loss: 1.7431 - classification_loss: 0.4141 131/500 [======>.......................] - ETA: 1:31 - loss: 2.1543 - regression_loss: 1.7409 - classification_loss: 0.4134 132/500 [======>.......................] 
- ETA: 1:31 - loss: 2.1538 - regression_loss: 1.7411 - classification_loss: 0.4127 133/500 [======>.......................] - ETA: 1:31 - loss: 2.1521 - regression_loss: 1.7404 - classification_loss: 0.4117 134/500 [=======>......................] - ETA: 1:31 - loss: 2.1516 - regression_loss: 1.7404 - classification_loss: 0.4112 135/500 [=======>......................] - ETA: 1:31 - loss: 2.1507 - regression_loss: 1.7400 - classification_loss: 0.4107 136/500 [=======>......................] - ETA: 1:30 - loss: 2.1500 - regression_loss: 1.7401 - classification_loss: 0.4099 137/500 [=======>......................] - ETA: 1:30 - loss: 2.1517 - regression_loss: 1.7421 - classification_loss: 0.4096 138/500 [=======>......................] - ETA: 1:30 - loss: 2.1502 - regression_loss: 1.7406 - classification_loss: 0.4096 139/500 [=======>......................] - ETA: 1:30 - loss: 2.1491 - regression_loss: 1.7397 - classification_loss: 0.4094 140/500 [=======>......................] - ETA: 1:29 - loss: 2.1461 - regression_loss: 1.7373 - classification_loss: 0.4088 141/500 [=======>......................] - ETA: 1:29 - loss: 2.1460 - regression_loss: 1.7380 - classification_loss: 0.4080 142/500 [=======>......................] - ETA: 1:29 - loss: 2.1458 - regression_loss: 1.7384 - classification_loss: 0.4075 143/500 [=======>......................] - ETA: 1:29 - loss: 2.1406 - regression_loss: 1.7343 - classification_loss: 0.4063 144/500 [=======>......................] - ETA: 1:28 - loss: 2.1361 - regression_loss: 1.7305 - classification_loss: 0.4055 145/500 [=======>......................] - ETA: 1:28 - loss: 2.1344 - regression_loss: 1.7293 - classification_loss: 0.4051 146/500 [=======>......................] - ETA: 1:28 - loss: 2.1337 - regression_loss: 1.7289 - classification_loss: 0.4047 147/500 [=======>......................] - ETA: 1:28 - loss: 2.1349 - regression_loss: 1.7294 - classification_loss: 0.4054 148/500 [=======>......................] 
- ETA: 1:27 - loss: 2.1308 - regression_loss: 1.7263 - classification_loss: 0.4045 149/500 [=======>......................] - ETA: 1:27 - loss: 2.1261 - regression_loss: 1.7227 - classification_loss: 0.4033 150/500 [========>.....................] - ETA: 1:27 - loss: 2.1268 - regression_loss: 1.7239 - classification_loss: 0.4029 151/500 [========>.....................] - ETA: 1:27 - loss: 2.1186 - regression_loss: 1.7172 - classification_loss: 0.4014 152/500 [========>.....................] - ETA: 1:26 - loss: 2.1230 - regression_loss: 1.7200 - classification_loss: 0.4030 153/500 [========>.....................] - ETA: 1:26 - loss: 2.1254 - regression_loss: 1.7219 - classification_loss: 0.4034 154/500 [========>.....................] - ETA: 1:26 - loss: 2.1220 - regression_loss: 1.7201 - classification_loss: 0.4020 155/500 [========>.....................] - ETA: 1:26 - loss: 2.1275 - regression_loss: 1.7240 - classification_loss: 0.4035 156/500 [========>.....................] - ETA: 1:25 - loss: 2.1257 - regression_loss: 1.7225 - classification_loss: 0.4031 157/500 [========>.....................] - ETA: 1:25 - loss: 2.1272 - regression_loss: 1.7239 - classification_loss: 0.4034 158/500 [========>.....................] - ETA: 1:25 - loss: 2.1270 - regression_loss: 1.7235 - classification_loss: 0.4035 159/500 [========>.....................] - ETA: 1:25 - loss: 2.1280 - regression_loss: 1.7243 - classification_loss: 0.4037 160/500 [========>.....................] - ETA: 1:25 - loss: 2.1276 - regression_loss: 1.7243 - classification_loss: 0.4033 161/500 [========>.....................] - ETA: 1:24 - loss: 2.1260 - regression_loss: 1.7233 - classification_loss: 0.4026 162/500 [========>.....................] - ETA: 1:24 - loss: 2.1250 - regression_loss: 1.7222 - classification_loss: 0.4028 163/500 [========>.....................] - ETA: 1:24 - loss: 2.1261 - regression_loss: 1.7233 - classification_loss: 0.4028 164/500 [========>.....................] 
- ETA: 1:24 - loss: 2.1284 - regression_loss: 1.7250 - classification_loss: 0.4034 165/500 [========>.....................] - ETA: 1:23 - loss: 2.1301 - regression_loss: 1.7248 - classification_loss: 0.4052 166/500 [========>.....................] - ETA: 1:23 - loss: 2.1319 - regression_loss: 1.7265 - classification_loss: 0.4053 167/500 [=========>....................] - ETA: 1:23 - loss: 2.1334 - regression_loss: 1.7275 - classification_loss: 0.4059 168/500 [=========>....................] - ETA: 1:23 - loss: 2.1360 - regression_loss: 1.7299 - classification_loss: 0.4061 169/500 [=========>....................] - ETA: 1:22 - loss: 2.1349 - regression_loss: 1.7281 - classification_loss: 0.4068 170/500 [=========>....................] - ETA: 1:22 - loss: 2.1350 - regression_loss: 1.7284 - classification_loss: 0.4066 171/500 [=========>....................] - ETA: 1:22 - loss: 2.1337 - regression_loss: 1.7267 - classification_loss: 0.4071 172/500 [=========>....................] - ETA: 1:22 - loss: 2.1310 - regression_loss: 1.7251 - classification_loss: 0.4059 173/500 [=========>....................] - ETA: 1:21 - loss: 2.1324 - regression_loss: 1.7267 - classification_loss: 0.4057 174/500 [=========>....................] - ETA: 1:21 - loss: 2.1330 - regression_loss: 1.7275 - classification_loss: 0.4055 175/500 [=========>....................] - ETA: 1:21 - loss: 2.1321 - regression_loss: 1.7271 - classification_loss: 0.4050 176/500 [=========>....................] - ETA: 1:21 - loss: 2.1298 - regression_loss: 1.7261 - classification_loss: 0.4038 177/500 [=========>....................] - ETA: 1:20 - loss: 2.1289 - regression_loss: 1.7254 - classification_loss: 0.4035 178/500 [=========>....................] - ETA: 1:20 - loss: 2.1267 - regression_loss: 1.7237 - classification_loss: 0.4030 179/500 [=========>....................] - ETA: 1:20 - loss: 2.1285 - regression_loss: 1.7252 - classification_loss: 0.4032 180/500 [=========>....................] 
- ETA: 1:20 - loss: 2.1259 - regression_loss: 1.7232 - classification_loss: 0.4028 181/500 [=========>....................] - ETA: 1:19 - loss: 2.1311 - regression_loss: 1.7273 - classification_loss: 0.4038 182/500 [=========>....................] - ETA: 1:19 - loss: 2.1278 - regression_loss: 1.7247 - classification_loss: 0.4031 183/500 [=========>....................] - ETA: 1:19 - loss: 2.1262 - regression_loss: 1.7238 - classification_loss: 0.4024 184/500 [==========>...................] - ETA: 1:19 - loss: 2.1223 - regression_loss: 1.7209 - classification_loss: 0.4014 185/500 [==========>...................] - ETA: 1:19 - loss: 2.1209 - regression_loss: 1.7198 - classification_loss: 0.4011 186/500 [==========>...................] - ETA: 1:18 - loss: 2.1274 - regression_loss: 1.7245 - classification_loss: 0.4030 187/500 [==========>...................] - ETA: 1:18 - loss: 2.1293 - regression_loss: 1.7258 - classification_loss: 0.4035 188/500 [==========>...................] - ETA: 1:18 - loss: 2.1311 - regression_loss: 1.7272 - classification_loss: 0.4040 189/500 [==========>...................] - ETA: 1:18 - loss: 2.1331 - regression_loss: 1.7290 - classification_loss: 0.4041 190/500 [==========>...................] - ETA: 1:17 - loss: 2.1331 - regression_loss: 1.7294 - classification_loss: 0.4037 191/500 [==========>...................] - ETA: 1:17 - loss: 2.1313 - regression_loss: 1.7279 - classification_loss: 0.4034 192/500 [==========>...................] - ETA: 1:17 - loss: 2.1313 - regression_loss: 1.7281 - classification_loss: 0.4033 193/500 [==========>...................] - ETA: 1:17 - loss: 2.1310 - regression_loss: 1.7285 - classification_loss: 0.4025 194/500 [==========>...................] - ETA: 1:16 - loss: 2.1318 - regression_loss: 1.7289 - classification_loss: 0.4029 195/500 [==========>...................] - ETA: 1:16 - loss: 2.1343 - regression_loss: 1.7307 - classification_loss: 0.4036 196/500 [==========>...................] 
- ETA: 1:16 - loss: 2.1380 - regression_loss: 1.7336 - classification_loss: 0.4044 197/500 [==========>...................] - ETA: 1:16 - loss: 2.1394 - regression_loss: 1.7351 - classification_loss: 0.4043 198/500 [==========>...................] - ETA: 1:16 - loss: 2.1392 - regression_loss: 1.7350 - classification_loss: 0.4042 199/500 [==========>...................] - ETA: 1:15 - loss: 2.1392 - regression_loss: 1.7349 - classification_loss: 0.4042 200/500 [===========>..................] - ETA: 1:15 - loss: 2.1411 - regression_loss: 1.7365 - classification_loss: 0.4046 201/500 [===========>..................] - ETA: 1:15 - loss: 2.1409 - regression_loss: 1.7367 - classification_loss: 0.4042 202/500 [===========>..................] - ETA: 1:15 - loss: 2.1394 - regression_loss: 1.7358 - classification_loss: 0.4036 203/500 [===========>..................] - ETA: 1:14 - loss: 2.1359 - regression_loss: 1.7332 - classification_loss: 0.4026 204/500 [===========>..................] - ETA: 1:14 - loss: 2.1365 - regression_loss: 1.7339 - classification_loss: 0.4026 205/500 [===========>..................] - ETA: 1:14 - loss: 2.1349 - regression_loss: 1.7320 - classification_loss: 0.4029 206/500 [===========>..................] - ETA: 1:14 - loss: 2.1402 - regression_loss: 1.7361 - classification_loss: 0.4041 207/500 [===========>..................] - ETA: 1:13 - loss: 2.1404 - regression_loss: 1.7363 - classification_loss: 0.4041 208/500 [===========>..................] - ETA: 1:13 - loss: 2.1441 - regression_loss: 1.7393 - classification_loss: 0.4047 209/500 [===========>..................] - ETA: 1:13 - loss: 2.1442 - regression_loss: 1.7395 - classification_loss: 0.4048 210/500 [===========>..................] - ETA: 1:13 - loss: 2.1465 - regression_loss: 1.7412 - classification_loss: 0.4053 211/500 [===========>..................] - ETA: 1:12 - loss: 2.1490 - regression_loss: 1.7423 - classification_loss: 0.4067 212/500 [===========>..................] 
- ETA: 1:12 - loss: 2.1480 - regression_loss: 1.7416 - classification_loss: 0.4064 213/500 [===========>..................] - ETA: 1:12 - loss: 2.1459 - regression_loss: 1.7400 - classification_loss: 0.4059 214/500 [===========>..................] - ETA: 1:12 - loss: 2.1459 - regression_loss: 1.7401 - classification_loss: 0.4058 215/500 [===========>..................] - ETA: 1:11 - loss: 2.1461 - regression_loss: 1.7404 - classification_loss: 0.4057 216/500 [===========>..................] - ETA: 1:11 - loss: 2.1466 - regression_loss: 1.7405 - classification_loss: 0.4061 217/500 [============>.................] - ETA: 1:11 - loss: 2.1443 - regression_loss: 1.7389 - classification_loss: 0.4054 218/500 [============>.................] - ETA: 1:11 - loss: 2.1452 - regression_loss: 1.7397 - classification_loss: 0.4055 219/500 [============>.................] - ETA: 1:10 - loss: 2.1487 - regression_loss: 1.7423 - classification_loss: 0.4063 220/500 [============>.................] - ETA: 1:10 - loss: 2.1497 - regression_loss: 1.7432 - classification_loss: 0.4064 221/500 [============>.................] - ETA: 1:10 - loss: 2.1521 - regression_loss: 1.7447 - classification_loss: 0.4074 222/500 [============>.................] - ETA: 1:10 - loss: 2.1546 - regression_loss: 1.7466 - classification_loss: 0.4079 223/500 [============>.................] - ETA: 1:09 - loss: 2.1512 - regression_loss: 1.7439 - classification_loss: 0.4072 224/500 [============>.................] - ETA: 1:09 - loss: 2.1516 - regression_loss: 1.7441 - classification_loss: 0.4074 225/500 [============>.................] - ETA: 1:09 - loss: 2.1531 - regression_loss: 1.7453 - classification_loss: 0.4078 226/500 [============>.................] - ETA: 1:09 - loss: 2.1499 - regression_loss: 1.7429 - classification_loss: 0.4069 227/500 [============>.................] - ETA: 1:08 - loss: 2.1463 - regression_loss: 1.7403 - classification_loss: 0.4060 228/500 [============>.................] 
- ETA: 1:08 - loss: 2.1464 - regression_loss: 1.7408 - classification_loss: 0.4056 229/500 [============>.................] - ETA: 1:08 - loss: 2.1465 - regression_loss: 1.7410 - classification_loss: 0.4055 230/500 [============>.................] - ETA: 1:07 - loss: 2.1512 - regression_loss: 1.7448 - classification_loss: 0.4065 231/500 [============>.................] - ETA: 1:07 - loss: 2.1581 - regression_loss: 1.7502 - classification_loss: 0.4078 232/500 [============>.................] - ETA: 1:07 - loss: 2.1571 - regression_loss: 1.7496 - classification_loss: 0.4076 233/500 [============>.................] - ETA: 1:07 - loss: 2.1564 - regression_loss: 1.7488 - classification_loss: 0.4076 234/500 [=============>................] - ETA: 1:06 - loss: 2.1580 - regression_loss: 1.7504 - classification_loss: 0.4077 235/500 [=============>................] - ETA: 1:06 - loss: 2.1566 - regression_loss: 1.7491 - classification_loss: 0.4075 236/500 [=============>................] - ETA: 1:06 - loss: 2.1627 - regression_loss: 1.7547 - classification_loss: 0.4080 237/500 [=============>................] - ETA: 1:06 - loss: 2.1654 - regression_loss: 1.7565 - classification_loss: 0.4089 238/500 [=============>................] - ETA: 1:05 - loss: 2.1671 - regression_loss: 1.7574 - classification_loss: 0.4097 239/500 [=============>................] - ETA: 1:05 - loss: 2.1676 - regression_loss: 1.7580 - classification_loss: 0.4096 240/500 [=============>................] - ETA: 1:05 - loss: 2.1683 - regression_loss: 1.7587 - classification_loss: 0.4096 241/500 [=============>................] - ETA: 1:05 - loss: 2.1683 - regression_loss: 1.7587 - classification_loss: 0.4096 242/500 [=============>................] - ETA: 1:04 - loss: 2.1682 - regression_loss: 1.7588 - classification_loss: 0.4093 243/500 [=============>................] - ETA: 1:04 - loss: 2.1679 - regression_loss: 1.7589 - classification_loss: 0.4089 244/500 [=============>................] 
- ETA: 1:04 - loss: 2.1680 - regression_loss: 1.7573 - classification_loss: 0.4107 245/500 [=============>................] - ETA: 1:04 - loss: 2.1668 - regression_loss: 1.7563 - classification_loss: 0.4106 246/500 [=============>................] - ETA: 1:03 - loss: 2.1678 - regression_loss: 1.7573 - classification_loss: 0.4105 247/500 [=============>................] - ETA: 1:03 - loss: 2.1668 - regression_loss: 1.7557 - classification_loss: 0.4111 248/500 [=============>................] - ETA: 1:03 - loss: 2.1665 - regression_loss: 1.7556 - classification_loss: 0.4109 249/500 [=============>................] - ETA: 1:03 - loss: 2.1677 - regression_loss: 1.7565 - classification_loss: 0.4112 250/500 [==============>...............] - ETA: 1:02 - loss: 2.1641 - regression_loss: 1.7538 - classification_loss: 0.4103 251/500 [==============>...............] - ETA: 1:02 - loss: 2.1603 - regression_loss: 1.7510 - classification_loss: 0.4093 252/500 [==============>...............] - ETA: 1:02 - loss: 2.1617 - regression_loss: 1.7523 - classification_loss: 0.4095 253/500 [==============>...............] - ETA: 1:02 - loss: 2.1613 - regression_loss: 1.7517 - classification_loss: 0.4096 254/500 [==============>...............] - ETA: 1:01 - loss: 2.1616 - regression_loss: 1.7522 - classification_loss: 0.4093 255/500 [==============>...............] - ETA: 1:01 - loss: 2.1608 - regression_loss: 1.7516 - classification_loss: 0.4092 256/500 [==============>...............] - ETA: 1:01 - loss: 2.1604 - regression_loss: 1.7514 - classification_loss: 0.4089 257/500 [==============>...............] - ETA: 1:01 - loss: 2.1606 - regression_loss: 1.7517 - classification_loss: 0.4090 258/500 [==============>...............] - ETA: 1:00 - loss: 2.1550 - regression_loss: 1.7473 - classification_loss: 0.4077 259/500 [==============>...............] - ETA: 1:00 - loss: 2.1535 - regression_loss: 1.7463 - classification_loss: 0.4071 260/500 [==============>...............] 
- ETA: 1:00 - loss: 2.1526 - regression_loss: 1.7458 - classification_loss: 0.4068 261/500 [==============>...............] - ETA: 1:00 - loss: 2.1511 - regression_loss: 1.7449 - classification_loss: 0.4062 262/500 [==============>...............] - ETA: 59s - loss: 2.1517 - regression_loss: 1.7456 - classification_loss: 0.4062  263/500 [==============>...............] - ETA: 59s - loss: 2.1507 - regression_loss: 1.7449 - classification_loss: 0.4058 264/500 [==============>...............] - ETA: 59s - loss: 2.1508 - regression_loss: 1.7453 - classification_loss: 0.4055 265/500 [==============>...............] - ETA: 59s - loss: 2.1493 - regression_loss: 1.7442 - classification_loss: 0.4051 266/500 [==============>...............] - ETA: 58s - loss: 2.1480 - regression_loss: 1.7433 - classification_loss: 0.4046 267/500 [===============>..............] - ETA: 58s - loss: 2.1469 - regression_loss: 1.7426 - classification_loss: 0.4042 268/500 [===============>..............] - ETA: 58s - loss: 2.1478 - regression_loss: 1.7435 - classification_loss: 0.4043 269/500 [===============>..............] - ETA: 58s - loss: 2.1488 - regression_loss: 1.7442 - classification_loss: 0.4046 270/500 [===============>..............] - ETA: 57s - loss: 2.1492 - regression_loss: 1.7447 - classification_loss: 0.4044 271/500 [===============>..............] - ETA: 57s - loss: 2.1488 - regression_loss: 1.7442 - classification_loss: 0.4046 272/500 [===============>..............] - ETA: 57s - loss: 2.1460 - regression_loss: 1.7420 - classification_loss: 0.4040 273/500 [===============>..............] - ETA: 57s - loss: 2.1421 - regression_loss: 1.7390 - classification_loss: 0.4032 274/500 [===============>..............] - ETA: 56s - loss: 2.1434 - regression_loss: 1.7403 - classification_loss: 0.4031 275/500 [===============>..............] - ETA: 56s - loss: 2.1447 - regression_loss: 1.7417 - classification_loss: 0.4031 276/500 [===============>..............] 
- ETA: 56s - loss: 2.1469 - regression_loss: 1.7427 - classification_loss: 0.4042 277/500 [===============>..............] - ETA: 56s - loss: 2.1486 - regression_loss: 1.7440 - classification_loss: 0.4046 278/500 [===============>..............] - ETA: 55s - loss: 2.1514 - regression_loss: 1.7462 - classification_loss: 0.4052 279/500 [===============>..............] - ETA: 55s - loss: 2.1510 - regression_loss: 1.7463 - classification_loss: 0.4047 280/500 [===============>..............] - ETA: 55s - loss: 2.1493 - regression_loss: 1.7452 - classification_loss: 0.4041 281/500 [===============>..............] - ETA: 55s - loss: 2.1476 - regression_loss: 1.7437 - classification_loss: 0.4039 282/500 [===============>..............] - ETA: 54s - loss: 2.1502 - regression_loss: 1.7460 - classification_loss: 0.4042 283/500 [===============>..............] - ETA: 54s - loss: 2.1492 - regression_loss: 1.7450 - classification_loss: 0.4042 284/500 [================>.............] - ETA: 54s - loss: 2.1501 - regression_loss: 1.7458 - classification_loss: 0.4043 285/500 [================>.............] - ETA: 54s - loss: 2.1511 - regression_loss: 1.7468 - classification_loss: 0.4043 286/500 [================>.............] - ETA: 53s - loss: 2.1502 - regression_loss: 1.7462 - classification_loss: 0.4039 287/500 [================>.............] - ETA: 53s - loss: 2.1486 - regression_loss: 1.7445 - classification_loss: 0.4041 288/500 [================>.............] - ETA: 53s - loss: 2.1527 - regression_loss: 1.7486 - classification_loss: 0.4041 289/500 [================>.............] - ETA: 52s - loss: 2.1538 - regression_loss: 1.7492 - classification_loss: 0.4046 290/500 [================>.............] - ETA: 52s - loss: 2.1543 - regression_loss: 1.7493 - classification_loss: 0.4050 291/500 [================>.............] - ETA: 52s - loss: 2.1537 - regression_loss: 1.7491 - classification_loss: 0.4046 292/500 [================>.............] 
- ETA: 52s - loss: 2.1527 - regression_loss: 1.7485 - classification_loss: 0.4042 293/500 [================>.............] - ETA: 51s - loss: 2.1525 - regression_loss: 1.7486 - classification_loss: 0.4039 294/500 [================>.............] - ETA: 51s - loss: 2.1533 - regression_loss: 1.7491 - classification_loss: 0.4042 295/500 [================>.............] - ETA: 51s - loss: 2.1536 - regression_loss: 1.7494 - classification_loss: 0.4042 296/500 [================>.............] - ETA: 51s - loss: 2.1556 - regression_loss: 1.7509 - classification_loss: 0.4047 297/500 [================>.............] - ETA: 50s - loss: 2.1559 - regression_loss: 1.7510 - classification_loss: 0.4049 298/500 [================>.............] - ETA: 50s - loss: 2.1561 - regression_loss: 1.7514 - classification_loss: 0.4048 299/500 [================>.............] - ETA: 50s - loss: 2.1564 - regression_loss: 1.7513 - classification_loss: 0.4051 300/500 [=================>............] - ETA: 50s - loss: 2.1586 - regression_loss: 1.7534 - classification_loss: 0.4052 301/500 [=================>............] - ETA: 49s - loss: 2.1549 - regression_loss: 1.7505 - classification_loss: 0.4044 302/500 [=================>............] - ETA: 49s - loss: 2.1544 - regression_loss: 1.7503 - classification_loss: 0.4041 303/500 [=================>............] - ETA: 49s - loss: 2.1529 - regression_loss: 1.7489 - classification_loss: 0.4040 304/500 [=================>............] - ETA: 49s - loss: 2.1527 - regression_loss: 1.7486 - classification_loss: 0.4041 305/500 [=================>............] - ETA: 48s - loss: 2.1521 - regression_loss: 1.7484 - classification_loss: 0.4037 306/500 [=================>............] - ETA: 48s - loss: 2.1525 - regression_loss: 1.7488 - classification_loss: 0.4037 307/500 [=================>............] - ETA: 48s - loss: 2.1540 - regression_loss: 1.7500 - classification_loss: 0.4040 308/500 [=================>............] 
- ETA: 48s - loss: 2.1547 - regression_loss: 1.7506 - classification_loss: 0.4042 309/500 [=================>............] - ETA: 47s - loss: 2.1531 - regression_loss: 1.7495 - classification_loss: 0.4037 310/500 [=================>............] - ETA: 47s - loss: 2.1535 - regression_loss: 1.7499 - classification_loss: 0.4036 311/500 [=================>............] - ETA: 47s - loss: 2.1551 - regression_loss: 1.7513 - classification_loss: 0.4038 312/500 [=================>............] - ETA: 47s - loss: 2.1541 - regression_loss: 1.7506 - classification_loss: 0.4035 313/500 [=================>............] - ETA: 46s - loss: 2.1540 - regression_loss: 1.7507 - classification_loss: 0.4033 314/500 [=================>............] - ETA: 46s - loss: 2.1539 - regression_loss: 1.7508 - classification_loss: 0.4031 315/500 [=================>............] - ETA: 46s - loss: 2.1520 - regression_loss: 1.7493 - classification_loss: 0.4027 316/500 [=================>............] - ETA: 46s - loss: 2.1510 - regression_loss: 1.7485 - classification_loss: 0.4025 317/500 [==================>...........] - ETA: 45s - loss: 2.1500 - regression_loss: 1.7477 - classification_loss: 0.4023 318/500 [==================>...........] - ETA: 45s - loss: 2.1502 - regression_loss: 1.7478 - classification_loss: 0.4024 319/500 [==================>...........] - ETA: 45s - loss: 2.1482 - regression_loss: 1.7462 - classification_loss: 0.4020 320/500 [==================>...........] - ETA: 45s - loss: 2.1486 - regression_loss: 1.7464 - classification_loss: 0.4022 321/500 [==================>...........] - ETA: 44s - loss: 2.1470 - regression_loss: 1.7453 - classification_loss: 0.4017 322/500 [==================>...........] - ETA: 44s - loss: 2.1487 - regression_loss: 1.7465 - classification_loss: 0.4022 323/500 [==================>...........] - ETA: 44s - loss: 2.1487 - regression_loss: 1.7467 - classification_loss: 0.4020 324/500 [==================>...........] 
[Epoch 30, steps 325–499: per-step progress-bar updates trimmed. The running loss held near 2.14–2.15 (regression_loss ~1.74, classification_loss ~0.40) while the ETA counted down from 44s to 0s.]
500/500 [==============================] - 126s 251ms/step - loss: 2.1455 - regression_loss: 1.7462 - classification_loss: 0.3993
1172 instances of class plum with average precision: 0.4730
mAP: 0.4730
Epoch 00030: saving model to ./training/snapshots/resnet50_pascal_30.h5
Epoch 31/150
[Epoch 31, steps 1–158: per-step progress-bar updates trimmed. The running loss started at 1.3363 on the first batch, spiked above 2.39 in the first few steps, and settled near 2.17 (regression_loss ~1.76, classification_loss ~0.41) by step 158, where this section ends mid-epoch.]
- ETA: 1:25 - loss: 2.1782 - regression_loss: 1.7680 - classification_loss: 0.4102 159/500 [========>.....................] - ETA: 1:25 - loss: 2.1782 - regression_loss: 1.7684 - classification_loss: 0.4098 160/500 [========>.....................] - ETA: 1:25 - loss: 2.1781 - regression_loss: 1.7686 - classification_loss: 0.4095 161/500 [========>.....................] - ETA: 1:25 - loss: 2.1773 - regression_loss: 1.7684 - classification_loss: 0.4089 162/500 [========>.....................] - ETA: 1:24 - loss: 2.1754 - regression_loss: 1.7668 - classification_loss: 0.4086 163/500 [========>.....................] - ETA: 1:24 - loss: 2.1763 - regression_loss: 1.7677 - classification_loss: 0.4085 164/500 [========>.....................] - ETA: 1:24 - loss: 2.1743 - regression_loss: 1.7665 - classification_loss: 0.4078 165/500 [========>.....................] - ETA: 1:24 - loss: 2.1786 - regression_loss: 1.7699 - classification_loss: 0.4087 166/500 [========>.....................] - ETA: 1:23 - loss: 2.1775 - regression_loss: 1.7688 - classification_loss: 0.4086 167/500 [=========>....................] - ETA: 1:23 - loss: 2.1796 - regression_loss: 1.7701 - classification_loss: 0.4095 168/500 [=========>....................] - ETA: 1:23 - loss: 2.1805 - regression_loss: 1.7709 - classification_loss: 0.4096 169/500 [=========>....................] - ETA: 1:23 - loss: 2.1786 - regression_loss: 1.7690 - classification_loss: 0.4095 170/500 [=========>....................] - ETA: 1:22 - loss: 2.1762 - regression_loss: 1.7676 - classification_loss: 0.4086 171/500 [=========>....................] - ETA: 1:22 - loss: 2.1774 - regression_loss: 1.7689 - classification_loss: 0.4085 172/500 [=========>....................] - ETA: 1:22 - loss: 2.1770 - regression_loss: 1.7692 - classification_loss: 0.4078 173/500 [=========>....................] - ETA: 1:22 - loss: 2.1791 - regression_loss: 1.7709 - classification_loss: 0.4081 174/500 [=========>....................] 
- ETA: 1:21 - loss: 2.1800 - regression_loss: 1.7719 - classification_loss: 0.4081 175/500 [=========>....................] - ETA: 1:21 - loss: 2.1793 - regression_loss: 1.7715 - classification_loss: 0.4078 176/500 [=========>....................] - ETA: 1:21 - loss: 2.1763 - regression_loss: 1.7698 - classification_loss: 0.4065 177/500 [=========>....................] - ETA: 1:21 - loss: 2.1766 - regression_loss: 1.7702 - classification_loss: 0.4064 178/500 [=========>....................] - ETA: 1:20 - loss: 2.1748 - regression_loss: 1.7683 - classification_loss: 0.4064 179/500 [=========>....................] - ETA: 1:20 - loss: 2.1760 - regression_loss: 1.7688 - classification_loss: 0.4072 180/500 [=========>....................] - ETA: 1:20 - loss: 2.1756 - regression_loss: 1.7687 - classification_loss: 0.4069 181/500 [=========>....................] - ETA: 1:20 - loss: 2.1764 - regression_loss: 1.7697 - classification_loss: 0.4067 182/500 [=========>....................] - ETA: 1:19 - loss: 2.1757 - regression_loss: 1.7692 - classification_loss: 0.4065 183/500 [=========>....................] - ETA: 1:19 - loss: 2.1753 - regression_loss: 1.7690 - classification_loss: 0.4064 184/500 [==========>...................] - ETA: 1:19 - loss: 2.1734 - regression_loss: 1.7677 - classification_loss: 0.4057 185/500 [==========>...................] - ETA: 1:19 - loss: 2.1724 - regression_loss: 1.7671 - classification_loss: 0.4053 186/500 [==========>...................] - ETA: 1:18 - loss: 2.1761 - regression_loss: 1.7697 - classification_loss: 0.4063 187/500 [==========>...................] - ETA: 1:18 - loss: 2.1723 - regression_loss: 1.7656 - classification_loss: 0.4067 188/500 [==========>...................] - ETA: 1:18 - loss: 2.1678 - regression_loss: 1.7612 - classification_loss: 0.4066 189/500 [==========>...................] - ETA: 1:18 - loss: 2.1654 - regression_loss: 1.7595 - classification_loss: 0.4059 190/500 [==========>...................] 
- ETA: 1:17 - loss: 2.1667 - regression_loss: 1.7599 - classification_loss: 0.4068 191/500 [==========>...................] - ETA: 1:17 - loss: 2.1657 - regression_loss: 1.7592 - classification_loss: 0.4064 192/500 [==========>...................] - ETA: 1:17 - loss: 2.1645 - regression_loss: 1.7584 - classification_loss: 0.4061 193/500 [==========>...................] - ETA: 1:17 - loss: 2.1653 - regression_loss: 1.7592 - classification_loss: 0.4061 194/500 [==========>...................] - ETA: 1:16 - loss: 2.1681 - regression_loss: 1.7614 - classification_loss: 0.4067 195/500 [==========>...................] - ETA: 1:16 - loss: 2.1678 - regression_loss: 1.7614 - classification_loss: 0.4064 196/500 [==========>...................] - ETA: 1:16 - loss: 2.1656 - regression_loss: 1.7599 - classification_loss: 0.4056 197/500 [==========>...................] - ETA: 1:16 - loss: 2.1652 - regression_loss: 1.7596 - classification_loss: 0.4057 198/500 [==========>...................] - ETA: 1:15 - loss: 2.1614 - regression_loss: 1.7565 - classification_loss: 0.4049 199/500 [==========>...................] - ETA: 1:15 - loss: 2.1603 - regression_loss: 1.7560 - classification_loss: 0.4043 200/500 [===========>..................] - ETA: 1:15 - loss: 2.1620 - regression_loss: 1.7572 - classification_loss: 0.4047 201/500 [===========>..................] - ETA: 1:15 - loss: 2.1635 - regression_loss: 1.7584 - classification_loss: 0.4050 202/500 [===========>..................] - ETA: 1:14 - loss: 2.1651 - regression_loss: 1.7599 - classification_loss: 0.4052 203/500 [===========>..................] - ETA: 1:14 - loss: 2.1667 - regression_loss: 1.7615 - classification_loss: 0.4052 204/500 [===========>..................] - ETA: 1:14 - loss: 2.1617 - regression_loss: 1.7577 - classification_loss: 0.4040 205/500 [===========>..................] - ETA: 1:14 - loss: 2.1556 - regression_loss: 1.7531 - classification_loss: 0.4024 206/500 [===========>..................] 
- ETA: 1:13 - loss: 2.1528 - regression_loss: 1.7510 - classification_loss: 0.4018 207/500 [===========>..................] - ETA: 1:13 - loss: 2.1559 - regression_loss: 1.7543 - classification_loss: 0.4016 208/500 [===========>..................] - ETA: 1:13 - loss: 2.1542 - regression_loss: 1.7533 - classification_loss: 0.4009 209/500 [===========>..................] - ETA: 1:13 - loss: 2.1515 - regression_loss: 1.7513 - classification_loss: 0.4003 210/500 [===========>..................] - ETA: 1:12 - loss: 2.1486 - regression_loss: 1.7488 - classification_loss: 0.3997 211/500 [===========>..................] - ETA: 1:12 - loss: 2.1475 - regression_loss: 1.7482 - classification_loss: 0.3994 212/500 [===========>..................] - ETA: 1:12 - loss: 2.1470 - regression_loss: 1.7477 - classification_loss: 0.3992 213/500 [===========>..................] - ETA: 1:12 - loss: 2.1478 - regression_loss: 1.7492 - classification_loss: 0.3986 214/500 [===========>..................] - ETA: 1:11 - loss: 2.1505 - regression_loss: 1.7502 - classification_loss: 0.4004 215/500 [===========>..................] - ETA: 1:11 - loss: 2.1489 - regression_loss: 1.7492 - classification_loss: 0.3997 216/500 [===========>..................] - ETA: 1:11 - loss: 2.1484 - regression_loss: 1.7490 - classification_loss: 0.3994 217/500 [============>.................] - ETA: 1:11 - loss: 2.1495 - regression_loss: 1.7501 - classification_loss: 0.3994 218/500 [============>.................] - ETA: 1:10 - loss: 2.1497 - regression_loss: 1.7507 - classification_loss: 0.3990 219/500 [============>.................] - ETA: 1:10 - loss: 2.1503 - regression_loss: 1.7515 - classification_loss: 0.3988 220/500 [============>.................] - ETA: 1:10 - loss: 2.1489 - regression_loss: 1.7508 - classification_loss: 0.3980 221/500 [============>.................] - ETA: 1:10 - loss: 2.1503 - regression_loss: 1.7516 - classification_loss: 0.3987 222/500 [============>.................] 
- ETA: 1:09 - loss: 2.1495 - regression_loss: 1.7509 - classification_loss: 0.3986 223/500 [============>.................] - ETA: 1:09 - loss: 2.1495 - regression_loss: 1.7511 - classification_loss: 0.3985 224/500 [============>.................] - ETA: 1:09 - loss: 2.1500 - regression_loss: 1.7515 - classification_loss: 0.3984 225/500 [============>.................] - ETA: 1:09 - loss: 2.1498 - regression_loss: 1.7514 - classification_loss: 0.3984 226/500 [============>.................] - ETA: 1:08 - loss: 2.1526 - regression_loss: 1.7536 - classification_loss: 0.3990 227/500 [============>.................] - ETA: 1:08 - loss: 2.1519 - regression_loss: 1.7533 - classification_loss: 0.3986 228/500 [============>.................] - ETA: 1:08 - loss: 2.1516 - regression_loss: 1.7533 - classification_loss: 0.3983 229/500 [============>.................] - ETA: 1:08 - loss: 2.1524 - regression_loss: 1.7543 - classification_loss: 0.3982 230/500 [============>.................] - ETA: 1:07 - loss: 2.1563 - regression_loss: 1.7577 - classification_loss: 0.3986 231/500 [============>.................] - ETA: 1:07 - loss: 2.1554 - regression_loss: 1.7572 - classification_loss: 0.3983 232/500 [============>.................] - ETA: 1:07 - loss: 2.1577 - regression_loss: 1.7590 - classification_loss: 0.3987 233/500 [============>.................] - ETA: 1:07 - loss: 2.1576 - regression_loss: 1.7592 - classification_loss: 0.3984 234/500 [=============>................] - ETA: 1:06 - loss: 2.1561 - regression_loss: 1.7581 - classification_loss: 0.3981 235/500 [=============>................] - ETA: 1:06 - loss: 2.1581 - regression_loss: 1.7597 - classification_loss: 0.3984 236/500 [=============>................] - ETA: 1:06 - loss: 2.1578 - regression_loss: 1.7596 - classification_loss: 0.3982 237/500 [=============>................] - ETA: 1:06 - loss: 2.1589 - regression_loss: 1.7608 - classification_loss: 0.3981 238/500 [=============>................] 
- ETA: 1:05 - loss: 2.1603 - regression_loss: 1.7619 - classification_loss: 0.3984 239/500 [=============>................] - ETA: 1:05 - loss: 2.1555 - regression_loss: 1.7579 - classification_loss: 0.3976 240/500 [=============>................] - ETA: 1:05 - loss: 2.1567 - regression_loss: 1.7576 - classification_loss: 0.3991 241/500 [=============>................] - ETA: 1:05 - loss: 2.1662 - regression_loss: 1.7605 - classification_loss: 0.4057 242/500 [=============>................] - ETA: 1:04 - loss: 2.1636 - regression_loss: 1.7581 - classification_loss: 0.4055 243/500 [=============>................] - ETA: 1:04 - loss: 2.1665 - regression_loss: 1.7604 - classification_loss: 0.4061 244/500 [=============>................] - ETA: 1:04 - loss: 2.1686 - regression_loss: 1.7628 - classification_loss: 0.4058 245/500 [=============>................] - ETA: 1:04 - loss: 2.1704 - regression_loss: 1.7643 - classification_loss: 0.4062 246/500 [=============>................] - ETA: 1:03 - loss: 2.1705 - regression_loss: 1.7647 - classification_loss: 0.4059 247/500 [=============>................] - ETA: 1:03 - loss: 2.1715 - regression_loss: 1.7655 - classification_loss: 0.4060 248/500 [=============>................] - ETA: 1:03 - loss: 2.1701 - regression_loss: 1.7645 - classification_loss: 0.4056 249/500 [=============>................] - ETA: 1:03 - loss: 2.1688 - regression_loss: 1.7635 - classification_loss: 0.4053 250/500 [==============>...............] - ETA: 1:02 - loss: 2.1727 - regression_loss: 1.7666 - classification_loss: 0.4061 251/500 [==============>...............] - ETA: 1:02 - loss: 2.1735 - regression_loss: 1.7669 - classification_loss: 0.4066 252/500 [==============>...............] - ETA: 1:02 - loss: 2.1759 - regression_loss: 1.7685 - classification_loss: 0.4075 253/500 [==============>...............] - ETA: 1:02 - loss: 2.1719 - regression_loss: 1.7655 - classification_loss: 0.4064 254/500 [==============>...............] 
- ETA: 1:01 - loss: 2.1714 - regression_loss: 1.7650 - classification_loss: 0.4064 255/500 [==============>...............] - ETA: 1:01 - loss: 2.1716 - regression_loss: 1.7650 - classification_loss: 0.4066 256/500 [==============>...............] - ETA: 1:01 - loss: 2.1729 - regression_loss: 1.7658 - classification_loss: 0.4071 257/500 [==============>...............] - ETA: 1:01 - loss: 2.1728 - regression_loss: 1.7659 - classification_loss: 0.4069 258/500 [==============>...............] - ETA: 1:00 - loss: 2.1733 - regression_loss: 1.7666 - classification_loss: 0.4067 259/500 [==============>...............] - ETA: 1:00 - loss: 2.1715 - regression_loss: 1.7651 - classification_loss: 0.4064 260/500 [==============>...............] - ETA: 1:00 - loss: 2.1704 - regression_loss: 1.7643 - classification_loss: 0.4061 261/500 [==============>...............] - ETA: 1:00 - loss: 2.1706 - regression_loss: 1.7646 - classification_loss: 0.4060 262/500 [==============>...............] - ETA: 59s - loss: 2.1681 - regression_loss: 1.7618 - classification_loss: 0.4063  263/500 [==============>...............] - ETA: 59s - loss: 2.1677 - regression_loss: 1.7616 - classification_loss: 0.4060 264/500 [==============>...............] - ETA: 59s - loss: 2.1676 - regression_loss: 1.7616 - classification_loss: 0.4060 265/500 [==============>...............] - ETA: 59s - loss: 2.1687 - regression_loss: 1.7620 - classification_loss: 0.4067 266/500 [==============>...............] - ETA: 58s - loss: 2.1687 - regression_loss: 1.7620 - classification_loss: 0.4067 267/500 [===============>..............] - ETA: 58s - loss: 2.1684 - regression_loss: 1.7619 - classification_loss: 0.4066 268/500 [===============>..............] - ETA: 58s - loss: 2.1687 - regression_loss: 1.7621 - classification_loss: 0.4066 269/500 [===============>..............] - ETA: 58s - loss: 2.1709 - regression_loss: 1.7641 - classification_loss: 0.4067 270/500 [===============>..............] 
- ETA: 57s - loss: 2.1691 - regression_loss: 1.7622 - classification_loss: 0.4069 271/500 [===============>..............] - ETA: 57s - loss: 2.1684 - regression_loss: 1.7613 - classification_loss: 0.4071 272/500 [===============>..............] - ETA: 57s - loss: 2.1680 - regression_loss: 1.7610 - classification_loss: 0.4070 273/500 [===============>..............] - ETA: 57s - loss: 2.1672 - regression_loss: 1.7603 - classification_loss: 0.4068 274/500 [===============>..............] - ETA: 56s - loss: 2.1671 - regression_loss: 1.7603 - classification_loss: 0.4068 275/500 [===============>..............] - ETA: 56s - loss: 2.1677 - regression_loss: 1.7607 - classification_loss: 0.4070 276/500 [===============>..............] - ETA: 56s - loss: 2.1685 - regression_loss: 1.7613 - classification_loss: 0.4072 277/500 [===============>..............] - ETA: 56s - loss: 2.1686 - regression_loss: 1.7616 - classification_loss: 0.4070 278/500 [===============>..............] - ETA: 55s - loss: 2.1704 - regression_loss: 1.7633 - classification_loss: 0.4071 279/500 [===============>..............] - ETA: 55s - loss: 2.1676 - regression_loss: 1.7605 - classification_loss: 0.4071 280/500 [===============>..............] - ETA: 55s - loss: 2.1669 - regression_loss: 1.7601 - classification_loss: 0.4068 281/500 [===============>..............] - ETA: 55s - loss: 2.1647 - regression_loss: 1.7586 - classification_loss: 0.4061 282/500 [===============>..............] - ETA: 54s - loss: 2.1654 - regression_loss: 1.7593 - classification_loss: 0.4061 283/500 [===============>..............] - ETA: 54s - loss: 2.1647 - regression_loss: 1.7585 - classification_loss: 0.4062 284/500 [================>.............] - ETA: 54s - loss: 2.1634 - regression_loss: 1.7569 - classification_loss: 0.4065 285/500 [================>.............] - ETA: 54s - loss: 2.1640 - regression_loss: 1.7575 - classification_loss: 0.4065 286/500 [================>.............] 
- ETA: 53s - loss: 2.1625 - regression_loss: 1.7564 - classification_loss: 0.4061 287/500 [================>.............] - ETA: 53s - loss: 2.1631 - regression_loss: 1.7572 - classification_loss: 0.4060 288/500 [================>.............] - ETA: 53s - loss: 2.1626 - regression_loss: 1.7567 - classification_loss: 0.4059 289/500 [================>.............] - ETA: 53s - loss: 2.1620 - regression_loss: 1.7562 - classification_loss: 0.4058 290/500 [================>.............] - ETA: 52s - loss: 2.1625 - regression_loss: 1.7565 - classification_loss: 0.4060 291/500 [================>.............] - ETA: 52s - loss: 2.1593 - regression_loss: 1.7540 - classification_loss: 0.4053 292/500 [================>.............] - ETA: 52s - loss: 2.1594 - regression_loss: 1.7540 - classification_loss: 0.4054 293/500 [================>.............] - ETA: 52s - loss: 2.1582 - regression_loss: 1.7531 - classification_loss: 0.4050 294/500 [================>.............] - ETA: 51s - loss: 2.1572 - regression_loss: 1.7524 - classification_loss: 0.4048 295/500 [================>.............] - ETA: 51s - loss: 2.1545 - regression_loss: 1.7502 - classification_loss: 0.4043 296/500 [================>.............] - ETA: 51s - loss: 2.1528 - regression_loss: 1.7489 - classification_loss: 0.4039 297/500 [================>.............] - ETA: 51s - loss: 2.1516 - regression_loss: 1.7481 - classification_loss: 0.4035 298/500 [================>.............] - ETA: 50s - loss: 2.1528 - regression_loss: 1.7490 - classification_loss: 0.4038 299/500 [================>.............] - ETA: 50s - loss: 2.1522 - regression_loss: 1.7487 - classification_loss: 0.4035 300/500 [=================>............] - ETA: 50s - loss: 2.1512 - regression_loss: 1.7479 - classification_loss: 0.4033 301/500 [=================>............] - ETA: 50s - loss: 2.1511 - regression_loss: 1.7479 - classification_loss: 0.4032 302/500 [=================>............] 
- ETA: 49s - loss: 2.1542 - regression_loss: 1.7506 - classification_loss: 0.4036 303/500 [=================>............] - ETA: 49s - loss: 2.1540 - regression_loss: 1.7505 - classification_loss: 0.4035 304/500 [=================>............] - ETA: 49s - loss: 2.1543 - regression_loss: 1.7506 - classification_loss: 0.4037 305/500 [=================>............] - ETA: 49s - loss: 2.1544 - regression_loss: 1.7507 - classification_loss: 0.4037 306/500 [=================>............] - ETA: 48s - loss: 2.1520 - regression_loss: 1.7481 - classification_loss: 0.4038 307/500 [=================>............] - ETA: 48s - loss: 2.1498 - regression_loss: 1.7463 - classification_loss: 0.4035 308/500 [=================>............] - ETA: 48s - loss: 2.1501 - regression_loss: 1.7464 - classification_loss: 0.4037 309/500 [=================>............] - ETA: 48s - loss: 2.1463 - regression_loss: 1.7433 - classification_loss: 0.4030 310/500 [=================>............] - ETA: 47s - loss: 2.1480 - regression_loss: 1.7442 - classification_loss: 0.4038 311/500 [=================>............] - ETA: 47s - loss: 2.1481 - regression_loss: 1.7443 - classification_loss: 0.4039 312/500 [=================>............] - ETA: 47s - loss: 2.1484 - regression_loss: 1.7447 - classification_loss: 0.4037 313/500 [=================>............] - ETA: 47s - loss: 2.1487 - regression_loss: 1.7451 - classification_loss: 0.4036 314/500 [=================>............] - ETA: 46s - loss: 2.1494 - regression_loss: 1.7456 - classification_loss: 0.4038 315/500 [=================>............] - ETA: 46s - loss: 2.1496 - regression_loss: 1.7459 - classification_loss: 0.4038 316/500 [=================>............] - ETA: 46s - loss: 2.1487 - regression_loss: 1.7452 - classification_loss: 0.4035 317/500 [==================>...........] - ETA: 46s - loss: 2.1483 - regression_loss: 1.7449 - classification_loss: 0.4033 318/500 [==================>...........] 
- ETA: 45s - loss: 2.1490 - regression_loss: 1.7457 - classification_loss: 0.4034 319/500 [==================>...........] - ETA: 45s - loss: 2.1495 - regression_loss: 1.7457 - classification_loss: 0.4038 320/500 [==================>...........] - ETA: 45s - loss: 2.1512 - regression_loss: 1.7470 - classification_loss: 0.4042 321/500 [==================>...........] - ETA: 45s - loss: 2.1506 - regression_loss: 1.7466 - classification_loss: 0.4040 322/500 [==================>...........] - ETA: 44s - loss: 2.1501 - regression_loss: 1.7464 - classification_loss: 0.4037 323/500 [==================>...........] - ETA: 44s - loss: 2.1473 - regression_loss: 1.7441 - classification_loss: 0.4032 324/500 [==================>...........] - ETA: 44s - loss: 2.1474 - regression_loss: 1.7440 - classification_loss: 0.4034 325/500 [==================>...........] - ETA: 43s - loss: 2.1473 - regression_loss: 1.7438 - classification_loss: 0.4035 326/500 [==================>...........] - ETA: 43s - loss: 2.1440 - regression_loss: 1.7410 - classification_loss: 0.4030 327/500 [==================>...........] - ETA: 43s - loss: 2.1454 - regression_loss: 1.7421 - classification_loss: 0.4033 328/500 [==================>...........] - ETA: 43s - loss: 2.1449 - regression_loss: 1.7417 - classification_loss: 0.4032 329/500 [==================>...........] - ETA: 42s - loss: 2.1430 - regression_loss: 1.7402 - classification_loss: 0.4028 330/500 [==================>...........] - ETA: 42s - loss: 2.1428 - regression_loss: 1.7403 - classification_loss: 0.4025 331/500 [==================>...........] - ETA: 42s - loss: 2.1428 - regression_loss: 1.7402 - classification_loss: 0.4026 332/500 [==================>...........] - ETA: 42s - loss: 2.1454 - regression_loss: 1.7422 - classification_loss: 0.4032 333/500 [==================>...........] - ETA: 41s - loss: 2.1454 - regression_loss: 1.7424 - classification_loss: 0.4030 334/500 [===================>..........] 
- ETA: 41s - loss: 2.1444 - regression_loss: 1.7416 - classification_loss: 0.4028 335/500 [===================>..........] - ETA: 41s - loss: 2.1461 - regression_loss: 1.7429 - classification_loss: 0.4032 336/500 [===================>..........] - ETA: 41s - loss: 2.1465 - regression_loss: 1.7432 - classification_loss: 0.4032 337/500 [===================>..........] - ETA: 40s - loss: 2.1464 - regression_loss: 1.7432 - classification_loss: 0.4033 338/500 [===================>..........] - ETA: 40s - loss: 2.1452 - regression_loss: 1.7423 - classification_loss: 0.4029 339/500 [===================>..........] - ETA: 40s - loss: 2.1437 - regression_loss: 1.7414 - classification_loss: 0.4023 340/500 [===================>..........] - ETA: 40s - loss: 2.1427 - regression_loss: 1.7408 - classification_loss: 0.4020 341/500 [===================>..........] - ETA: 39s - loss: 2.1429 - regression_loss: 1.7410 - classification_loss: 0.4019 342/500 [===================>..........] - ETA: 39s - loss: 2.1429 - regression_loss: 1.7409 - classification_loss: 0.4020 343/500 [===================>..........] - ETA: 39s - loss: 2.1425 - regression_loss: 1.7406 - classification_loss: 0.4020 344/500 [===================>..........] - ETA: 39s - loss: 2.1420 - regression_loss: 1.7402 - classification_loss: 0.4017 345/500 [===================>..........] - ETA: 38s - loss: 2.1445 - regression_loss: 1.7419 - classification_loss: 0.4025 346/500 [===================>..........] - ETA: 38s - loss: 2.1448 - regression_loss: 1.7424 - classification_loss: 0.4024 347/500 [===================>..........] - ETA: 38s - loss: 2.1444 - regression_loss: 1.7421 - classification_loss: 0.4023 348/500 [===================>..........] - ETA: 38s - loss: 2.1436 - regression_loss: 1.7414 - classification_loss: 0.4022 349/500 [===================>..........] - ETA: 37s - loss: 2.1426 - regression_loss: 1.7404 - classification_loss: 0.4022 350/500 [====================>.........] 
- ETA: 37s - loss: 2.1416 - regression_loss: 1.7397 - classification_loss: 0.4019 351/500 [====================>.........] - ETA: 37s - loss: 2.1407 - regression_loss: 1.7388 - classification_loss: 0.4020 352/500 [====================>.........] - ETA: 37s - loss: 2.1409 - regression_loss: 1.7392 - classification_loss: 0.4018 353/500 [====================>.........] - ETA: 36s - loss: 2.1415 - regression_loss: 1.7397 - classification_loss: 0.4018 354/500 [====================>.........] - ETA: 36s - loss: 2.1414 - regression_loss: 1.7395 - classification_loss: 0.4020 355/500 [====================>.........] - ETA: 36s - loss: 2.1418 - regression_loss: 1.7393 - classification_loss: 0.4025 356/500 [====================>.........] - ETA: 36s - loss: 2.1418 - regression_loss: 1.7394 - classification_loss: 0.4023 357/500 [====================>.........] - ETA: 35s - loss: 2.1426 - regression_loss: 1.7401 - classification_loss: 0.4025 358/500 [====================>.........] - ETA: 35s - loss: 2.1431 - regression_loss: 1.7404 - classification_loss: 0.4027 359/500 [====================>.........] - ETA: 35s - loss: 2.1408 - regression_loss: 1.7384 - classification_loss: 0.4024 360/500 [====================>.........] - ETA: 35s - loss: 2.1387 - regression_loss: 1.7368 - classification_loss: 0.4019 361/500 [====================>.........] - ETA: 34s - loss: 2.1366 - regression_loss: 1.7350 - classification_loss: 0.4016 362/500 [====================>.........] - ETA: 34s - loss: 2.1371 - regression_loss: 1.7354 - classification_loss: 0.4017 363/500 [====================>.........] - ETA: 34s - loss: 2.1365 - regression_loss: 1.7352 - classification_loss: 0.4012 364/500 [====================>.........] - ETA: 34s - loss: 2.1356 - regression_loss: 1.7347 - classification_loss: 0.4009 365/500 [====================>.........] - ETA: 33s - loss: 2.1349 - regression_loss: 1.7343 - classification_loss: 0.4006 366/500 [====================>.........] 
... [per-batch progress updates for steps 367–499 of epoch 31 omitted]
500/500 [==============================] - 126s 251ms/step - loss: 2.1481 - regression_loss: 1.7449 - classification_loss: 0.4032
1172 instances of class plum with average precision: 0.4742
mAP: 0.4742
Epoch 00031: saving model to ./training/snapshots/resnet50_pascal_31.h5
Epoch 32/150
... [per-batch progress updates for steps 1–200 of epoch 32 omitted; last logged: 200/500 - ETA: 1:15 - loss: 2.1398 - regression_loss: 1.7337 - classification_loss: 0.4060]
- ETA: 1:14 - loss: 2.1450 - regression_loss: 1.7380 - classification_loss: 0.4070 202/500 [===========>..................] - ETA: 1:14 - loss: 2.1456 - regression_loss: 1.7388 - classification_loss: 0.4069 203/500 [===========>..................] - ETA: 1:14 - loss: 2.1491 - regression_loss: 1.7418 - classification_loss: 0.4073 204/500 [===========>..................] - ETA: 1:14 - loss: 2.1467 - regression_loss: 1.7401 - classification_loss: 0.4066 205/500 [===========>..................] - ETA: 1:13 - loss: 2.1426 - regression_loss: 1.7367 - classification_loss: 0.4058 206/500 [===========>..................] - ETA: 1:13 - loss: 2.1399 - regression_loss: 1.7345 - classification_loss: 0.4054 207/500 [===========>..................] - ETA: 1:13 - loss: 2.1404 - regression_loss: 1.7351 - classification_loss: 0.4053 208/500 [===========>..................] - ETA: 1:13 - loss: 2.1374 - regression_loss: 1.7324 - classification_loss: 0.4050 209/500 [===========>..................] - ETA: 1:12 - loss: 2.1384 - regression_loss: 1.7333 - classification_loss: 0.4051 210/500 [===========>..................] - ETA: 1:12 - loss: 2.1393 - regression_loss: 1.7341 - classification_loss: 0.4052 211/500 [===========>..................] - ETA: 1:12 - loss: 2.1457 - regression_loss: 1.7403 - classification_loss: 0.4055 212/500 [===========>..................] - ETA: 1:12 - loss: 2.1454 - regression_loss: 1.7401 - classification_loss: 0.4053 213/500 [===========>..................] - ETA: 1:12 - loss: 2.1471 - regression_loss: 1.7414 - classification_loss: 0.4057 214/500 [===========>..................] - ETA: 1:11 - loss: 2.1457 - regression_loss: 1.7409 - classification_loss: 0.4048 215/500 [===========>..................] - ETA: 1:11 - loss: 2.1462 - regression_loss: 1.7415 - classification_loss: 0.4047 216/500 [===========>..................] - ETA: 1:11 - loss: 2.1428 - regression_loss: 1.7389 - classification_loss: 0.4039 217/500 [============>.................] 
- ETA: 1:10 - loss: 2.1383 - regression_loss: 1.7350 - classification_loss: 0.4033 218/500 [============>.................] - ETA: 1:10 - loss: 2.1381 - regression_loss: 1.7350 - classification_loss: 0.4031 219/500 [============>.................] - ETA: 1:10 - loss: 2.1381 - regression_loss: 1.7347 - classification_loss: 0.4033 220/500 [============>.................] - ETA: 1:10 - loss: 2.1379 - regression_loss: 1.7347 - classification_loss: 0.4032 221/500 [============>.................] - ETA: 1:09 - loss: 2.1390 - regression_loss: 1.7358 - classification_loss: 0.4032 222/500 [============>.................] - ETA: 1:09 - loss: 2.1387 - regression_loss: 1.7355 - classification_loss: 0.4031 223/500 [============>.................] - ETA: 1:09 - loss: 2.1384 - regression_loss: 1.7360 - classification_loss: 0.4024 224/500 [============>.................] - ETA: 1:09 - loss: 2.1382 - regression_loss: 1.7360 - classification_loss: 0.4022 225/500 [============>.................] - ETA: 1:08 - loss: 2.1370 - regression_loss: 1.7353 - classification_loss: 0.4017 226/500 [============>.................] - ETA: 1:08 - loss: 2.1384 - regression_loss: 1.7364 - classification_loss: 0.4019 227/500 [============>.................] - ETA: 1:08 - loss: 2.1389 - regression_loss: 1.7359 - classification_loss: 0.4030 228/500 [============>.................] - ETA: 1:08 - loss: 2.1379 - regression_loss: 1.7350 - classification_loss: 0.4029 229/500 [============>.................] - ETA: 1:07 - loss: 2.1381 - regression_loss: 1.7354 - classification_loss: 0.4027 230/500 [============>.................] - ETA: 1:07 - loss: 2.1370 - regression_loss: 1.7342 - classification_loss: 0.4027 231/500 [============>.................] - ETA: 1:07 - loss: 2.1379 - regression_loss: 1.7343 - classification_loss: 0.4035 232/500 [============>.................] - ETA: 1:07 - loss: 2.1340 - regression_loss: 1.7314 - classification_loss: 0.4026 233/500 [============>.................] 
- ETA: 1:06 - loss: 2.1357 - regression_loss: 1.7330 - classification_loss: 0.4027 234/500 [=============>................] - ETA: 1:06 - loss: 2.1380 - regression_loss: 1.7347 - classification_loss: 0.4033 235/500 [=============>................] - ETA: 1:06 - loss: 2.1352 - regression_loss: 1.7324 - classification_loss: 0.4028 236/500 [=============>................] - ETA: 1:06 - loss: 2.1291 - regression_loss: 1.7275 - classification_loss: 0.4016 237/500 [=============>................] - ETA: 1:05 - loss: 2.1294 - regression_loss: 1.7278 - classification_loss: 0.4016 238/500 [=============>................] - ETA: 1:05 - loss: 2.1309 - regression_loss: 1.7294 - classification_loss: 0.4015 239/500 [=============>................] - ETA: 1:05 - loss: 2.1301 - regression_loss: 1.7289 - classification_loss: 0.4012 240/500 [=============>................] - ETA: 1:05 - loss: 2.1301 - regression_loss: 1.7291 - classification_loss: 0.4010 241/500 [=============>................] - ETA: 1:04 - loss: 2.1303 - regression_loss: 1.7294 - classification_loss: 0.4009 242/500 [=============>................] - ETA: 1:04 - loss: 2.1300 - regression_loss: 1.7293 - classification_loss: 0.4007 243/500 [=============>................] - ETA: 1:04 - loss: 2.1285 - regression_loss: 1.7285 - classification_loss: 0.4000 244/500 [=============>................] - ETA: 1:04 - loss: 2.1294 - regression_loss: 1.7295 - classification_loss: 0.4000 245/500 [=============>................] - ETA: 1:03 - loss: 2.1316 - regression_loss: 1.7312 - classification_loss: 0.4003 246/500 [=============>................] - ETA: 1:03 - loss: 2.1319 - regression_loss: 1.7315 - classification_loss: 0.4003 247/500 [=============>................] - ETA: 1:03 - loss: 2.1285 - regression_loss: 1.7288 - classification_loss: 0.3997 248/500 [=============>................] - ETA: 1:03 - loss: 2.1293 - regression_loss: 1.7294 - classification_loss: 0.3999 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.1300 - regression_loss: 1.7302 - classification_loss: 0.3998 250/500 [==============>...............] - ETA: 1:02 - loss: 2.1288 - regression_loss: 1.7294 - classification_loss: 0.3994 251/500 [==============>...............] - ETA: 1:02 - loss: 2.1284 - regression_loss: 1.7290 - classification_loss: 0.3994 252/500 [==============>...............] - ETA: 1:02 - loss: 2.1290 - regression_loss: 1.7301 - classification_loss: 0.3989 253/500 [==============>...............] - ETA: 1:01 - loss: 2.1292 - regression_loss: 1.7306 - classification_loss: 0.3986 254/500 [==============>...............] - ETA: 1:01 - loss: 2.1296 - regression_loss: 1.7307 - classification_loss: 0.3989 255/500 [==============>...............] - ETA: 1:01 - loss: 2.1330 - regression_loss: 1.7339 - classification_loss: 0.3992 256/500 [==============>...............] - ETA: 1:01 - loss: 2.1354 - regression_loss: 1.7362 - classification_loss: 0.3992 257/500 [==============>...............] - ETA: 1:00 - loss: 2.1357 - regression_loss: 1.7367 - classification_loss: 0.3990 258/500 [==============>...............] - ETA: 1:00 - loss: 2.1334 - regression_loss: 1.7348 - classification_loss: 0.3985 259/500 [==============>...............] - ETA: 1:00 - loss: 2.1350 - regression_loss: 1.7359 - classification_loss: 0.3991 260/500 [==============>...............] - ETA: 1:00 - loss: 2.1378 - regression_loss: 1.7378 - classification_loss: 0.4000 261/500 [==============>...............] - ETA: 59s - loss: 2.1383 - regression_loss: 1.7384 - classification_loss: 0.3999  262/500 [==============>...............] - ETA: 59s - loss: 2.1358 - regression_loss: 1.7364 - classification_loss: 0.3994 263/500 [==============>...............] - ETA: 59s - loss: 2.1325 - regression_loss: 1.7334 - classification_loss: 0.3990 264/500 [==============>...............] - ETA: 59s - loss: 2.1270 - regression_loss: 1.7291 - classification_loss: 0.3979 265/500 [==============>...............] 
- ETA: 58s - loss: 2.1251 - regression_loss: 1.7277 - classification_loss: 0.3975 266/500 [==============>...............] - ETA: 58s - loss: 2.1269 - regression_loss: 1.7290 - classification_loss: 0.3979 267/500 [===============>..............] - ETA: 58s - loss: 2.1264 - regression_loss: 1.7290 - classification_loss: 0.3974 268/500 [===============>..............] - ETA: 58s - loss: 2.1277 - regression_loss: 1.7299 - classification_loss: 0.3978 269/500 [===============>..............] - ETA: 57s - loss: 2.1285 - regression_loss: 1.7306 - classification_loss: 0.3978 270/500 [===============>..............] - ETA: 57s - loss: 2.1304 - regression_loss: 1.7321 - classification_loss: 0.3983 271/500 [===============>..............] - ETA: 57s - loss: 2.1325 - regression_loss: 1.7338 - classification_loss: 0.3988 272/500 [===============>..............] - ETA: 57s - loss: 2.1327 - regression_loss: 1.7340 - classification_loss: 0.3988 273/500 [===============>..............] - ETA: 56s - loss: 2.1316 - regression_loss: 1.7332 - classification_loss: 0.3984 274/500 [===============>..............] - ETA: 56s - loss: 2.1330 - regression_loss: 1.7338 - classification_loss: 0.3992 275/500 [===============>..............] - ETA: 56s - loss: 2.1333 - regression_loss: 1.7329 - classification_loss: 0.4003 276/500 [===============>..............] - ETA: 56s - loss: 2.1289 - regression_loss: 1.7293 - classification_loss: 0.3995 277/500 [===============>..............] - ETA: 55s - loss: 2.1302 - regression_loss: 1.7308 - classification_loss: 0.3994 278/500 [===============>..............] - ETA: 55s - loss: 2.1284 - regression_loss: 1.7294 - classification_loss: 0.3990 279/500 [===============>..............] - ETA: 55s - loss: 2.1286 - regression_loss: 1.7292 - classification_loss: 0.3994 280/500 [===============>..............] - ETA: 55s - loss: 2.1286 - regression_loss: 1.7292 - classification_loss: 0.3994 281/500 [===============>..............] 
- ETA: 54s - loss: 2.1318 - regression_loss: 1.7310 - classification_loss: 0.4008 282/500 [===============>..............] - ETA: 54s - loss: 2.1321 - regression_loss: 1.7316 - classification_loss: 0.4005 283/500 [===============>..............] - ETA: 54s - loss: 2.1317 - regression_loss: 1.7314 - classification_loss: 0.4003 284/500 [================>.............] - ETA: 54s - loss: 2.1324 - regression_loss: 1.7321 - classification_loss: 0.4003 285/500 [================>.............] - ETA: 53s - loss: 2.1333 - regression_loss: 1.7332 - classification_loss: 0.4001 286/500 [================>.............] - ETA: 53s - loss: 2.1324 - regression_loss: 1.7326 - classification_loss: 0.3998 287/500 [================>.............] - ETA: 53s - loss: 2.1325 - regression_loss: 1.7326 - classification_loss: 0.3999 288/500 [================>.............] - ETA: 53s - loss: 2.1336 - regression_loss: 1.7333 - classification_loss: 0.4003 289/500 [================>.............] - ETA: 52s - loss: 2.1356 - regression_loss: 1.7349 - classification_loss: 0.4007 290/500 [================>.............] - ETA: 52s - loss: 2.1367 - regression_loss: 1.7357 - classification_loss: 0.4010 291/500 [================>.............] - ETA: 52s - loss: 2.1387 - regression_loss: 1.7373 - classification_loss: 0.4014 292/500 [================>.............] - ETA: 52s - loss: 2.1385 - regression_loss: 1.7374 - classification_loss: 0.4010 293/500 [================>.............] - ETA: 51s - loss: 2.1355 - regression_loss: 1.7350 - classification_loss: 0.4005 294/500 [================>.............] - ETA: 51s - loss: 2.1347 - regression_loss: 1.7340 - classification_loss: 0.4007 295/500 [================>.............] - ETA: 51s - loss: 2.1353 - regression_loss: 1.7347 - classification_loss: 0.4006 296/500 [================>.............] - ETA: 51s - loss: 2.1323 - regression_loss: 1.7324 - classification_loss: 0.3999 297/500 [================>.............] 
- ETA: 50s - loss: 2.1317 - regression_loss: 1.7321 - classification_loss: 0.3996 298/500 [================>.............] - ETA: 50s - loss: 2.1324 - regression_loss: 1.7327 - classification_loss: 0.3997 299/500 [================>.............] - ETA: 50s - loss: 2.1315 - regression_loss: 1.7320 - classification_loss: 0.3995 300/500 [=================>............] - ETA: 50s - loss: 2.1338 - regression_loss: 1.7336 - classification_loss: 0.4002 301/500 [=================>............] - ETA: 49s - loss: 2.1343 - regression_loss: 1.7341 - classification_loss: 0.4002 302/500 [=================>............] - ETA: 49s - loss: 2.1324 - regression_loss: 1.7326 - classification_loss: 0.3998 303/500 [=================>............] - ETA: 49s - loss: 2.1314 - regression_loss: 1.7313 - classification_loss: 0.4001 304/500 [=================>............] - ETA: 49s - loss: 2.1338 - regression_loss: 1.7338 - classification_loss: 0.4000 305/500 [=================>............] - ETA: 48s - loss: 2.1342 - regression_loss: 1.7340 - classification_loss: 0.4002 306/500 [=================>............] - ETA: 48s - loss: 2.1315 - regression_loss: 1.7323 - classification_loss: 0.3992 307/500 [=================>............] - ETA: 48s - loss: 2.1305 - regression_loss: 1.7316 - classification_loss: 0.3989 308/500 [=================>............] - ETA: 48s - loss: 2.1310 - regression_loss: 1.7319 - classification_loss: 0.3990 309/500 [=================>............] - ETA: 47s - loss: 2.1320 - regression_loss: 1.7328 - classification_loss: 0.3991 310/500 [=================>............] - ETA: 47s - loss: 2.1313 - regression_loss: 1.7325 - classification_loss: 0.3988 311/500 [=================>............] - ETA: 47s - loss: 2.1306 - regression_loss: 1.7321 - classification_loss: 0.3985 312/500 [=================>............] - ETA: 47s - loss: 2.1331 - regression_loss: 1.7343 - classification_loss: 0.3988 313/500 [=================>............] 
- ETA: 46s - loss: 2.1332 - regression_loss: 1.7345 - classification_loss: 0.3987 314/500 [=================>............] - ETA: 46s - loss: 2.1332 - regression_loss: 1.7347 - classification_loss: 0.3985 315/500 [=================>............] - ETA: 46s - loss: 2.1332 - regression_loss: 1.7347 - classification_loss: 0.3985 316/500 [=================>............] - ETA: 46s - loss: 2.1329 - regression_loss: 1.7346 - classification_loss: 0.3983 317/500 [==================>...........] - ETA: 45s - loss: 2.1335 - regression_loss: 1.7353 - classification_loss: 0.3981 318/500 [==================>...........] - ETA: 45s - loss: 2.1332 - regression_loss: 1.7351 - classification_loss: 0.3982 319/500 [==================>...........] - ETA: 45s - loss: 2.1322 - regression_loss: 1.7343 - classification_loss: 0.3979 320/500 [==================>...........] - ETA: 45s - loss: 2.1318 - regression_loss: 1.7341 - classification_loss: 0.3977 321/500 [==================>...........] - ETA: 44s - loss: 2.1310 - regression_loss: 1.7335 - classification_loss: 0.3975 322/500 [==================>...........] - ETA: 44s - loss: 2.1315 - regression_loss: 1.7341 - classification_loss: 0.3974 323/500 [==================>...........] - ETA: 44s - loss: 2.1320 - regression_loss: 1.7346 - classification_loss: 0.3974 324/500 [==================>...........] - ETA: 44s - loss: 2.1326 - regression_loss: 1.7350 - classification_loss: 0.3976 325/500 [==================>...........] - ETA: 43s - loss: 2.1310 - regression_loss: 1.7339 - classification_loss: 0.3971 326/500 [==================>...........] - ETA: 43s - loss: 2.1308 - regression_loss: 1.7338 - classification_loss: 0.3970 327/500 [==================>...........] - ETA: 43s - loss: 2.1286 - regression_loss: 1.7319 - classification_loss: 0.3967 328/500 [==================>...........] - ETA: 43s - loss: 2.1281 - regression_loss: 1.7315 - classification_loss: 0.3966 329/500 [==================>...........] 
- ETA: 42s - loss: 2.1273 - regression_loss: 1.7310 - classification_loss: 0.3963 330/500 [==================>...........] - ETA: 42s - loss: 2.1269 - regression_loss: 1.7308 - classification_loss: 0.3961 331/500 [==================>...........] - ETA: 42s - loss: 2.1271 - regression_loss: 1.7309 - classification_loss: 0.3962 332/500 [==================>...........] - ETA: 42s - loss: 2.1273 - regression_loss: 1.7311 - classification_loss: 0.3962 333/500 [==================>...........] - ETA: 41s - loss: 2.1280 - regression_loss: 1.7317 - classification_loss: 0.3963 334/500 [===================>..........] - ETA: 41s - loss: 2.1280 - regression_loss: 1.7318 - classification_loss: 0.3962 335/500 [===================>..........] - ETA: 41s - loss: 2.1294 - regression_loss: 1.7329 - classification_loss: 0.3966 336/500 [===================>..........] - ETA: 41s - loss: 2.1288 - regression_loss: 1.7326 - classification_loss: 0.3962 337/500 [===================>..........] - ETA: 40s - loss: 2.1308 - regression_loss: 1.7346 - classification_loss: 0.3962 338/500 [===================>..........] - ETA: 40s - loss: 2.1329 - regression_loss: 1.7341 - classification_loss: 0.3988 339/500 [===================>..........] - ETA: 40s - loss: 2.1307 - regression_loss: 1.7323 - classification_loss: 0.3983 340/500 [===================>..........] - ETA: 40s - loss: 2.1300 - regression_loss: 1.7318 - classification_loss: 0.3982 341/500 [===================>..........] - ETA: 39s - loss: 2.1284 - regression_loss: 1.7303 - classification_loss: 0.3980 342/500 [===================>..........] - ETA: 39s - loss: 2.1275 - regression_loss: 1.7299 - classification_loss: 0.3976 343/500 [===================>..........] - ETA: 39s - loss: 2.1276 - regression_loss: 1.7299 - classification_loss: 0.3978 344/500 [===================>..........] - ETA: 39s - loss: 2.1276 - regression_loss: 1.7300 - classification_loss: 0.3975 345/500 [===================>..........] 
- ETA: 38s - loss: 2.1295 - regression_loss: 1.7317 - classification_loss: 0.3979 346/500 [===================>..........] - ETA: 38s - loss: 2.1273 - regression_loss: 1.7296 - classification_loss: 0.3978 347/500 [===================>..........] - ETA: 38s - loss: 2.1275 - regression_loss: 1.7298 - classification_loss: 0.3977 348/500 [===================>..........] - ETA: 38s - loss: 2.1279 - regression_loss: 1.7302 - classification_loss: 0.3978 349/500 [===================>..........] - ETA: 37s - loss: 2.1280 - regression_loss: 1.7303 - classification_loss: 0.3977 350/500 [====================>.........] - ETA: 37s - loss: 2.1271 - regression_loss: 1.7296 - classification_loss: 0.3975 351/500 [====================>.........] - ETA: 37s - loss: 2.1282 - regression_loss: 1.7304 - classification_loss: 0.3979 352/500 [====================>.........] - ETA: 37s - loss: 2.1278 - regression_loss: 1.7299 - classification_loss: 0.3979 353/500 [====================>.........] - ETA: 36s - loss: 2.1284 - regression_loss: 1.7305 - classification_loss: 0.3980 354/500 [====================>.........] - ETA: 36s - loss: 2.1282 - regression_loss: 1.7303 - classification_loss: 0.3979 355/500 [====================>.........] - ETA: 36s - loss: 2.1280 - regression_loss: 1.7302 - classification_loss: 0.3978 356/500 [====================>.........] - ETA: 36s - loss: 2.1264 - regression_loss: 1.7288 - classification_loss: 0.3976 357/500 [====================>.........] - ETA: 35s - loss: 2.1271 - regression_loss: 1.7298 - classification_loss: 0.3973 358/500 [====================>.........] - ETA: 35s - loss: 2.1258 - regression_loss: 1.7290 - classification_loss: 0.3969 359/500 [====================>.........] - ETA: 35s - loss: 2.1267 - regression_loss: 1.7300 - classification_loss: 0.3968 360/500 [====================>.........] - ETA: 35s - loss: 2.1256 - regression_loss: 1.7291 - classification_loss: 0.3965 361/500 [====================>.........] 
- ETA: 34s - loss: 2.1267 - regression_loss: 1.7300 - classification_loss: 0.3966 362/500 [====================>.........] - ETA: 34s - loss: 2.1270 - regression_loss: 1.7305 - classification_loss: 0.3965 363/500 [====================>.........] - ETA: 34s - loss: 2.1289 - regression_loss: 1.7317 - classification_loss: 0.3972 364/500 [====================>.........] - ETA: 34s - loss: 2.1289 - regression_loss: 1.7317 - classification_loss: 0.3971 365/500 [====================>.........] - ETA: 33s - loss: 2.1287 - regression_loss: 1.7308 - classification_loss: 0.3979 366/500 [====================>.........] - ETA: 33s - loss: 2.1282 - regression_loss: 1.7303 - classification_loss: 0.3978 367/500 [=====================>........] - ETA: 33s - loss: 2.1292 - regression_loss: 1.7303 - classification_loss: 0.3989 368/500 [=====================>........] - ETA: 33s - loss: 2.1306 - regression_loss: 1.7313 - classification_loss: 0.3993 369/500 [=====================>........] - ETA: 32s - loss: 2.1306 - regression_loss: 1.7315 - classification_loss: 0.3990 370/500 [=====================>........] - ETA: 32s - loss: 2.1322 - regression_loss: 1.7330 - classification_loss: 0.3992 371/500 [=====================>........] - ETA: 32s - loss: 2.1323 - regression_loss: 1.7333 - classification_loss: 0.3990 372/500 [=====================>........] - ETA: 32s - loss: 2.1329 - regression_loss: 1.7340 - classification_loss: 0.3990 373/500 [=====================>........] - ETA: 31s - loss: 2.1333 - regression_loss: 1.7343 - classification_loss: 0.3990 374/500 [=====================>........] - ETA: 31s - loss: 2.1336 - regression_loss: 1.7338 - classification_loss: 0.3998 375/500 [=====================>........] - ETA: 31s - loss: 2.1331 - regression_loss: 1.7334 - classification_loss: 0.3997 376/500 [=====================>........] - ETA: 31s - loss: 2.1331 - regression_loss: 1.7336 - classification_loss: 0.3995 377/500 [=====================>........] 
- ETA: 30s - loss: 2.1313 - regression_loss: 1.7321 - classification_loss: 0.3993 378/500 [=====================>........] - ETA: 30s - loss: 2.1314 - regression_loss: 1.7323 - classification_loss: 0.3991 379/500 [=====================>........] - ETA: 30s - loss: 2.1323 - regression_loss: 1.7331 - classification_loss: 0.3992 380/500 [=====================>........] - ETA: 30s - loss: 2.1325 - regression_loss: 1.7335 - classification_loss: 0.3991 381/500 [=====================>........] - ETA: 29s - loss: 2.1329 - regression_loss: 1.7334 - classification_loss: 0.3995 382/500 [=====================>........] - ETA: 29s - loss: 2.1324 - regression_loss: 1.7330 - classification_loss: 0.3994 383/500 [=====================>........] - ETA: 29s - loss: 2.1321 - regression_loss: 1.7328 - classification_loss: 0.3993 384/500 [======================>.......] - ETA: 29s - loss: 2.1304 - regression_loss: 1.7315 - classification_loss: 0.3988 385/500 [======================>.......] - ETA: 28s - loss: 2.1302 - regression_loss: 1.7314 - classification_loss: 0.3988 386/500 [======================>.......] - ETA: 28s - loss: 2.1283 - regression_loss: 1.7299 - classification_loss: 0.3985 387/500 [======================>.......] - ETA: 28s - loss: 2.1277 - regression_loss: 1.7294 - classification_loss: 0.3983 388/500 [======================>.......] - ETA: 28s - loss: 2.1280 - regression_loss: 1.7296 - classification_loss: 0.3984 389/500 [======================>.......] - ETA: 27s - loss: 2.1282 - regression_loss: 1.7298 - classification_loss: 0.3984 390/500 [======================>.......] - ETA: 27s - loss: 2.1285 - regression_loss: 1.7303 - classification_loss: 0.3982 391/500 [======================>.......] - ETA: 27s - loss: 2.1292 - regression_loss: 1.7307 - classification_loss: 0.3986 392/500 [======================>.......] - ETA: 27s - loss: 2.1295 - regression_loss: 1.7311 - classification_loss: 0.3984 393/500 [======================>.......] 
- ETA: 26s - loss: 2.1290 - regression_loss: 1.7307 - classification_loss: 0.3982 394/500 [======================>.......] - ETA: 26s - loss: 2.1305 - regression_loss: 1.7315 - classification_loss: 0.3990 395/500 [======================>.......] - ETA: 26s - loss: 2.1295 - regression_loss: 1.7308 - classification_loss: 0.3988 396/500 [======================>.......] - ETA: 26s - loss: 2.1300 - regression_loss: 1.7297 - classification_loss: 0.4003 397/500 [======================>.......] - ETA: 25s - loss: 2.1281 - regression_loss: 1.7280 - classification_loss: 0.4000 398/500 [======================>.......] - ETA: 25s - loss: 2.1273 - regression_loss: 1.7275 - classification_loss: 0.3998 399/500 [======================>.......] - ETA: 25s - loss: 2.1273 - regression_loss: 1.7276 - classification_loss: 0.3997 400/500 [=======================>......] - ETA: 25s - loss: 2.1290 - regression_loss: 1.7286 - classification_loss: 0.4004 401/500 [=======================>......] - ETA: 24s - loss: 2.1287 - regression_loss: 1.7285 - classification_loss: 0.4002 402/500 [=======================>......] - ETA: 24s - loss: 2.1274 - regression_loss: 1.7276 - classification_loss: 0.3998 403/500 [=======================>......] - ETA: 24s - loss: 2.1282 - regression_loss: 1.7284 - classification_loss: 0.3998 404/500 [=======================>......] - ETA: 24s - loss: 2.1282 - regression_loss: 1.7284 - classification_loss: 0.3998 405/500 [=======================>......] - ETA: 23s - loss: 2.1287 - regression_loss: 1.7289 - classification_loss: 0.3998 406/500 [=======================>......] - ETA: 23s - loss: 2.1273 - regression_loss: 1.7278 - classification_loss: 0.3995 407/500 [=======================>......] - ETA: 23s - loss: 2.1261 - regression_loss: 1.7267 - classification_loss: 0.3993 408/500 [=======================>......] - ETA: 23s - loss: 2.1274 - regression_loss: 1.7276 - classification_loss: 0.3998 409/500 [=======================>......] 
[Epoch 32 per-batch progress lines elided]
500/500 [==============================] - 125s 250ms/step - loss: 2.1202 - regression_loss: 1.7214 - classification_loss: 0.3988
1172 instances of class plum with average precision: 0.5042
mAP: 0.5042
Epoch 00032: saving model to ./training/snapshots/resnet50_pascal_32.h5
Epoch 33/150
- ETA: 1:03 - loss: 2.1135 - regression_loss: 1.7121 - classification_loss: 0.4014 245/500 [=============>................] - ETA: 1:03 - loss: 2.1139 - regression_loss: 1.7120 - classification_loss: 0.4019 246/500 [=============>................] - ETA: 1:03 - loss: 2.1139 - regression_loss: 1.7120 - classification_loss: 0.4020 247/500 [=============>................] - ETA: 1:03 - loss: 2.1134 - regression_loss: 1.7117 - classification_loss: 0.4017 248/500 [=============>................] - ETA: 1:02 - loss: 2.1143 - regression_loss: 1.7125 - classification_loss: 0.4018 249/500 [=============>................] - ETA: 1:02 - loss: 2.1130 - regression_loss: 1.7116 - classification_loss: 0.4014 250/500 [==============>...............] - ETA: 1:02 - loss: 2.1146 - regression_loss: 1.7129 - classification_loss: 0.4017 251/500 [==============>...............] - ETA: 1:02 - loss: 2.1140 - regression_loss: 1.7125 - classification_loss: 0.4015 252/500 [==============>...............] - ETA: 1:01 - loss: 2.1116 - regression_loss: 1.7103 - classification_loss: 0.4013 253/500 [==============>...............] - ETA: 1:01 - loss: 2.1105 - regression_loss: 1.7096 - classification_loss: 0.4009 254/500 [==============>...............] - ETA: 1:01 - loss: 2.1111 - regression_loss: 1.7104 - classification_loss: 0.4007 255/500 [==============>...............] - ETA: 1:01 - loss: 2.1096 - regression_loss: 1.7094 - classification_loss: 0.4002 256/500 [==============>...............] - ETA: 1:00 - loss: 2.1086 - regression_loss: 1.7087 - classification_loss: 0.3999 257/500 [==============>...............] - ETA: 1:00 - loss: 2.1081 - regression_loss: 1.7085 - classification_loss: 0.3996 258/500 [==============>...............] - ETA: 1:00 - loss: 2.1061 - regression_loss: 1.7069 - classification_loss: 0.3992 259/500 [==============>...............] - ETA: 1:00 - loss: 2.1077 - regression_loss: 1.7077 - classification_loss: 0.4000 260/500 [==============>...............] 
- ETA: 59s - loss: 2.1077 - regression_loss: 1.7080 - classification_loss: 0.3997  261/500 [==============>...............] - ETA: 59s - loss: 2.1088 - regression_loss: 1.7091 - classification_loss: 0.3998 262/500 [==============>...............] - ETA: 59s - loss: 2.1099 - regression_loss: 1.7101 - classification_loss: 0.3998 263/500 [==============>...............] - ETA: 59s - loss: 2.1094 - regression_loss: 1.7098 - classification_loss: 0.3996 264/500 [==============>...............] - ETA: 58s - loss: 2.1096 - regression_loss: 1.7101 - classification_loss: 0.3995 265/500 [==============>...............] - ETA: 58s - loss: 2.1078 - regression_loss: 1.7086 - classification_loss: 0.3992 266/500 [==============>...............] - ETA: 58s - loss: 2.1086 - regression_loss: 1.7094 - classification_loss: 0.3992 267/500 [===============>..............] - ETA: 58s - loss: 2.1094 - regression_loss: 1.7102 - classification_loss: 0.3992 268/500 [===============>..............] - ETA: 57s - loss: 2.1098 - regression_loss: 1.7098 - classification_loss: 0.4000 269/500 [===============>..............] - ETA: 57s - loss: 2.1098 - regression_loss: 1.7097 - classification_loss: 0.4002 270/500 [===============>..............] - ETA: 57s - loss: 2.1109 - regression_loss: 1.7106 - classification_loss: 0.4002 271/500 [===============>..............] - ETA: 57s - loss: 2.1122 - regression_loss: 1.7114 - classification_loss: 0.4008 272/500 [===============>..............] - ETA: 56s - loss: 2.1132 - regression_loss: 1.7123 - classification_loss: 0.4008 273/500 [===============>..............] - ETA: 56s - loss: 2.1125 - regression_loss: 1.7114 - classification_loss: 0.4011 274/500 [===============>..............] - ETA: 56s - loss: 2.1123 - regression_loss: 1.7113 - classification_loss: 0.4010 275/500 [===============>..............] - ETA: 56s - loss: 2.1113 - regression_loss: 1.7107 - classification_loss: 0.4006 276/500 [===============>..............] 
- ETA: 55s - loss: 2.1098 - regression_loss: 1.7096 - classification_loss: 0.4002 277/500 [===============>..............] - ETA: 55s - loss: 2.1116 - regression_loss: 1.7120 - classification_loss: 0.3997 278/500 [===============>..............] - ETA: 55s - loss: 2.1127 - regression_loss: 1.7130 - classification_loss: 0.3997 279/500 [===============>..............] - ETA: 55s - loss: 2.1149 - regression_loss: 1.7146 - classification_loss: 0.4002 280/500 [===============>..............] - ETA: 54s - loss: 2.1175 - regression_loss: 1.7164 - classification_loss: 0.4011 281/500 [===============>..............] - ETA: 54s - loss: 2.1172 - regression_loss: 1.7164 - classification_loss: 0.4008 282/500 [===============>..............] - ETA: 54s - loss: 2.1155 - regression_loss: 1.7148 - classification_loss: 0.4007 283/500 [===============>..............] - ETA: 54s - loss: 2.1152 - regression_loss: 1.7147 - classification_loss: 0.4005 284/500 [================>.............] - ETA: 53s - loss: 2.1162 - regression_loss: 1.7153 - classification_loss: 0.4008 285/500 [================>.............] - ETA: 53s - loss: 2.1149 - regression_loss: 1.7144 - classification_loss: 0.4005 286/500 [================>.............] - ETA: 53s - loss: 2.1166 - regression_loss: 1.7155 - classification_loss: 0.4012 287/500 [================>.............] - ETA: 53s - loss: 2.1155 - regression_loss: 1.7147 - classification_loss: 0.4008 288/500 [================>.............] - ETA: 52s - loss: 2.1181 - regression_loss: 1.7175 - classification_loss: 0.4006 289/500 [================>.............] - ETA: 52s - loss: 2.1169 - regression_loss: 1.7167 - classification_loss: 0.4003 290/500 [================>.............] - ETA: 52s - loss: 2.1152 - regression_loss: 1.7153 - classification_loss: 0.3999 291/500 [================>.............] - ETA: 52s - loss: 2.1167 - regression_loss: 1.7164 - classification_loss: 0.4002 292/500 [================>.............] 
- ETA: 51s - loss: 2.1181 - regression_loss: 1.7175 - classification_loss: 0.4006 293/500 [================>.............] - ETA: 51s - loss: 2.1181 - regression_loss: 1.7175 - classification_loss: 0.4005 294/500 [================>.............] - ETA: 51s - loss: 2.1151 - regression_loss: 1.7154 - classification_loss: 0.3997 295/500 [================>.............] - ETA: 51s - loss: 2.1142 - regression_loss: 1.7149 - classification_loss: 0.3993 296/500 [================>.............] - ETA: 50s - loss: 2.1160 - regression_loss: 1.7156 - classification_loss: 0.4004 297/500 [================>.............] - ETA: 50s - loss: 2.1138 - regression_loss: 1.7137 - classification_loss: 0.4001 298/500 [================>.............] - ETA: 50s - loss: 2.1125 - regression_loss: 1.7127 - classification_loss: 0.3998 299/500 [================>.............] - ETA: 50s - loss: 2.1136 - regression_loss: 1.7140 - classification_loss: 0.3996 300/500 [=================>............] - ETA: 49s - loss: 2.1177 - regression_loss: 1.7177 - classification_loss: 0.4000 301/500 [=================>............] - ETA: 49s - loss: 2.1176 - regression_loss: 1.7177 - classification_loss: 0.3998 302/500 [=================>............] - ETA: 49s - loss: 2.1173 - regression_loss: 1.7176 - classification_loss: 0.3997 303/500 [=================>............] - ETA: 49s - loss: 2.1170 - regression_loss: 1.7175 - classification_loss: 0.3995 304/500 [=================>............] - ETA: 48s - loss: 2.1166 - regression_loss: 1.7174 - classification_loss: 0.3993 305/500 [=================>............] - ETA: 48s - loss: 2.1172 - regression_loss: 1.7176 - classification_loss: 0.3995 306/500 [=================>............] - ETA: 48s - loss: 2.1177 - regression_loss: 1.7184 - classification_loss: 0.3993 307/500 [=================>............] - ETA: 48s - loss: 2.1167 - regression_loss: 1.7178 - classification_loss: 0.3990 308/500 [=================>............] 
- ETA: 47s - loss: 2.1176 - regression_loss: 1.7183 - classification_loss: 0.3993 309/500 [=================>............] - ETA: 47s - loss: 2.1181 - regression_loss: 1.7188 - classification_loss: 0.3993 310/500 [=================>............] - ETA: 47s - loss: 2.1210 - regression_loss: 1.7210 - classification_loss: 0.4001 311/500 [=================>............] - ETA: 47s - loss: 2.1214 - regression_loss: 1.7213 - classification_loss: 0.4001 312/500 [=================>............] - ETA: 46s - loss: 2.1214 - regression_loss: 1.7213 - classification_loss: 0.4002 313/500 [=================>............] - ETA: 46s - loss: 2.1216 - regression_loss: 1.7217 - classification_loss: 0.4000 314/500 [=================>............] - ETA: 46s - loss: 2.1228 - regression_loss: 1.7227 - classification_loss: 0.4001 315/500 [=================>............] - ETA: 46s - loss: 2.1233 - regression_loss: 1.7232 - classification_loss: 0.4001 316/500 [=================>............] - ETA: 45s - loss: 2.1229 - regression_loss: 1.7229 - classification_loss: 0.3999 317/500 [==================>...........] - ETA: 45s - loss: 2.1231 - regression_loss: 1.7233 - classification_loss: 0.3998 318/500 [==================>...........] - ETA: 45s - loss: 2.1216 - regression_loss: 1.7221 - classification_loss: 0.3996 319/500 [==================>...........] - ETA: 45s - loss: 2.1225 - regression_loss: 1.7228 - classification_loss: 0.3997 320/500 [==================>...........] - ETA: 44s - loss: 2.1241 - regression_loss: 1.7242 - classification_loss: 0.3999 321/500 [==================>...........] - ETA: 44s - loss: 2.1266 - regression_loss: 1.7256 - classification_loss: 0.4010 322/500 [==================>...........] - ETA: 44s - loss: 2.1255 - regression_loss: 1.7248 - classification_loss: 0.4007 323/500 [==================>...........] - ETA: 44s - loss: 2.1255 - regression_loss: 1.7252 - classification_loss: 0.4003 324/500 [==================>...........] 
- ETA: 43s - loss: 2.1241 - regression_loss: 1.7240 - classification_loss: 0.4001 325/500 [==================>...........] - ETA: 43s - loss: 2.1246 - regression_loss: 1.7244 - classification_loss: 0.4002 326/500 [==================>...........] - ETA: 43s - loss: 2.1255 - regression_loss: 1.7255 - classification_loss: 0.4001 327/500 [==================>...........] - ETA: 43s - loss: 2.1244 - regression_loss: 1.7248 - classification_loss: 0.3996 328/500 [==================>...........] - ETA: 42s - loss: 2.1251 - regression_loss: 1.7251 - classification_loss: 0.3999 329/500 [==================>...........] - ETA: 42s - loss: 2.1233 - regression_loss: 1.7241 - classification_loss: 0.3992 330/500 [==================>...........] - ETA: 42s - loss: 2.1234 - regression_loss: 1.7243 - classification_loss: 0.3991 331/500 [==================>...........] - ETA: 42s - loss: 2.1221 - regression_loss: 1.7231 - classification_loss: 0.3991 332/500 [==================>...........] - ETA: 41s - loss: 2.1234 - regression_loss: 1.7242 - classification_loss: 0.3992 333/500 [==================>...........] - ETA: 41s - loss: 2.1229 - regression_loss: 1.7215 - classification_loss: 0.4013 334/500 [===================>..........] - ETA: 41s - loss: 2.1223 - regression_loss: 1.7212 - classification_loss: 0.4011 335/500 [===================>..........] - ETA: 41s - loss: 2.1214 - regression_loss: 1.7205 - classification_loss: 0.4009 336/500 [===================>..........] - ETA: 40s - loss: 2.1188 - regression_loss: 1.7182 - classification_loss: 0.4006 337/500 [===================>..........] - ETA: 40s - loss: 2.1195 - regression_loss: 1.7189 - classification_loss: 0.4006 338/500 [===================>..........] - ETA: 40s - loss: 2.1188 - regression_loss: 1.7183 - classification_loss: 0.4005 339/500 [===================>..........] - ETA: 40s - loss: 2.1192 - regression_loss: 1.7189 - classification_loss: 0.4003 340/500 [===================>..........] 
- ETA: 39s - loss: 2.1191 - regression_loss: 1.7189 - classification_loss: 0.4002 341/500 [===================>..........] - ETA: 39s - loss: 2.1195 - regression_loss: 1.7193 - classification_loss: 0.4002 342/500 [===================>..........] - ETA: 39s - loss: 2.1161 - regression_loss: 1.7167 - classification_loss: 0.3994 343/500 [===================>..........] - ETA: 39s - loss: 2.1158 - regression_loss: 1.7165 - classification_loss: 0.3993 344/500 [===================>..........] - ETA: 38s - loss: 2.1157 - regression_loss: 1.7165 - classification_loss: 0.3993 345/500 [===================>..........] - ETA: 38s - loss: 2.1143 - regression_loss: 1.7155 - classification_loss: 0.3988 346/500 [===================>..........] - ETA: 38s - loss: 2.1130 - regression_loss: 1.7145 - classification_loss: 0.3986 347/500 [===================>..........] - ETA: 38s - loss: 2.1122 - regression_loss: 1.7141 - classification_loss: 0.3981 348/500 [===================>..........] - ETA: 37s - loss: 2.1136 - regression_loss: 1.7151 - classification_loss: 0.3986 349/500 [===================>..........] - ETA: 37s - loss: 2.1136 - regression_loss: 1.7154 - classification_loss: 0.3983 350/500 [====================>.........] - ETA: 37s - loss: 2.1140 - regression_loss: 1.7157 - classification_loss: 0.3982 351/500 [====================>.........] - ETA: 37s - loss: 2.1143 - regression_loss: 1.7160 - classification_loss: 0.3983 352/500 [====================>.........] - ETA: 36s - loss: 2.1180 - regression_loss: 1.7191 - classification_loss: 0.3989 353/500 [====================>.........] - ETA: 36s - loss: 2.1174 - regression_loss: 1.7185 - classification_loss: 0.3988 354/500 [====================>.........] - ETA: 36s - loss: 2.1174 - regression_loss: 1.7185 - classification_loss: 0.3989 355/500 [====================>.........] - ETA: 36s - loss: 2.1184 - regression_loss: 1.7194 - classification_loss: 0.3990 356/500 [====================>.........] 
- ETA: 35s - loss: 2.1178 - regression_loss: 1.7191 - classification_loss: 0.3987 357/500 [====================>.........] - ETA: 35s - loss: 2.1158 - regression_loss: 1.7176 - classification_loss: 0.3982 358/500 [====================>.........] - ETA: 35s - loss: 2.1152 - regression_loss: 1.7172 - classification_loss: 0.3980 359/500 [====================>.........] - ETA: 35s - loss: 2.1147 - regression_loss: 1.7170 - classification_loss: 0.3977 360/500 [====================>.........] - ETA: 34s - loss: 2.1127 - regression_loss: 1.7153 - classification_loss: 0.3974 361/500 [====================>.........] - ETA: 34s - loss: 2.1123 - regression_loss: 1.7151 - classification_loss: 0.3972 362/500 [====================>.........] - ETA: 34s - loss: 2.1112 - regression_loss: 1.7143 - classification_loss: 0.3968 363/500 [====================>.........] - ETA: 34s - loss: 2.1083 - regression_loss: 1.7121 - classification_loss: 0.3962 364/500 [====================>.........] - ETA: 33s - loss: 2.1087 - regression_loss: 1.7126 - classification_loss: 0.3961 365/500 [====================>.........] - ETA: 33s - loss: 2.1093 - regression_loss: 1.7132 - classification_loss: 0.3961 366/500 [====================>.........] - ETA: 33s - loss: 2.1095 - regression_loss: 1.7136 - classification_loss: 0.3959 367/500 [=====================>........] - ETA: 33s - loss: 2.1087 - regression_loss: 1.7131 - classification_loss: 0.3956 368/500 [=====================>........] - ETA: 32s - loss: 2.1047 - regression_loss: 1.7099 - classification_loss: 0.3948 369/500 [=====================>........] - ETA: 32s - loss: 2.1031 - regression_loss: 1.7085 - classification_loss: 0.3946 370/500 [=====================>........] - ETA: 32s - loss: 2.1025 - regression_loss: 1.7081 - classification_loss: 0.3944 371/500 [=====================>........] - ETA: 32s - loss: 2.1008 - regression_loss: 1.7071 - classification_loss: 0.3937 372/500 [=====================>........] 
- ETA: 31s - loss: 2.1010 - regression_loss: 1.7074 - classification_loss: 0.3936 373/500 [=====================>........] - ETA: 31s - loss: 2.1005 - regression_loss: 1.7072 - classification_loss: 0.3933 374/500 [=====================>........] - ETA: 31s - loss: 2.1013 - regression_loss: 1.7078 - classification_loss: 0.3935 375/500 [=====================>........] - ETA: 31s - loss: 2.1005 - regression_loss: 1.7073 - classification_loss: 0.3932 376/500 [=====================>........] - ETA: 30s - loss: 2.1007 - regression_loss: 1.7076 - classification_loss: 0.3931 377/500 [=====================>........] - ETA: 30s - loss: 2.1005 - regression_loss: 1.7075 - classification_loss: 0.3930 378/500 [=====================>........] - ETA: 30s - loss: 2.1030 - regression_loss: 1.7098 - classification_loss: 0.3932 379/500 [=====================>........] - ETA: 30s - loss: 2.1020 - regression_loss: 1.7090 - classification_loss: 0.3929 380/500 [=====================>........] - ETA: 29s - loss: 2.1021 - regression_loss: 1.7092 - classification_loss: 0.3929 381/500 [=====================>........] - ETA: 29s - loss: 2.1036 - regression_loss: 1.7106 - classification_loss: 0.3930 382/500 [=====================>........] - ETA: 29s - loss: 2.1042 - regression_loss: 1.7113 - classification_loss: 0.3930 383/500 [=====================>........] - ETA: 29s - loss: 2.1042 - regression_loss: 1.7114 - classification_loss: 0.3928 384/500 [======================>.......] - ETA: 28s - loss: 2.1044 - regression_loss: 1.7116 - classification_loss: 0.3928 385/500 [======================>.......] - ETA: 28s - loss: 2.1047 - regression_loss: 1.7119 - classification_loss: 0.3928 386/500 [======================>.......] - ETA: 28s - loss: 2.1048 - regression_loss: 1.7120 - classification_loss: 0.3928 387/500 [======================>.......] - ETA: 28s - loss: 2.1036 - regression_loss: 1.7109 - classification_loss: 0.3927 388/500 [======================>.......] 
- ETA: 27s - loss: 2.1034 - regression_loss: 1.7108 - classification_loss: 0.3926 389/500 [======================>.......] - ETA: 27s - loss: 2.1013 - regression_loss: 1.7093 - classification_loss: 0.3920 390/500 [======================>.......] - ETA: 27s - loss: 2.1012 - regression_loss: 1.7092 - classification_loss: 0.3920 391/500 [======================>.......] - ETA: 27s - loss: 2.1014 - regression_loss: 1.7092 - classification_loss: 0.3923 392/500 [======================>.......] - ETA: 26s - loss: 2.1008 - regression_loss: 1.7085 - classification_loss: 0.3923 393/500 [======================>.......] - ETA: 26s - loss: 2.1015 - regression_loss: 1.7089 - classification_loss: 0.3926 394/500 [======================>.......] - ETA: 26s - loss: 2.1013 - regression_loss: 1.7090 - classification_loss: 0.3923 395/500 [======================>.......] - ETA: 26s - loss: 2.1008 - regression_loss: 1.7087 - classification_loss: 0.3921 396/500 [======================>.......] - ETA: 25s - loss: 2.1009 - regression_loss: 1.7090 - classification_loss: 0.3920 397/500 [======================>.......] - ETA: 25s - loss: 2.1017 - regression_loss: 1.7097 - classification_loss: 0.3920 398/500 [======================>.......] - ETA: 25s - loss: 2.1024 - regression_loss: 1.7103 - classification_loss: 0.3921 399/500 [======================>.......] - ETA: 25s - loss: 2.1028 - regression_loss: 1.7107 - classification_loss: 0.3921 400/500 [=======================>......] - ETA: 24s - loss: 2.1039 - regression_loss: 1.7118 - classification_loss: 0.3922 401/500 [=======================>......] - ETA: 24s - loss: 2.1046 - regression_loss: 1.7123 - classification_loss: 0.3923 402/500 [=======================>......] - ETA: 24s - loss: 2.1049 - regression_loss: 1.7127 - classification_loss: 0.3922 403/500 [=======================>......] - ETA: 24s - loss: 2.1050 - regression_loss: 1.7127 - classification_loss: 0.3922 404/500 [=======================>......] 
- ETA: 23s - loss: 2.1055 - regression_loss: 1.7133 - classification_loss: 0.3922 405/500 [=======================>......] - ETA: 23s - loss: 2.1041 - regression_loss: 1.7119 - classification_loss: 0.3921 406/500 [=======================>......] - ETA: 23s - loss: 2.1044 - regression_loss: 1.7121 - classification_loss: 0.3923 407/500 [=======================>......] - ETA: 23s - loss: 2.1042 - regression_loss: 1.7116 - classification_loss: 0.3926 408/500 [=======================>......] - ETA: 22s - loss: 2.1039 - regression_loss: 1.7115 - classification_loss: 0.3924 409/500 [=======================>......] - ETA: 22s - loss: 2.1026 - regression_loss: 1.7102 - classification_loss: 0.3924 410/500 [=======================>......] - ETA: 22s - loss: 2.1019 - regression_loss: 1.7096 - classification_loss: 0.3923 411/500 [=======================>......] - ETA: 22s - loss: 2.1020 - regression_loss: 1.7100 - classification_loss: 0.3920 412/500 [=======================>......] - ETA: 21s - loss: 2.0999 - regression_loss: 1.7082 - classification_loss: 0.3916 413/500 [=======================>......] - ETA: 21s - loss: 2.0995 - regression_loss: 1.7081 - classification_loss: 0.3914 414/500 [=======================>......] - ETA: 21s - loss: 2.0997 - regression_loss: 1.7083 - classification_loss: 0.3914 415/500 [=======================>......] - ETA: 21s - loss: 2.0999 - regression_loss: 1.7085 - classification_loss: 0.3915 416/500 [=======================>......] - ETA: 20s - loss: 2.1010 - regression_loss: 1.7096 - classification_loss: 0.3914 417/500 [========================>.....] - ETA: 20s - loss: 2.0995 - regression_loss: 1.7083 - classification_loss: 0.3911 418/500 [========================>.....] - ETA: 20s - loss: 2.0997 - regression_loss: 1.7088 - classification_loss: 0.3909 419/500 [========================>.....] - ETA: 20s - loss: 2.0994 - regression_loss: 1.7085 - classification_loss: 0.3909 420/500 [========================>.....] 
- ETA: 19s - loss: 2.0993 - regression_loss: 1.7085 - classification_loss: 0.3908 421/500 [========================>.....] - ETA: 19s - loss: 2.0999 - regression_loss: 1.7088 - classification_loss: 0.3910 422/500 [========================>.....] - ETA: 19s - loss: 2.0998 - regression_loss: 1.7089 - classification_loss: 0.3909 423/500 [========================>.....] - ETA: 19s - loss: 2.1003 - regression_loss: 1.7093 - classification_loss: 0.3909 424/500 [========================>.....] - ETA: 18s - loss: 2.1018 - regression_loss: 1.7105 - classification_loss: 0.3914 425/500 [========================>.....] - ETA: 18s - loss: 2.1007 - regression_loss: 1.7094 - classification_loss: 0.3912 426/500 [========================>.....] - ETA: 18s - loss: 2.0999 - regression_loss: 1.7090 - classification_loss: 0.3909 427/500 [========================>.....] - ETA: 18s - loss: 2.1009 - regression_loss: 1.7099 - classification_loss: 0.3910 428/500 [========================>.....] - ETA: 17s - loss: 2.1002 - regression_loss: 1.7093 - classification_loss: 0.3908 429/500 [========================>.....] - ETA: 17s - loss: 2.0978 - regression_loss: 1.7076 - classification_loss: 0.3902 430/500 [========================>.....] - ETA: 17s - loss: 2.0963 - regression_loss: 1.7062 - classification_loss: 0.3901 431/500 [========================>.....] - ETA: 17s - loss: 2.0969 - regression_loss: 1.7069 - classification_loss: 0.3901 432/500 [========================>.....] - ETA: 16s - loss: 2.0977 - regression_loss: 1.7073 - classification_loss: 0.3904 433/500 [========================>.....] - ETA: 16s - loss: 2.0961 - regression_loss: 1.7063 - classification_loss: 0.3899 434/500 [=========================>....] - ETA: 16s - loss: 2.0943 - regression_loss: 1.7050 - classification_loss: 0.3893 435/500 [=========================>....] - ETA: 16s - loss: 2.0947 - regression_loss: 1.7052 - classification_loss: 0.3894 436/500 [=========================>....] 
- ETA: 15s - loss: 2.0932 - regression_loss: 1.7042 - classification_loss: 0.3890 437/500 [=========================>....] - ETA: 15s - loss: 2.0947 - regression_loss: 1.7050 - classification_loss: 0.3897 438/500 [=========================>....] - ETA: 15s - loss: 2.0918 - regression_loss: 1.7027 - classification_loss: 0.3891 439/500 [=========================>....] - ETA: 15s - loss: 2.0908 - regression_loss: 1.7020 - classification_loss: 0.3888 440/500 [=========================>....] - ETA: 14s - loss: 2.0920 - regression_loss: 1.7029 - classification_loss: 0.3892 441/500 [=========================>....] - ETA: 14s - loss: 2.0918 - regression_loss: 1.7028 - classification_loss: 0.3890 442/500 [=========================>....] - ETA: 14s - loss: 2.0915 - regression_loss: 1.7027 - classification_loss: 0.3888 443/500 [=========================>....] - ETA: 14s - loss: 2.0903 - regression_loss: 1.7017 - classification_loss: 0.3886 444/500 [=========================>....] - ETA: 13s - loss: 2.0908 - regression_loss: 1.7015 - classification_loss: 0.3893 445/500 [=========================>....] - ETA: 13s - loss: 2.0949 - regression_loss: 1.7046 - classification_loss: 0.3902 446/500 [=========================>....] - ETA: 13s - loss: 2.0951 - regression_loss: 1.7049 - classification_loss: 0.3902 447/500 [=========================>....] - ETA: 13s - loss: 2.0931 - regression_loss: 1.7034 - classification_loss: 0.3898 448/500 [=========================>....] - ETA: 12s - loss: 2.0933 - regression_loss: 1.7038 - classification_loss: 0.3896 449/500 [=========================>....] - ETA: 12s - loss: 2.0928 - regression_loss: 1.7034 - classification_loss: 0.3894 450/500 [==========================>...] - ETA: 12s - loss: 2.0926 - regression_loss: 1.7033 - classification_loss: 0.3893 451/500 [==========================>...] - ETA: 12s - loss: 2.0922 - regression_loss: 1.7030 - classification_loss: 0.3892 452/500 [==========================>...] 
- ETA: 11s - loss: 2.0923 - regression_loss: 1.7031 - classification_loss: 0.3892
[per-batch progress updates for steps 453–499 omitted]
500/500 [==============================] - 125s 250ms/step - loss: 2.1039 - regression_loss: 1.7128 - classification_loss: 0.3911
1172 instances of class plum with average precision: 0.5442
mAP: 0.5442
Epoch 00033: saving model to ./training/snapshots/resnet50_pascal_33.h5
Epoch 34/150
[per-batch progress updates for steps 1–285 omitted]
286/500 [================>.............]
- ETA: 54s - loss: 2.0873 - regression_loss: 1.7053 - classification_loss: 0.3820 287/500 [================>.............] - ETA: 53s - loss: 2.0856 - regression_loss: 1.7036 - classification_loss: 0.3819 288/500 [================>.............] - ETA: 53s - loss: 2.0862 - regression_loss: 1.7042 - classification_loss: 0.3820 289/500 [================>.............] - ETA: 53s - loss: 2.0862 - regression_loss: 1.7044 - classification_loss: 0.3819 290/500 [================>.............] - ETA: 53s - loss: 2.0858 - regression_loss: 1.7042 - classification_loss: 0.3816 291/500 [================>.............] - ETA: 52s - loss: 2.0876 - regression_loss: 1.7053 - classification_loss: 0.3823 292/500 [================>.............] - ETA: 52s - loss: 2.0880 - regression_loss: 1.7053 - classification_loss: 0.3827 293/500 [================>.............] - ETA: 52s - loss: 2.0886 - regression_loss: 1.7060 - classification_loss: 0.3826 294/500 [================>.............] - ETA: 52s - loss: 2.0880 - regression_loss: 1.7055 - classification_loss: 0.3825 295/500 [================>.............] - ETA: 51s - loss: 2.0889 - regression_loss: 1.7063 - classification_loss: 0.3826 296/500 [================>.............] - ETA: 51s - loss: 2.0897 - regression_loss: 1.7070 - classification_loss: 0.3827 297/500 [================>.............] - ETA: 51s - loss: 2.0910 - regression_loss: 1.7080 - classification_loss: 0.3830 298/500 [================>.............] - ETA: 51s - loss: 2.0897 - regression_loss: 1.7069 - classification_loss: 0.3828 299/500 [================>.............] - ETA: 50s - loss: 2.0892 - regression_loss: 1.7066 - classification_loss: 0.3826 300/500 [=================>............] - ETA: 50s - loss: 2.0884 - regression_loss: 1.7056 - classification_loss: 0.3827 301/500 [=================>............] - ETA: 50s - loss: 2.0876 - regression_loss: 1.7046 - classification_loss: 0.3829 302/500 [=================>............] 
- ETA: 50s - loss: 2.0878 - regression_loss: 1.7050 - classification_loss: 0.3828 303/500 [=================>............] - ETA: 49s - loss: 2.0865 - regression_loss: 1.7040 - classification_loss: 0.3825 304/500 [=================>............] - ETA: 49s - loss: 2.0862 - regression_loss: 1.7039 - classification_loss: 0.3823 305/500 [=================>............] - ETA: 49s - loss: 2.0859 - regression_loss: 1.7036 - classification_loss: 0.3823 306/500 [=================>............] - ETA: 48s - loss: 2.0873 - regression_loss: 1.7047 - classification_loss: 0.3826 307/500 [=================>............] - ETA: 48s - loss: 2.0876 - regression_loss: 1.7052 - classification_loss: 0.3825 308/500 [=================>............] - ETA: 48s - loss: 2.0866 - regression_loss: 1.7043 - classification_loss: 0.3823 309/500 [=================>............] - ETA: 48s - loss: 2.0846 - regression_loss: 1.7027 - classification_loss: 0.3819 310/500 [=================>............] - ETA: 47s - loss: 2.0857 - regression_loss: 1.7035 - classification_loss: 0.3822 311/500 [=================>............] - ETA: 47s - loss: 2.0837 - regression_loss: 1.7019 - classification_loss: 0.3818 312/500 [=================>............] - ETA: 47s - loss: 2.0843 - regression_loss: 1.7026 - classification_loss: 0.3816 313/500 [=================>............] - ETA: 47s - loss: 2.0841 - regression_loss: 1.7023 - classification_loss: 0.3818 314/500 [=================>............] - ETA: 46s - loss: 2.0856 - regression_loss: 1.7038 - classification_loss: 0.3818 315/500 [=================>............] - ETA: 46s - loss: 2.0815 - regression_loss: 1.7005 - classification_loss: 0.3810 316/500 [=================>............] - ETA: 46s - loss: 2.0823 - regression_loss: 1.7013 - classification_loss: 0.3810 317/500 [==================>...........] - ETA: 46s - loss: 2.0819 - regression_loss: 1.7010 - classification_loss: 0.3809 318/500 [==================>...........] 
- ETA: 45s - loss: 2.0816 - regression_loss: 1.7006 - classification_loss: 0.3810 319/500 [==================>...........] - ETA: 45s - loss: 2.0832 - regression_loss: 1.7019 - classification_loss: 0.3813 320/500 [==================>...........] - ETA: 45s - loss: 2.0837 - regression_loss: 1.7024 - classification_loss: 0.3813 321/500 [==================>...........] - ETA: 45s - loss: 2.0835 - regression_loss: 1.7024 - classification_loss: 0.3811 322/500 [==================>...........] - ETA: 44s - loss: 2.0851 - regression_loss: 1.7038 - classification_loss: 0.3813 323/500 [==================>...........] - ETA: 44s - loss: 2.0845 - regression_loss: 1.7032 - classification_loss: 0.3812 324/500 [==================>...........] - ETA: 44s - loss: 2.0850 - regression_loss: 1.7035 - classification_loss: 0.3815 325/500 [==================>...........] - ETA: 44s - loss: 2.0830 - regression_loss: 1.7019 - classification_loss: 0.3810 326/500 [==================>...........] - ETA: 43s - loss: 2.0812 - regression_loss: 1.6997 - classification_loss: 0.3816 327/500 [==================>...........] - ETA: 43s - loss: 2.0826 - regression_loss: 1.7007 - classification_loss: 0.3819 328/500 [==================>...........] - ETA: 43s - loss: 2.0843 - regression_loss: 1.7016 - classification_loss: 0.3827 329/500 [==================>...........] - ETA: 43s - loss: 2.0830 - regression_loss: 1.7007 - classification_loss: 0.3823 330/500 [==================>...........] - ETA: 42s - loss: 2.0827 - regression_loss: 1.7006 - classification_loss: 0.3821 331/500 [==================>...........] - ETA: 42s - loss: 2.0803 - regression_loss: 1.6987 - classification_loss: 0.3816 332/500 [==================>...........] - ETA: 42s - loss: 2.0796 - regression_loss: 1.6982 - classification_loss: 0.3814 333/500 [==================>...........] - ETA: 42s - loss: 2.0809 - regression_loss: 1.6992 - classification_loss: 0.3818 334/500 [===================>..........] 
- ETA: 41s - loss: 2.0807 - regression_loss: 1.6990 - classification_loss: 0.3816 335/500 [===================>..........] - ETA: 41s - loss: 2.0797 - regression_loss: 1.6984 - classification_loss: 0.3814 336/500 [===================>..........] - ETA: 41s - loss: 2.0799 - regression_loss: 1.6987 - classification_loss: 0.3813 337/500 [===================>..........] - ETA: 41s - loss: 2.0799 - regression_loss: 1.6988 - classification_loss: 0.3811 338/500 [===================>..........] - ETA: 40s - loss: 2.0803 - regression_loss: 1.6991 - classification_loss: 0.3812 339/500 [===================>..........] - ETA: 40s - loss: 2.0808 - regression_loss: 1.6995 - classification_loss: 0.3813 340/500 [===================>..........] - ETA: 40s - loss: 2.0801 - regression_loss: 1.6991 - classification_loss: 0.3810 341/500 [===================>..........] - ETA: 40s - loss: 2.0803 - regression_loss: 1.6994 - classification_loss: 0.3809 342/500 [===================>..........] - ETA: 39s - loss: 2.0799 - regression_loss: 1.6992 - classification_loss: 0.3807 343/500 [===================>..........] - ETA: 39s - loss: 2.0812 - regression_loss: 1.7003 - classification_loss: 0.3809 344/500 [===================>..........] - ETA: 39s - loss: 2.0808 - regression_loss: 1.6997 - classification_loss: 0.3811 345/500 [===================>..........] - ETA: 39s - loss: 2.0785 - regression_loss: 1.6977 - classification_loss: 0.3808 346/500 [===================>..........] - ETA: 38s - loss: 2.0779 - regression_loss: 1.6973 - classification_loss: 0.3806 347/500 [===================>..........] - ETA: 38s - loss: 2.0779 - regression_loss: 1.6972 - classification_loss: 0.3807 348/500 [===================>..........] - ETA: 38s - loss: 2.0784 - regression_loss: 1.6974 - classification_loss: 0.3809 349/500 [===================>..........] - ETA: 38s - loss: 2.0785 - regression_loss: 1.6977 - classification_loss: 0.3808 350/500 [====================>.........] 
- ETA: 37s - loss: 2.0789 - regression_loss: 1.6983 - classification_loss: 0.3806 351/500 [====================>.........] - ETA: 37s - loss: 2.0762 - regression_loss: 1.6961 - classification_loss: 0.3801 352/500 [====================>.........] - ETA: 37s - loss: 2.0761 - regression_loss: 1.6962 - classification_loss: 0.3799 353/500 [====================>.........] - ETA: 37s - loss: 2.0761 - regression_loss: 1.6961 - classification_loss: 0.3800 354/500 [====================>.........] - ETA: 36s - loss: 2.0753 - regression_loss: 1.6955 - classification_loss: 0.3798 355/500 [====================>.........] - ETA: 36s - loss: 2.0770 - regression_loss: 1.6969 - classification_loss: 0.3801 356/500 [====================>.........] - ETA: 36s - loss: 2.0769 - regression_loss: 1.6969 - classification_loss: 0.3800 357/500 [====================>.........] - ETA: 36s - loss: 2.0779 - regression_loss: 1.6964 - classification_loss: 0.3815 358/500 [====================>.........] - ETA: 35s - loss: 2.0792 - regression_loss: 1.6978 - classification_loss: 0.3814 359/500 [====================>.........] - ETA: 35s - loss: 2.0791 - regression_loss: 1.6976 - classification_loss: 0.3815 360/500 [====================>.........] - ETA: 35s - loss: 2.0812 - regression_loss: 1.6991 - classification_loss: 0.3821 361/500 [====================>.........] - ETA: 35s - loss: 2.0811 - regression_loss: 1.6992 - classification_loss: 0.3819 362/500 [====================>.........] - ETA: 34s - loss: 2.0812 - regression_loss: 1.6991 - classification_loss: 0.3821 363/500 [====================>.........] - ETA: 34s - loss: 2.0819 - regression_loss: 1.6996 - classification_loss: 0.3823 364/500 [====================>.........] - ETA: 34s - loss: 2.0826 - regression_loss: 1.7002 - classification_loss: 0.3823 365/500 [====================>.........] - ETA: 34s - loss: 2.0841 - regression_loss: 1.7014 - classification_loss: 0.3827 366/500 [====================>.........] 
- ETA: 33s - loss: 2.0832 - regression_loss: 1.7008 - classification_loss: 0.3824 367/500 [=====================>........] - ETA: 33s - loss: 2.0838 - regression_loss: 1.7014 - classification_loss: 0.3823 368/500 [=====================>........] - ETA: 33s - loss: 2.0847 - regression_loss: 1.7020 - classification_loss: 0.3827 369/500 [=====================>........] - ETA: 33s - loss: 2.0863 - regression_loss: 1.7034 - classification_loss: 0.3829 370/500 [=====================>........] - ETA: 32s - loss: 2.0854 - regression_loss: 1.7028 - classification_loss: 0.3826 371/500 [=====================>........] - ETA: 32s - loss: 2.0864 - regression_loss: 1.7017 - classification_loss: 0.3847 372/500 [=====================>........] - ETA: 32s - loss: 2.0865 - regression_loss: 1.7018 - classification_loss: 0.3847 373/500 [=====================>........] - ETA: 32s - loss: 2.0867 - regression_loss: 1.7016 - classification_loss: 0.3851 374/500 [=====================>........] - ETA: 31s - loss: 2.0844 - regression_loss: 1.7000 - classification_loss: 0.3844 375/500 [=====================>........] - ETA: 31s - loss: 2.0831 - regression_loss: 1.6990 - classification_loss: 0.3841 376/500 [=====================>........] - ETA: 31s - loss: 2.0814 - regression_loss: 1.6976 - classification_loss: 0.3838 377/500 [=====================>........] - ETA: 31s - loss: 2.0799 - regression_loss: 1.6965 - classification_loss: 0.3834 378/500 [=====================>........] - ETA: 30s - loss: 2.0796 - regression_loss: 1.6964 - classification_loss: 0.3832 379/500 [=====================>........] - ETA: 30s - loss: 2.0804 - regression_loss: 1.6970 - classification_loss: 0.3834 380/500 [=====================>........] - ETA: 30s - loss: 2.0813 - regression_loss: 1.6975 - classification_loss: 0.3838 381/500 [=====================>........] - ETA: 30s - loss: 2.0810 - regression_loss: 1.6974 - classification_loss: 0.3835 382/500 [=====================>........] 
- ETA: 29s - loss: 2.0804 - regression_loss: 1.6971 - classification_loss: 0.3833 383/500 [=====================>........] - ETA: 29s - loss: 2.0802 - regression_loss: 1.6969 - classification_loss: 0.3833 384/500 [======================>.......] - ETA: 29s - loss: 2.0811 - regression_loss: 1.6977 - classification_loss: 0.3835 385/500 [======================>.......] - ETA: 29s - loss: 2.0815 - regression_loss: 1.6979 - classification_loss: 0.3836 386/500 [======================>.......] - ETA: 28s - loss: 2.0832 - regression_loss: 1.6992 - classification_loss: 0.3840 387/500 [======================>.......] - ETA: 28s - loss: 2.0825 - regression_loss: 1.6989 - classification_loss: 0.3836 388/500 [======================>.......] - ETA: 28s - loss: 2.0835 - regression_loss: 1.6999 - classification_loss: 0.3836 389/500 [======================>.......] - ETA: 27s - loss: 2.0840 - regression_loss: 1.7004 - classification_loss: 0.3836 390/500 [======================>.......] - ETA: 27s - loss: 2.0838 - regression_loss: 1.7004 - classification_loss: 0.3834 391/500 [======================>.......] - ETA: 27s - loss: 2.0839 - regression_loss: 1.7004 - classification_loss: 0.3835 392/500 [======================>.......] - ETA: 27s - loss: 2.0825 - regression_loss: 1.6995 - classification_loss: 0.3831 393/500 [======================>.......] - ETA: 26s - loss: 2.0810 - regression_loss: 1.6981 - classification_loss: 0.3829 394/500 [======================>.......] - ETA: 26s - loss: 2.0807 - regression_loss: 1.6980 - classification_loss: 0.3827 395/500 [======================>.......] - ETA: 26s - loss: 2.0813 - regression_loss: 1.6986 - classification_loss: 0.3828 396/500 [======================>.......] - ETA: 26s - loss: 2.0817 - regression_loss: 1.6990 - classification_loss: 0.3827 397/500 [======================>.......] - ETA: 25s - loss: 2.0811 - regression_loss: 1.6986 - classification_loss: 0.3825 398/500 [======================>.......] 
- ETA: 25s - loss: 2.0819 - regression_loss: 1.6991 - classification_loss: 0.3828 399/500 [======================>.......] - ETA: 25s - loss: 2.0815 - regression_loss: 1.6989 - classification_loss: 0.3827 400/500 [=======================>......] - ETA: 25s - loss: 2.0799 - regression_loss: 1.6975 - classification_loss: 0.3824 401/500 [=======================>......] - ETA: 24s - loss: 2.0783 - regression_loss: 1.6963 - classification_loss: 0.3820 402/500 [=======================>......] - ETA: 24s - loss: 2.0791 - regression_loss: 1.6970 - classification_loss: 0.3821 403/500 [=======================>......] - ETA: 24s - loss: 2.0791 - regression_loss: 1.6972 - classification_loss: 0.3819 404/500 [=======================>......] - ETA: 24s - loss: 2.0767 - regression_loss: 1.6954 - classification_loss: 0.3813 405/500 [=======================>......] - ETA: 23s - loss: 2.0778 - regression_loss: 1.6963 - classification_loss: 0.3815 406/500 [=======================>......] - ETA: 23s - loss: 2.0779 - regression_loss: 1.6963 - classification_loss: 0.3815 407/500 [=======================>......] - ETA: 23s - loss: 2.0775 - regression_loss: 1.6962 - classification_loss: 0.3813 408/500 [=======================>......] - ETA: 23s - loss: 2.0775 - regression_loss: 1.6961 - classification_loss: 0.3813 409/500 [=======================>......] - ETA: 22s - loss: 2.0770 - regression_loss: 1.6960 - classification_loss: 0.3811 410/500 [=======================>......] - ETA: 22s - loss: 2.0784 - regression_loss: 1.6972 - classification_loss: 0.3811 411/500 [=======================>......] - ETA: 22s - loss: 2.0784 - regression_loss: 1.6972 - classification_loss: 0.3812 412/500 [=======================>......] - ETA: 22s - loss: 2.0790 - regression_loss: 1.6978 - classification_loss: 0.3812 413/500 [=======================>......] - ETA: 21s - loss: 2.0787 - regression_loss: 1.6974 - classification_loss: 0.3813 414/500 [=======================>......] 
- ETA: 21s - loss: 2.0789 - regression_loss: 1.6976 - classification_loss: 0.3813 415/500 [=======================>......] - ETA: 21s - loss: 2.0784 - regression_loss: 1.6973 - classification_loss: 0.3811 416/500 [=======================>......] - ETA: 21s - loss: 2.0764 - regression_loss: 1.6957 - classification_loss: 0.3807 417/500 [========================>.....] - ETA: 20s - loss: 2.0772 - regression_loss: 1.6963 - classification_loss: 0.3809 418/500 [========================>.....] - ETA: 20s - loss: 2.0784 - regression_loss: 1.6972 - classification_loss: 0.3812 419/500 [========================>.....] - ETA: 20s - loss: 2.0796 - regression_loss: 1.6981 - classification_loss: 0.3814 420/500 [========================>.....] - ETA: 20s - loss: 2.0795 - regression_loss: 1.6983 - classification_loss: 0.3813 421/500 [========================>.....] - ETA: 19s - loss: 2.0801 - regression_loss: 1.6987 - classification_loss: 0.3814 422/500 [========================>.....] - ETA: 19s - loss: 2.0815 - regression_loss: 1.6997 - classification_loss: 0.3818 423/500 [========================>.....] - ETA: 19s - loss: 2.0816 - regression_loss: 1.6994 - classification_loss: 0.3823 424/500 [========================>.....] - ETA: 19s - loss: 2.0821 - regression_loss: 1.6999 - classification_loss: 0.3823 425/500 [========================>.....] - ETA: 18s - loss: 2.0838 - regression_loss: 1.7013 - classification_loss: 0.3824 426/500 [========================>.....] - ETA: 18s - loss: 2.0852 - regression_loss: 1.7024 - classification_loss: 0.3827 427/500 [========================>.....] - ETA: 18s - loss: 2.0836 - regression_loss: 1.7012 - classification_loss: 0.3824 428/500 [========================>.....] - ETA: 18s - loss: 2.0834 - regression_loss: 1.7011 - classification_loss: 0.3823 429/500 [========================>.....] - ETA: 17s - loss: 2.0837 - regression_loss: 1.7013 - classification_loss: 0.3824 430/500 [========================>.....] 
- ETA: 17s - loss: 2.0823 - regression_loss: 1.7002 - classification_loss: 0.3821 431/500 [========================>.....] - ETA: 17s - loss: 2.0838 - regression_loss: 1.7013 - classification_loss: 0.3825 432/500 [========================>.....] - ETA: 17s - loss: 2.0828 - regression_loss: 1.7005 - classification_loss: 0.3823 433/500 [========================>.....] - ETA: 16s - loss: 2.0839 - regression_loss: 1.7013 - classification_loss: 0.3826 434/500 [=========================>....] - ETA: 16s - loss: 2.0841 - regression_loss: 1.7017 - classification_loss: 0.3825 435/500 [=========================>....] - ETA: 16s - loss: 2.0847 - regression_loss: 1.7021 - classification_loss: 0.3825 436/500 [=========================>....] - ETA: 16s - loss: 2.0834 - regression_loss: 1.7012 - classification_loss: 0.3823 437/500 [=========================>....] - ETA: 15s - loss: 2.0844 - regression_loss: 1.7020 - classification_loss: 0.3824 438/500 [=========================>....] - ETA: 15s - loss: 2.0834 - regression_loss: 1.7013 - classification_loss: 0.3821 439/500 [=========================>....] - ETA: 15s - loss: 2.0826 - regression_loss: 1.7008 - classification_loss: 0.3818 440/500 [=========================>....] - ETA: 15s - loss: 2.0816 - regression_loss: 1.7001 - classification_loss: 0.3815 441/500 [=========================>....] - ETA: 14s - loss: 2.0806 - regression_loss: 1.6993 - classification_loss: 0.3813 442/500 [=========================>....] - ETA: 14s - loss: 2.0802 - regression_loss: 1.6989 - classification_loss: 0.3813 443/500 [=========================>....] - ETA: 14s - loss: 2.0802 - regression_loss: 1.6991 - classification_loss: 0.3812 444/500 [=========================>....] - ETA: 14s - loss: 2.0797 - regression_loss: 1.6987 - classification_loss: 0.3810 445/500 [=========================>....] - ETA: 13s - loss: 2.0812 - regression_loss: 1.6999 - classification_loss: 0.3813 446/500 [=========================>....] 
- ETA: 13s - loss: 2.0816 - regression_loss: 1.7004 - classification_loss: 0.3812 447/500 [=========================>....] - ETA: 13s - loss: 2.0831 - regression_loss: 1.7016 - classification_loss: 0.3815 448/500 [=========================>....] - ETA: 13s - loss: 2.0838 - regression_loss: 1.7020 - classification_loss: 0.3818 449/500 [=========================>....] - ETA: 12s - loss: 2.0840 - regression_loss: 1.7022 - classification_loss: 0.3818 450/500 [==========================>...] - ETA: 12s - loss: 2.0841 - regression_loss: 1.7024 - classification_loss: 0.3818 451/500 [==========================>...] - ETA: 12s - loss: 2.0820 - regression_loss: 1.7005 - classification_loss: 0.3815 452/500 [==========================>...] - ETA: 12s - loss: 2.0823 - regression_loss: 1.7009 - classification_loss: 0.3815 453/500 [==========================>...] - ETA: 11s - loss: 2.0835 - regression_loss: 1.7017 - classification_loss: 0.3818 454/500 [==========================>...] - ETA: 11s - loss: 2.0831 - regression_loss: 1.7015 - classification_loss: 0.3817 455/500 [==========================>...] - ETA: 11s - loss: 2.0842 - regression_loss: 1.7022 - classification_loss: 0.3820 456/500 [==========================>...] - ETA: 11s - loss: 2.0844 - regression_loss: 1.7023 - classification_loss: 0.3821 457/500 [==========================>...] - ETA: 10s - loss: 2.0826 - regression_loss: 1.7007 - classification_loss: 0.3819 458/500 [==========================>...] - ETA: 10s - loss: 2.0831 - regression_loss: 1.7011 - classification_loss: 0.3820 459/500 [==========================>...] - ETA: 10s - loss: 2.0805 - regression_loss: 1.6990 - classification_loss: 0.3814 460/500 [==========================>...] - ETA: 10s - loss: 2.0806 - regression_loss: 1.6992 - classification_loss: 0.3814 461/500 [==========================>...] - ETA: 9s - loss: 2.0811 - regression_loss: 1.6996 - classification_loss: 0.3815  462/500 [==========================>...] 
- ETA: 9s - loss: 2.0847 - regression_loss: 1.7008 - classification_loss: 0.3839 463/500 [==========================>...] - ETA: 9s - loss: 2.0848 - regression_loss: 1.7007 - classification_loss: 0.3841 464/500 [==========================>...] - ETA: 9s - loss: 2.0855 - regression_loss: 1.7015 - classification_loss: 0.3840 465/500 [==========================>...] - ETA: 8s - loss: 2.0853 - regression_loss: 1.7015 - classification_loss: 0.3838 466/500 [==========================>...] - ETA: 8s - loss: 2.0835 - regression_loss: 1.7000 - classification_loss: 0.3835 467/500 [===========================>..] - ETA: 8s - loss: 2.0840 - regression_loss: 1.7004 - classification_loss: 0.3836 468/500 [===========================>..] - ETA: 8s - loss: 2.0850 - regression_loss: 1.7014 - classification_loss: 0.3835 469/500 [===========================>..] - ETA: 7s - loss: 2.0847 - regression_loss: 1.7013 - classification_loss: 0.3834 470/500 [===========================>..] - ETA: 7s - loss: 2.0847 - regression_loss: 1.7011 - classification_loss: 0.3836 471/500 [===========================>..] - ETA: 7s - loss: 2.0842 - regression_loss: 1.7007 - classification_loss: 0.3835 472/500 [===========================>..] - ETA: 7s - loss: 2.0844 - regression_loss: 1.7008 - classification_loss: 0.3836 473/500 [===========================>..] - ETA: 6s - loss: 2.0840 - regression_loss: 1.7003 - classification_loss: 0.3836 474/500 [===========================>..] - ETA: 6s - loss: 2.0836 - regression_loss: 1.7002 - classification_loss: 0.3834 475/500 [===========================>..] - ETA: 6s - loss: 2.0839 - regression_loss: 1.7004 - classification_loss: 0.3835 476/500 [===========================>..] - ETA: 6s - loss: 2.0813 - regression_loss: 1.6983 - classification_loss: 0.3830 477/500 [===========================>..] - ETA: 5s - loss: 2.0817 - regression_loss: 1.6986 - classification_loss: 0.3831 478/500 [===========================>..] 
- ETA: 5s - loss: 2.0803 - regression_loss: 1.6974 - classification_loss: 0.3829 479/500 [===========================>..] - ETA: 5s - loss: 2.0801 - regression_loss: 1.6971 - classification_loss: 0.3830 480/500 [===========================>..] - ETA: 5s - loss: 2.0779 - regression_loss: 1.6954 - classification_loss: 0.3825 481/500 [===========================>..] - ETA: 4s - loss: 2.0774 - regression_loss: 1.6949 - classification_loss: 0.3826 482/500 [===========================>..] - ETA: 4s - loss: 2.0797 - regression_loss: 1.6969 - classification_loss: 0.3829 483/500 [===========================>..] - ETA: 4s - loss: 2.0798 - regression_loss: 1.6970 - classification_loss: 0.3828 484/500 [============================>.] - ETA: 4s - loss: 2.0792 - regression_loss: 1.6964 - classification_loss: 0.3827 485/500 [============================>.] - ETA: 3s - loss: 2.0790 - regression_loss: 1.6965 - classification_loss: 0.3824 486/500 [============================>.] - ETA: 3s - loss: 2.0802 - regression_loss: 1.6976 - classification_loss: 0.3827 487/500 [============================>.] - ETA: 3s - loss: 2.0805 - regression_loss: 1.6979 - classification_loss: 0.3827 488/500 [============================>.] - ETA: 3s - loss: 2.0803 - regression_loss: 1.6977 - classification_loss: 0.3826 489/500 [============================>.] - ETA: 2s - loss: 2.0794 - regression_loss: 1.6969 - classification_loss: 0.3826 490/500 [============================>.] - ETA: 2s - loss: 2.0799 - regression_loss: 1.6973 - classification_loss: 0.3826 491/500 [============================>.] - ETA: 2s - loss: 2.0796 - regression_loss: 1.6970 - classification_loss: 0.3825 492/500 [============================>.] - ETA: 2s - loss: 2.0799 - regression_loss: 1.6973 - classification_loss: 0.3826 493/500 [============================>.] - ETA: 1s - loss: 2.0792 - regression_loss: 1.6967 - classification_loss: 0.3824 494/500 [============================>.] 
- ETA: 1s - loss: 2.0798 - regression_loss: 1.6974 - classification_loss: 0.3824 495/500 [============================>.] - ETA: 1s - loss: 2.0801 - regression_loss: 1.6977 - classification_loss: 0.3824 496/500 [============================>.] - ETA: 1s - loss: 2.0823 - regression_loss: 1.6994 - classification_loss: 0.3829 497/500 [============================>.] - ETA: 0s - loss: 2.0816 - regression_loss: 1.6986 - classification_loss: 0.3829 498/500 [============================>.] - ETA: 0s - loss: 2.0819 - regression_loss: 1.6988 - classification_loss: 0.3831 499/500 [============================>.] - ETA: 0s - loss: 2.0813 - regression_loss: 1.6984 - classification_loss: 0.3829 500/500 [==============================] - 126s 252ms/step - loss: 2.0810 - regression_loss: 1.6982 - classification_loss: 0.3828 1172 instances of class plum with average precision: 0.5188 mAP: 0.5188 Epoch 00034: saving model to ./training/snapshots/resnet50_pascal_34.h5 Epoch 35/150 1/500 [..............................] - ETA: 1:54 - loss: 0.8603 - regression_loss: 0.6573 - classification_loss: 0.2030 2/500 [..............................] - ETA: 1:56 - loss: 1.7530 - regression_loss: 1.4229 - classification_loss: 0.3300 3/500 [..............................] - ETA: 1:58 - loss: 1.8896 - regression_loss: 1.5452 - classification_loss: 0.3444 4/500 [..............................] - ETA: 2:00 - loss: 1.9880 - regression_loss: 1.6184 - classification_loss: 0.3696 5/500 [..............................] - ETA: 2:01 - loss: 1.9764 - regression_loss: 1.6095 - classification_loss: 0.3669 6/500 [..............................] - ETA: 2:01 - loss: 1.9414 - regression_loss: 1.5780 - classification_loss: 0.3634 7/500 [..............................] - ETA: 2:01 - loss: 1.9128 - regression_loss: 1.5614 - classification_loss: 0.3514 8/500 [..............................] - ETA: 2:01 - loss: 2.1162 - regression_loss: 1.7396 - classification_loss: 0.3767 9/500 [..............................] 
[per-batch progress output for steps 9-57 of epoch 35 elided; the running loss settled in the 2.00-2.12 range (regression_loss ~1.62-1.77, classification_loss ~0.36-0.39)]
- ETA: 1:50 - loss: 2.1122 - regression_loss: 1.7190 - classification_loss: 0.3932 58/500 [==>...........................] - ETA: 1:50 - loss: 2.1068 - regression_loss: 1.7134 - classification_loss: 0.3934 59/500 [==>...........................] - ETA: 1:50 - loss: 2.0993 - regression_loss: 1.7067 - classification_loss: 0.3926 60/500 [==>...........................] - ETA: 1:50 - loss: 2.1081 - regression_loss: 1.7147 - classification_loss: 0.3934 61/500 [==>...........................] - ETA: 1:49 - loss: 2.1103 - regression_loss: 1.7172 - classification_loss: 0.3931 62/500 [==>...........................] - ETA: 1:49 - loss: 2.1199 - regression_loss: 1.7271 - classification_loss: 0.3928 63/500 [==>...........................] - ETA: 1:49 - loss: 2.1095 - regression_loss: 1.7180 - classification_loss: 0.3915 64/500 [==>...........................] - ETA: 1:49 - loss: 2.1033 - regression_loss: 1.7138 - classification_loss: 0.3895 65/500 [==>...........................] - ETA: 1:48 - loss: 2.1047 - regression_loss: 1.7128 - classification_loss: 0.3919 66/500 [==>...........................] - ETA: 1:48 - loss: 2.1067 - regression_loss: 1.7145 - classification_loss: 0.3922 67/500 [===>..........................] - ETA: 1:48 - loss: 2.0950 - regression_loss: 1.7046 - classification_loss: 0.3904 68/500 [===>..........................] - ETA: 1:48 - loss: 2.0969 - regression_loss: 1.7067 - classification_loss: 0.3902 69/500 [===>..........................] - ETA: 1:47 - loss: 2.0902 - regression_loss: 1.7010 - classification_loss: 0.3892 70/500 [===>..........................] - ETA: 1:47 - loss: 2.0953 - regression_loss: 1.7039 - classification_loss: 0.3914 71/500 [===>..........................] - ETA: 1:47 - loss: 2.0904 - regression_loss: 1.6991 - classification_loss: 0.3913 72/500 [===>..........................] - ETA: 1:47 - loss: 2.0987 - regression_loss: 1.7057 - classification_loss: 0.3930 73/500 [===>..........................] 
- ETA: 1:46 - loss: 2.0992 - regression_loss: 1.7055 - classification_loss: 0.3937 74/500 [===>..........................] - ETA: 1:46 - loss: 2.1005 - regression_loss: 1.7067 - classification_loss: 0.3939 75/500 [===>..........................] - ETA: 1:45 - loss: 2.0957 - regression_loss: 1.7027 - classification_loss: 0.3930 76/500 [===>..........................] - ETA: 1:45 - loss: 2.0931 - regression_loss: 1.7004 - classification_loss: 0.3928 77/500 [===>..........................] - ETA: 1:44 - loss: 2.0974 - regression_loss: 1.7039 - classification_loss: 0.3935 78/500 [===>..........................] - ETA: 1:44 - loss: 2.1035 - regression_loss: 1.7084 - classification_loss: 0.3951 79/500 [===>..........................] - ETA: 1:44 - loss: 2.1044 - regression_loss: 1.7097 - classification_loss: 0.3947 80/500 [===>..........................] - ETA: 1:44 - loss: 2.1047 - regression_loss: 1.7112 - classification_loss: 0.3935 81/500 [===>..........................] - ETA: 1:43 - loss: 2.1099 - regression_loss: 1.7165 - classification_loss: 0.3933 82/500 [===>..........................] - ETA: 1:43 - loss: 2.1055 - regression_loss: 1.7131 - classification_loss: 0.3924 83/500 [===>..........................] - ETA: 1:43 - loss: 2.1049 - regression_loss: 1.7130 - classification_loss: 0.3919 84/500 [====>.........................] - ETA: 1:43 - loss: 2.1059 - regression_loss: 1.7147 - classification_loss: 0.3911 85/500 [====>.........................] - ETA: 1:43 - loss: 2.1095 - regression_loss: 1.7163 - classification_loss: 0.3932 86/500 [====>.........................] - ETA: 1:42 - loss: 2.1153 - regression_loss: 1.7195 - classification_loss: 0.3958 87/500 [====>.........................] - ETA: 1:42 - loss: 2.1192 - regression_loss: 1.7217 - classification_loss: 0.3975 88/500 [====>.........................] - ETA: 1:42 - loss: 2.1192 - regression_loss: 1.7214 - classification_loss: 0.3977 89/500 [====>.........................] 
- ETA: 1:42 - loss: 2.1154 - regression_loss: 1.7190 - classification_loss: 0.3964 90/500 [====>.........................] - ETA: 1:41 - loss: 2.1183 - regression_loss: 1.7204 - classification_loss: 0.3979 91/500 [====>.........................] - ETA: 1:41 - loss: 2.1215 - regression_loss: 1.7239 - classification_loss: 0.3976 92/500 [====>.........................] - ETA: 1:41 - loss: 2.1136 - regression_loss: 1.7165 - classification_loss: 0.3971 93/500 [====>.........................] - ETA: 1:41 - loss: 2.1151 - regression_loss: 1.7173 - classification_loss: 0.3978 94/500 [====>.........................] - ETA: 1:41 - loss: 2.1139 - regression_loss: 1.7163 - classification_loss: 0.3977 95/500 [====>.........................] - ETA: 1:40 - loss: 2.1202 - regression_loss: 1.7204 - classification_loss: 0.3998 96/500 [====>.........................] - ETA: 1:40 - loss: 2.1257 - regression_loss: 1.7247 - classification_loss: 0.4009 97/500 [====>.........................] - ETA: 1:40 - loss: 2.1194 - regression_loss: 1.7193 - classification_loss: 0.4001 98/500 [====>.........................] - ETA: 1:40 - loss: 2.1109 - regression_loss: 1.7130 - classification_loss: 0.3979 99/500 [====>.........................] - ETA: 1:39 - loss: 2.1103 - regression_loss: 1.7135 - classification_loss: 0.3968 100/500 [=====>........................] - ETA: 1:39 - loss: 2.1116 - regression_loss: 1.7144 - classification_loss: 0.3972 101/500 [=====>........................] - ETA: 1:39 - loss: 2.1136 - regression_loss: 1.7153 - classification_loss: 0.3983 102/500 [=====>........................] - ETA: 1:39 - loss: 2.1157 - regression_loss: 1.7174 - classification_loss: 0.3983 103/500 [=====>........................] - ETA: 1:38 - loss: 2.1181 - regression_loss: 1.7189 - classification_loss: 0.3992 104/500 [=====>........................] - ETA: 1:38 - loss: 2.1147 - regression_loss: 1.7165 - classification_loss: 0.3982 105/500 [=====>........................] 
- ETA: 1:38 - loss: 2.1073 - regression_loss: 1.7102 - classification_loss: 0.3971 106/500 [=====>........................] - ETA: 1:38 - loss: 2.1086 - regression_loss: 1.7110 - classification_loss: 0.3975 107/500 [=====>........................] - ETA: 1:37 - loss: 2.1150 - regression_loss: 1.7147 - classification_loss: 0.4003 108/500 [=====>........................] - ETA: 1:37 - loss: 2.1130 - regression_loss: 1.7135 - classification_loss: 0.3995 109/500 [=====>........................] - ETA: 1:37 - loss: 2.1152 - regression_loss: 1.7160 - classification_loss: 0.3993 110/500 [=====>........................] - ETA: 1:37 - loss: 2.1074 - regression_loss: 1.7101 - classification_loss: 0.3973 111/500 [=====>........................] - ETA: 1:37 - loss: 2.1127 - regression_loss: 1.7144 - classification_loss: 0.3983 112/500 [=====>........................] - ETA: 1:36 - loss: 2.1117 - regression_loss: 1.7140 - classification_loss: 0.3977 113/500 [=====>........................] - ETA: 1:36 - loss: 2.1054 - regression_loss: 1.7077 - classification_loss: 0.3976 114/500 [=====>........................] - ETA: 1:36 - loss: 2.1087 - regression_loss: 1.7110 - classification_loss: 0.3976 115/500 [=====>........................] - ETA: 1:35 - loss: 2.1145 - regression_loss: 1.7154 - classification_loss: 0.3991 116/500 [=====>........................] - ETA: 1:35 - loss: 2.1156 - regression_loss: 1.7159 - classification_loss: 0.3997 117/500 [======>.......................] - ETA: 1:35 - loss: 2.1108 - regression_loss: 1.7117 - classification_loss: 0.3991 118/500 [======>.......................] - ETA: 1:35 - loss: 2.1124 - regression_loss: 1.7132 - classification_loss: 0.3991 119/500 [======>.......................] - ETA: 1:35 - loss: 2.1067 - regression_loss: 1.7098 - classification_loss: 0.3969 120/500 [======>.......................] - ETA: 1:34 - loss: 2.1073 - regression_loss: 1.7105 - classification_loss: 0.3968 121/500 [======>.......................] 
- ETA: 1:34 - loss: 2.1112 - regression_loss: 1.7133 - classification_loss: 0.3979 122/500 [======>.......................] - ETA: 1:34 - loss: 2.1093 - regression_loss: 1.7121 - classification_loss: 0.3971 123/500 [======>.......................] - ETA: 1:34 - loss: 2.1092 - regression_loss: 1.7126 - classification_loss: 0.3967 124/500 [======>.......................] - ETA: 1:33 - loss: 2.1100 - regression_loss: 1.7129 - classification_loss: 0.3971 125/500 [======>.......................] - ETA: 1:33 - loss: 2.1168 - regression_loss: 1.7180 - classification_loss: 0.3988 126/500 [======>.......................] - ETA: 1:33 - loss: 2.1105 - regression_loss: 1.7126 - classification_loss: 0.3979 127/500 [======>.......................] - ETA: 1:33 - loss: 2.1107 - regression_loss: 1.7129 - classification_loss: 0.3978 128/500 [======>.......................] - ETA: 1:32 - loss: 2.1088 - regression_loss: 1.7114 - classification_loss: 0.3973 129/500 [======>.......................] - ETA: 1:32 - loss: 2.1062 - regression_loss: 1.7098 - classification_loss: 0.3964 130/500 [======>.......................] - ETA: 1:32 - loss: 2.1049 - regression_loss: 1.7083 - classification_loss: 0.3967 131/500 [======>.......................] - ETA: 1:32 - loss: 2.1040 - regression_loss: 1.7078 - classification_loss: 0.3961 132/500 [======>.......................] - ETA: 1:31 - loss: 2.1035 - regression_loss: 1.7080 - classification_loss: 0.3955 133/500 [======>.......................] - ETA: 1:31 - loss: 2.1054 - regression_loss: 1.7103 - classification_loss: 0.3950 134/500 [=======>......................] - ETA: 1:31 - loss: 2.1019 - regression_loss: 1.7079 - classification_loss: 0.3941 135/500 [=======>......................] - ETA: 1:31 - loss: 2.1106 - regression_loss: 1.7154 - classification_loss: 0.3952 136/500 [=======>......................] - ETA: 1:30 - loss: 2.1110 - regression_loss: 1.7161 - classification_loss: 0.3950 137/500 [=======>......................] 
- ETA: 1:30 - loss: 2.1102 - regression_loss: 1.7159 - classification_loss: 0.3943 138/500 [=======>......................] - ETA: 1:30 - loss: 2.1101 - regression_loss: 1.7152 - classification_loss: 0.3949 139/500 [=======>......................] - ETA: 1:30 - loss: 2.1089 - regression_loss: 1.7143 - classification_loss: 0.3945 140/500 [=======>......................] - ETA: 1:29 - loss: 2.1104 - regression_loss: 1.7157 - classification_loss: 0.3947 141/500 [=======>......................] - ETA: 1:29 - loss: 2.1111 - regression_loss: 1.7163 - classification_loss: 0.3948 142/500 [=======>......................] - ETA: 1:29 - loss: 2.1128 - regression_loss: 1.7174 - classification_loss: 0.3954 143/500 [=======>......................] - ETA: 1:29 - loss: 2.1092 - regression_loss: 1.7151 - classification_loss: 0.3942 144/500 [=======>......................] - ETA: 1:28 - loss: 2.1093 - regression_loss: 1.7150 - classification_loss: 0.3943 145/500 [=======>......................] - ETA: 1:28 - loss: 2.1086 - regression_loss: 1.7149 - classification_loss: 0.3937 146/500 [=======>......................] - ETA: 1:28 - loss: 2.1037 - regression_loss: 1.7111 - classification_loss: 0.3926 147/500 [=======>......................] - ETA: 1:28 - loss: 2.1057 - regression_loss: 1.7121 - classification_loss: 0.3936 148/500 [=======>......................] - ETA: 1:27 - loss: 2.1055 - regression_loss: 1.7118 - classification_loss: 0.3937 149/500 [=======>......................] - ETA: 1:27 - loss: 2.1052 - regression_loss: 1.7119 - classification_loss: 0.3933 150/500 [========>.....................] - ETA: 1:27 - loss: 2.1033 - regression_loss: 1.7108 - classification_loss: 0.3926 151/500 [========>.....................] - ETA: 1:27 - loss: 2.1007 - regression_loss: 1.7088 - classification_loss: 0.3918 152/500 [========>.....................] - ETA: 1:27 - loss: 2.0964 - regression_loss: 1.7048 - classification_loss: 0.3916 153/500 [========>.....................] 
- ETA: 1:26 - loss: 2.1003 - regression_loss: 1.7081 - classification_loss: 0.3922 154/500 [========>.....................] - ETA: 1:26 - loss: 2.1006 - regression_loss: 1.7084 - classification_loss: 0.3921 155/500 [========>.....................] - ETA: 1:26 - loss: 2.1018 - regression_loss: 1.7097 - classification_loss: 0.3921 156/500 [========>.....................] - ETA: 1:26 - loss: 2.0990 - regression_loss: 1.7078 - classification_loss: 0.3912 157/500 [========>.....................] - ETA: 1:25 - loss: 2.0974 - regression_loss: 1.7066 - classification_loss: 0.3907 158/500 [========>.....................] - ETA: 1:25 - loss: 2.0962 - regression_loss: 1.7056 - classification_loss: 0.3906 159/500 [========>.....................] - ETA: 1:25 - loss: 2.0959 - regression_loss: 1.7055 - classification_loss: 0.3905 160/500 [========>.....................] - ETA: 1:25 - loss: 2.0954 - regression_loss: 1.7054 - classification_loss: 0.3900 161/500 [========>.....................] - ETA: 1:24 - loss: 2.0988 - regression_loss: 1.7081 - classification_loss: 0.3907 162/500 [========>.....................] - ETA: 1:24 - loss: 2.0948 - regression_loss: 1.7054 - classification_loss: 0.3894 163/500 [========>.....................] - ETA: 1:24 - loss: 2.0954 - regression_loss: 1.7061 - classification_loss: 0.3894 164/500 [========>.....................] - ETA: 1:24 - loss: 2.0950 - regression_loss: 1.7061 - classification_loss: 0.3889 165/500 [========>.....................] - ETA: 1:23 - loss: 2.0984 - regression_loss: 1.7087 - classification_loss: 0.3896 166/500 [========>.....................] - ETA: 1:23 - loss: 2.0952 - regression_loss: 1.7061 - classification_loss: 0.3890 167/500 [=========>....................] - ETA: 1:23 - loss: 2.0966 - regression_loss: 1.7077 - classification_loss: 0.3888 168/500 [=========>....................] - ETA: 1:23 - loss: 2.0962 - regression_loss: 1.7079 - classification_loss: 0.3883 169/500 [=========>....................] 
- ETA: 1:22 - loss: 2.0945 - regression_loss: 1.7068 - classification_loss: 0.3876 170/500 [=========>....................] - ETA: 1:22 - loss: 2.0889 - regression_loss: 1.7025 - classification_loss: 0.3864 171/500 [=========>....................] - ETA: 1:22 - loss: 2.0894 - regression_loss: 1.7030 - classification_loss: 0.3864 172/500 [=========>....................] - ETA: 1:22 - loss: 2.0863 - regression_loss: 1.7005 - classification_loss: 0.3858 173/500 [=========>....................] - ETA: 1:21 - loss: 2.0793 - regression_loss: 1.6950 - classification_loss: 0.3843 174/500 [=========>....................] - ETA: 1:21 - loss: 2.0812 - regression_loss: 1.6967 - classification_loss: 0.3845 175/500 [=========>....................] - ETA: 1:21 - loss: 2.0812 - regression_loss: 1.6963 - classification_loss: 0.3850 176/500 [=========>....................] - ETA: 1:21 - loss: 2.0813 - regression_loss: 1.6966 - classification_loss: 0.3847 177/500 [=========>....................] - ETA: 1:20 - loss: 2.0804 - regression_loss: 1.6958 - classification_loss: 0.3845 178/500 [=========>....................] - ETA: 1:20 - loss: 2.0828 - regression_loss: 1.6976 - classification_loss: 0.3852 179/500 [=========>....................] - ETA: 1:20 - loss: 2.0822 - regression_loss: 1.6969 - classification_loss: 0.3854 180/500 [=========>....................] - ETA: 1:20 - loss: 2.0836 - regression_loss: 1.6977 - classification_loss: 0.3859 181/500 [=========>....................] - ETA: 1:19 - loss: 2.0872 - regression_loss: 1.6998 - classification_loss: 0.3874 182/500 [=========>....................] - ETA: 1:19 - loss: 2.0916 - regression_loss: 1.7026 - classification_loss: 0.3891 183/500 [=========>....................] - ETA: 1:19 - loss: 2.0947 - regression_loss: 1.7046 - classification_loss: 0.3901 184/500 [==========>...................] - ETA: 1:19 - loss: 2.0979 - regression_loss: 1.7074 - classification_loss: 0.3905 185/500 [==========>...................] 
- ETA: 1:18 - loss: 2.0929 - regression_loss: 1.7033 - classification_loss: 0.3896 186/500 [==========>...................] - ETA: 1:18 - loss: 2.0910 - regression_loss: 1.7018 - classification_loss: 0.3891 187/500 [==========>...................] - ETA: 1:18 - loss: 2.0907 - regression_loss: 1.7019 - classification_loss: 0.3888 188/500 [==========>...................] - ETA: 1:18 - loss: 2.0906 - regression_loss: 1.7016 - classification_loss: 0.3889 189/500 [==========>...................] - ETA: 1:17 - loss: 2.0918 - regression_loss: 1.7033 - classification_loss: 0.3886 190/500 [==========>...................] - ETA: 1:17 - loss: 2.0916 - regression_loss: 1.7033 - classification_loss: 0.3883 191/500 [==========>...................] - ETA: 1:17 - loss: 2.0916 - regression_loss: 1.7035 - classification_loss: 0.3880 192/500 [==========>...................] - ETA: 1:17 - loss: 2.0864 - regression_loss: 1.6989 - classification_loss: 0.3874 193/500 [==========>...................] - ETA: 1:16 - loss: 2.0891 - regression_loss: 1.7018 - classification_loss: 0.3873 194/500 [==========>...................] - ETA: 1:16 - loss: 2.0889 - regression_loss: 1.7016 - classification_loss: 0.3873 195/500 [==========>...................] - ETA: 1:16 - loss: 2.0939 - regression_loss: 1.7045 - classification_loss: 0.3893 196/500 [==========>...................] - ETA: 1:16 - loss: 2.0892 - regression_loss: 1.7006 - classification_loss: 0.3885 197/500 [==========>...................] - ETA: 1:15 - loss: 2.0905 - regression_loss: 1.7015 - classification_loss: 0.3890 198/500 [==========>...................] - ETA: 1:15 - loss: 2.0920 - regression_loss: 1.7028 - classification_loss: 0.3892 199/500 [==========>...................] - ETA: 1:15 - loss: 2.0929 - regression_loss: 1.7038 - classification_loss: 0.3892 200/500 [===========>..................] - ETA: 1:15 - loss: 2.0905 - regression_loss: 1.7019 - classification_loss: 0.3887 201/500 [===========>..................] 
- ETA: 1:14 - loss: 2.0868 - regression_loss: 1.6988 - classification_loss: 0.3880 202/500 [===========>..................] - ETA: 1:14 - loss: 2.0864 - regression_loss: 1.6985 - classification_loss: 0.3879 203/500 [===========>..................] - ETA: 1:14 - loss: 2.0872 - regression_loss: 1.6992 - classification_loss: 0.3880 204/500 [===========>..................] - ETA: 1:14 - loss: 2.0887 - regression_loss: 1.7007 - classification_loss: 0.3879 205/500 [===========>..................] - ETA: 1:13 - loss: 2.0884 - regression_loss: 1.7008 - classification_loss: 0.3876 206/500 [===========>..................] - ETA: 1:13 - loss: 2.0908 - regression_loss: 1.7029 - classification_loss: 0.3880 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0919 - regression_loss: 1.7039 - classification_loss: 0.3880 208/500 [===========>..................] - ETA: 1:13 - loss: 2.0914 - regression_loss: 1.7036 - classification_loss: 0.3878 209/500 [===========>..................] - ETA: 1:12 - loss: 2.0895 - regression_loss: 1.7021 - classification_loss: 0.3873 210/500 [===========>..................] - ETA: 1:12 - loss: 2.0933 - regression_loss: 1.7049 - classification_loss: 0.3885 211/500 [===========>..................] - ETA: 1:12 - loss: 2.0931 - regression_loss: 1.7049 - classification_loss: 0.3882 212/500 [===========>..................] - ETA: 1:12 - loss: 2.0931 - regression_loss: 1.7047 - classification_loss: 0.3883 213/500 [===========>..................] - ETA: 1:11 - loss: 2.0984 - regression_loss: 1.7089 - classification_loss: 0.3895 214/500 [===========>..................] - ETA: 1:11 - loss: 2.0998 - regression_loss: 1.7102 - classification_loss: 0.3896 215/500 [===========>..................] - ETA: 1:11 - loss: 2.0992 - regression_loss: 1.7101 - classification_loss: 0.3890 216/500 [===========>..................] - ETA: 1:11 - loss: 2.1004 - regression_loss: 1.7113 - classification_loss: 0.3891 217/500 [============>.................] 
- ETA: 1:10 - loss: 2.1007 - regression_loss: 1.7119 - classification_loss: 0.3888 218/500 [============>.................] - ETA: 1:10 - loss: 2.1010 - regression_loss: 1.7120 - classification_loss: 0.3890 219/500 [============>.................] - ETA: 1:10 - loss: 2.0985 - regression_loss: 1.7101 - classification_loss: 0.3884 220/500 [============>.................] - ETA: 1:10 - loss: 2.0984 - regression_loss: 1.7102 - classification_loss: 0.3882 221/500 [============>.................] - ETA: 1:09 - loss: 2.0992 - regression_loss: 1.7110 - classification_loss: 0.3882 222/500 [============>.................] - ETA: 1:09 - loss: 2.0981 - regression_loss: 1.7092 - classification_loss: 0.3889 223/500 [============>.................] - ETA: 1:09 - loss: 2.1003 - regression_loss: 1.7115 - classification_loss: 0.3888 224/500 [============>.................] - ETA: 1:09 - loss: 2.0995 - regression_loss: 1.7114 - classification_loss: 0.3882 225/500 [============>.................] - ETA: 1:08 - loss: 2.0989 - regression_loss: 1.7108 - classification_loss: 0.3881 226/500 [============>.................] - ETA: 1:08 - loss: 2.0954 - regression_loss: 1.7082 - classification_loss: 0.3872 227/500 [============>.................] - ETA: 1:08 - loss: 2.0986 - regression_loss: 1.7109 - classification_loss: 0.3877 228/500 [============>.................] - ETA: 1:08 - loss: 2.0947 - regression_loss: 1.7079 - classification_loss: 0.3868 229/500 [============>.................] - ETA: 1:07 - loss: 2.0953 - regression_loss: 1.7086 - classification_loss: 0.3867 230/500 [============>.................] - ETA: 1:07 - loss: 2.0951 - regression_loss: 1.7084 - classification_loss: 0.3867 231/500 [============>.................] - ETA: 1:07 - loss: 2.0949 - regression_loss: 1.7084 - classification_loss: 0.3865 232/500 [============>.................] - ETA: 1:07 - loss: 2.0942 - regression_loss: 1.7080 - classification_loss: 0.3862 233/500 [============>.................] 
- ETA: 1:06 - loss: 2.0931 - regression_loss: 1.7071 - classification_loss: 0.3860 234/500 [=============>................] - ETA: 1:06 - loss: 2.0946 - regression_loss: 1.6998 - classification_loss: 0.3948 235/500 [=============>................] - ETA: 1:06 - loss: 2.0915 - regression_loss: 1.6969 - classification_loss: 0.3946 236/500 [=============>................] - ETA: 1:06 - loss: 2.0915 - regression_loss: 1.6968 - classification_loss: 0.3947 237/500 [=============>................] - ETA: 1:05 - loss: 2.0928 - regression_loss: 1.6981 - classification_loss: 0.3947 238/500 [=============>................] - ETA: 1:05 - loss: 2.0933 - regression_loss: 1.6972 - classification_loss: 0.3961 239/500 [=============>................] - ETA: 1:05 - loss: 2.0938 - regression_loss: 1.6979 - classification_loss: 0.3959 240/500 [=============>................] - ETA: 1:05 - loss: 2.0959 - regression_loss: 1.6996 - classification_loss: 0.3963 241/500 [=============>................] - ETA: 1:04 - loss: 2.0956 - regression_loss: 1.6996 - classification_loss: 0.3960 242/500 [=============>................] - ETA: 1:04 - loss: 2.0987 - regression_loss: 1.7022 - classification_loss: 0.3965 243/500 [=============>................] - ETA: 1:04 - loss: 2.0982 - regression_loss: 1.7014 - classification_loss: 0.3968 244/500 [=============>................] - ETA: 1:04 - loss: 2.0970 - regression_loss: 1.7006 - classification_loss: 0.3964 245/500 [=============>................] - ETA: 1:03 - loss: 2.0970 - regression_loss: 1.7008 - classification_loss: 0.3962 246/500 [=============>................] - ETA: 1:03 - loss: 2.0968 - regression_loss: 1.7005 - classification_loss: 0.3963 247/500 [=============>................] - ETA: 1:03 - loss: 2.0983 - regression_loss: 1.7022 - classification_loss: 0.3960 248/500 [=============>................] - ETA: 1:02 - loss: 2.0971 - regression_loss: 1.7015 - classification_loss: 0.3956 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.0999 - regression_loss: 1.7033 - classification_loss: 0.3966 250/500 [==============>...............] - ETA: 1:02 - loss: 2.1002 - regression_loss: 1.7037 - classification_loss: 0.3966 251/500 [==============>...............] - ETA: 1:02 - loss: 2.1000 - regression_loss: 1.7036 - classification_loss: 0.3965 252/500 [==============>...............] - ETA: 1:01 - loss: 2.0984 - regression_loss: 1.7019 - classification_loss: 0.3965 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0994 - regression_loss: 1.7028 - classification_loss: 0.3966 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0989 - regression_loss: 1.7025 - classification_loss: 0.3964 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0993 - regression_loss: 1.7031 - classification_loss: 0.3962 256/500 [==============>...............] - ETA: 1:00 - loss: 2.0987 - regression_loss: 1.7028 - classification_loss: 0.3959 257/500 [==============>...............] - ETA: 1:00 - loss: 2.1008 - regression_loss: 1.7045 - classification_loss: 0.3962 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0987 - regression_loss: 1.7027 - classification_loss: 0.3961 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0937 - regression_loss: 1.6985 - classification_loss: 0.3953 260/500 [==============>...............] - ETA: 59s - loss: 2.0953 - regression_loss: 1.7001 - classification_loss: 0.3952  261/500 [==============>...............] - ETA: 59s - loss: 2.0942 - regression_loss: 1.6994 - classification_loss: 0.3949 262/500 [==============>...............] - ETA: 59s - loss: 2.0947 - regression_loss: 1.7000 - classification_loss: 0.3947 263/500 [==============>...............] - ETA: 59s - loss: 2.0948 - regression_loss: 1.7003 - classification_loss: 0.3945 264/500 [==============>...............] - ETA: 58s - loss: 2.0928 - regression_loss: 1.6988 - classification_loss: 0.3940 265/500 [==============>...............] 
500/500 [==============================] - 125s 250ms/step - loss: 2.0890 - regression_loss: 1.7007 - classification_loss: 0.3884
1172 instances of class plum with average precision: 0.5072
mAP: 0.5072
Epoch 00035: saving model to ./training/snapshots/resnet50_pascal_35.h5
Epoch 36/150
1/500 [..............................] - ETA: 1:47 - loss: 1.9069 - regression_loss: 1.5982 - classification_loss: 0.3087
2/500 [..............................] - ETA: 1:55 - loss: 1.9731 - regression_loss: 1.6323 - classification_loss: 0.3408
3/500 [..............................] - ETA: 1:58 - loss: 1.6198 - regression_loss: 1.3559 - classification_loss: 0.2640
4/500 [..............................]
- ETA: 2:00 - loss: 1.6435 - regression_loss: 1.3739 - classification_loss: 0.2696
100/500 [=====>........................]
- ETA: 1:40 - loss: 2.0213 - regression_loss: 1.6553 - classification_loss: 0.3660 101/500 [=====>........................] - ETA: 1:40 - loss: 2.0308 - regression_loss: 1.6636 - classification_loss: 0.3672 102/500 [=====>........................] - ETA: 1:39 - loss: 2.0341 - regression_loss: 1.6658 - classification_loss: 0.3683 103/500 [=====>........................] - ETA: 1:39 - loss: 2.0239 - regression_loss: 1.6559 - classification_loss: 0.3679 104/500 [=====>........................] - ETA: 1:39 - loss: 2.0277 - regression_loss: 1.6590 - classification_loss: 0.3687 105/500 [=====>........................] - ETA: 1:38 - loss: 2.0257 - regression_loss: 1.6579 - classification_loss: 0.3678 106/500 [=====>........................] - ETA: 1:38 - loss: 2.0291 - regression_loss: 1.6607 - classification_loss: 0.3684 107/500 [=====>........................] - ETA: 1:38 - loss: 2.0196 - regression_loss: 1.6530 - classification_loss: 0.3666 108/500 [=====>........................] - ETA: 1:38 - loss: 2.0137 - regression_loss: 1.6485 - classification_loss: 0.3652 109/500 [=====>........................] - ETA: 1:37 - loss: 2.0151 - regression_loss: 1.6492 - classification_loss: 0.3659 110/500 [=====>........................] - ETA: 1:37 - loss: 2.0105 - regression_loss: 1.6455 - classification_loss: 0.3650 111/500 [=====>........................] - ETA: 1:37 - loss: 2.0143 - regression_loss: 1.6491 - classification_loss: 0.3652 112/500 [=====>........................] - ETA: 1:37 - loss: 2.0113 - regression_loss: 1.6475 - classification_loss: 0.3638 113/500 [=====>........................] - ETA: 1:36 - loss: 2.0141 - regression_loss: 1.6505 - classification_loss: 0.3637 114/500 [=====>........................] - ETA: 1:36 - loss: 2.0186 - regression_loss: 1.6542 - classification_loss: 0.3645 115/500 [=====>........................] - ETA: 1:36 - loss: 2.0217 - regression_loss: 1.6563 - classification_loss: 0.3653 116/500 [=====>........................] 
- ETA: 1:36 - loss: 2.0183 - regression_loss: 1.6539 - classification_loss: 0.3644 117/500 [======>.......................] - ETA: 1:35 - loss: 2.0211 - regression_loss: 1.6572 - classification_loss: 0.3640 118/500 [======>.......................] - ETA: 1:35 - loss: 2.0159 - regression_loss: 1.6530 - classification_loss: 0.3629 119/500 [======>.......................] - ETA: 1:35 - loss: 2.0214 - regression_loss: 1.6555 - classification_loss: 0.3659 120/500 [======>.......................] - ETA: 1:35 - loss: 2.0240 - regression_loss: 1.6576 - classification_loss: 0.3664 121/500 [======>.......................] - ETA: 1:34 - loss: 2.0214 - regression_loss: 1.6549 - classification_loss: 0.3664 122/500 [======>.......................] - ETA: 1:34 - loss: 2.0240 - regression_loss: 1.6571 - classification_loss: 0.3669 123/500 [======>.......................] - ETA: 1:34 - loss: 2.0232 - regression_loss: 1.6569 - classification_loss: 0.3664 124/500 [======>.......................] - ETA: 1:34 - loss: 2.0239 - regression_loss: 1.6575 - classification_loss: 0.3665 125/500 [======>.......................] - ETA: 1:33 - loss: 2.0263 - regression_loss: 1.6593 - classification_loss: 0.3670 126/500 [======>.......................] - ETA: 1:33 - loss: 2.0285 - regression_loss: 1.6608 - classification_loss: 0.3677 127/500 [======>.......................] - ETA: 1:33 - loss: 2.0338 - regression_loss: 1.6658 - classification_loss: 0.3679 128/500 [======>.......................] - ETA: 1:33 - loss: 2.0343 - regression_loss: 1.6665 - classification_loss: 0.3679 129/500 [======>.......................] - ETA: 1:32 - loss: 2.0345 - regression_loss: 1.6666 - classification_loss: 0.3679 130/500 [======>.......................] - ETA: 1:32 - loss: 2.0318 - regression_loss: 1.6652 - classification_loss: 0.3666 131/500 [======>.......................] - ETA: 1:32 - loss: 2.0271 - regression_loss: 1.6618 - classification_loss: 0.3653 132/500 [======>.......................] 
- ETA: 1:32 - loss: 2.0217 - regression_loss: 1.6576 - classification_loss: 0.3641 133/500 [======>.......................] - ETA: 1:31 - loss: 2.0218 - regression_loss: 1.6581 - classification_loss: 0.3638 134/500 [=======>......................] - ETA: 1:31 - loss: 2.0171 - regression_loss: 1.6539 - classification_loss: 0.3632 135/500 [=======>......................] - ETA: 1:31 - loss: 2.0149 - regression_loss: 1.6523 - classification_loss: 0.3626 136/500 [=======>......................] - ETA: 1:31 - loss: 2.0143 - regression_loss: 1.6523 - classification_loss: 0.3620 137/500 [=======>......................] - ETA: 1:30 - loss: 2.0137 - regression_loss: 1.6515 - classification_loss: 0.3621 138/500 [=======>......................] - ETA: 1:30 - loss: 2.0132 - regression_loss: 1.6508 - classification_loss: 0.3624 139/500 [=======>......................] - ETA: 1:30 - loss: 2.0158 - regression_loss: 1.6530 - classification_loss: 0.3628 140/500 [=======>......................] - ETA: 1:30 - loss: 2.0188 - regression_loss: 1.6558 - classification_loss: 0.3630 141/500 [=======>......................] - ETA: 1:29 - loss: 2.0225 - regression_loss: 1.6587 - classification_loss: 0.3638 142/500 [=======>......................] - ETA: 1:29 - loss: 2.0209 - regression_loss: 1.6573 - classification_loss: 0.3637 143/500 [=======>......................] - ETA: 1:29 - loss: 2.0210 - regression_loss: 1.6543 - classification_loss: 0.3667 144/500 [=======>......................] - ETA: 1:29 - loss: 2.0176 - regression_loss: 1.6516 - classification_loss: 0.3660 145/500 [=======>......................] - ETA: 1:29 - loss: 2.0155 - regression_loss: 1.6504 - classification_loss: 0.3651 146/500 [=======>......................] - ETA: 1:28 - loss: 2.0135 - regression_loss: 1.6492 - classification_loss: 0.3643 147/500 [=======>......................] - ETA: 1:28 - loss: 2.0170 - regression_loss: 1.6521 - classification_loss: 0.3649 148/500 [=======>......................] 
- ETA: 1:28 - loss: 2.0183 - regression_loss: 1.6533 - classification_loss: 0.3649 149/500 [=======>......................] - ETA: 1:28 - loss: 2.0239 - regression_loss: 1.6578 - classification_loss: 0.3660 150/500 [========>.....................] - ETA: 1:27 - loss: 2.0199 - regression_loss: 1.6540 - classification_loss: 0.3659 151/500 [========>.....................] - ETA: 1:27 - loss: 2.0251 - regression_loss: 1.6591 - classification_loss: 0.3660 152/500 [========>.....................] - ETA: 1:27 - loss: 2.0167 - regression_loss: 1.6523 - classification_loss: 0.3644 153/500 [========>.....................] - ETA: 1:27 - loss: 2.0198 - regression_loss: 1.6531 - classification_loss: 0.3666 154/500 [========>.....................] - ETA: 1:26 - loss: 2.0234 - regression_loss: 1.6551 - classification_loss: 0.3683 155/500 [========>.....................] - ETA: 1:26 - loss: 2.0217 - regression_loss: 1.6538 - classification_loss: 0.3679 156/500 [========>.....................] - ETA: 1:26 - loss: 2.0242 - regression_loss: 1.6562 - classification_loss: 0.3680 157/500 [========>.....................] - ETA: 1:26 - loss: 2.0264 - regression_loss: 1.6577 - classification_loss: 0.3687 158/500 [========>.....................] - ETA: 1:25 - loss: 2.0288 - regression_loss: 1.6596 - classification_loss: 0.3693 159/500 [========>.....................] - ETA: 1:25 - loss: 2.0266 - regression_loss: 1.6576 - classification_loss: 0.3690 160/500 [========>.....................] - ETA: 1:25 - loss: 2.0237 - regression_loss: 1.6550 - classification_loss: 0.3687 161/500 [========>.....................] - ETA: 1:25 - loss: 2.0242 - regression_loss: 1.6552 - classification_loss: 0.3690 162/500 [========>.....................] - ETA: 1:24 - loss: 2.0242 - regression_loss: 1.6551 - classification_loss: 0.3691 163/500 [========>.....................] - ETA: 1:24 - loss: 2.0245 - regression_loss: 1.6555 - classification_loss: 0.3690 164/500 [========>.....................] 
- ETA: 1:24 - loss: 2.0280 - regression_loss: 1.6583 - classification_loss: 0.3697 165/500 [========>.....................] - ETA: 1:24 - loss: 2.0300 - regression_loss: 1.6599 - classification_loss: 0.3702 166/500 [========>.....................] - ETA: 1:23 - loss: 2.0298 - regression_loss: 1.6597 - classification_loss: 0.3701 167/500 [=========>....................] - ETA: 1:23 - loss: 2.0291 - regression_loss: 1.6595 - classification_loss: 0.3696 168/500 [=========>....................] - ETA: 1:23 - loss: 2.0268 - regression_loss: 1.6580 - classification_loss: 0.3689 169/500 [=========>....................] - ETA: 1:23 - loss: 2.0316 - regression_loss: 1.6610 - classification_loss: 0.3706 170/500 [=========>....................] - ETA: 1:22 - loss: 2.0326 - regression_loss: 1.6620 - classification_loss: 0.3705 171/500 [=========>....................] - ETA: 1:22 - loss: 2.0332 - regression_loss: 1.6629 - classification_loss: 0.3702 172/500 [=========>....................] - ETA: 1:22 - loss: 2.0316 - regression_loss: 1.6616 - classification_loss: 0.3699 173/500 [=========>....................] - ETA: 1:22 - loss: 2.0326 - regression_loss: 1.6621 - classification_loss: 0.3704 174/500 [=========>....................] - ETA: 1:21 - loss: 2.0359 - regression_loss: 1.6642 - classification_loss: 0.3717 175/500 [=========>....................] - ETA: 1:21 - loss: 2.0365 - regression_loss: 1.6649 - classification_loss: 0.3717 176/500 [=========>....................] - ETA: 1:21 - loss: 2.0378 - regression_loss: 1.6655 - classification_loss: 0.3722 177/500 [=========>....................] - ETA: 1:21 - loss: 2.0383 - regression_loss: 1.6662 - classification_loss: 0.3721 178/500 [=========>....................] - ETA: 1:20 - loss: 2.0457 - regression_loss: 1.6725 - classification_loss: 0.3732 179/500 [=========>....................] - ETA: 1:20 - loss: 2.0486 - regression_loss: 1.6743 - classification_loss: 0.3743 180/500 [=========>....................] 
- ETA: 1:20 - loss: 2.0504 - regression_loss: 1.6757 - classification_loss: 0.3747 181/500 [=========>....................] - ETA: 1:20 - loss: 2.0517 - regression_loss: 1.6756 - classification_loss: 0.3761 182/500 [=========>....................] - ETA: 1:19 - loss: 2.0533 - regression_loss: 1.6777 - classification_loss: 0.3757 183/500 [=========>....................] - ETA: 1:19 - loss: 2.0558 - regression_loss: 1.6797 - classification_loss: 0.3761 184/500 [==========>...................] - ETA: 1:19 - loss: 2.0566 - regression_loss: 1.6802 - classification_loss: 0.3764 185/500 [==========>...................] - ETA: 1:19 - loss: 2.0576 - regression_loss: 1.6815 - classification_loss: 0.3761 186/500 [==========>...................] - ETA: 1:18 - loss: 2.0544 - regression_loss: 1.6784 - classification_loss: 0.3761 187/500 [==========>...................] - ETA: 1:18 - loss: 2.0482 - regression_loss: 1.6728 - classification_loss: 0.3754 188/500 [==========>...................] - ETA: 1:18 - loss: 2.0483 - regression_loss: 1.6734 - classification_loss: 0.3749 189/500 [==========>...................] - ETA: 1:18 - loss: 2.0502 - regression_loss: 1.6750 - classification_loss: 0.3752 190/500 [==========>...................] - ETA: 1:17 - loss: 2.0487 - regression_loss: 1.6741 - classification_loss: 0.3746 191/500 [==========>...................] - ETA: 1:17 - loss: 2.0497 - regression_loss: 1.6747 - classification_loss: 0.3751 192/500 [==========>...................] - ETA: 1:17 - loss: 2.0504 - regression_loss: 1.6752 - classification_loss: 0.3752 193/500 [==========>...................] - ETA: 1:17 - loss: 2.0543 - regression_loss: 1.6785 - classification_loss: 0.3757 194/500 [==========>...................] - ETA: 1:16 - loss: 2.0597 - regression_loss: 1.6833 - classification_loss: 0.3764 195/500 [==========>...................] - ETA: 1:16 - loss: 2.0604 - regression_loss: 1.6840 - classification_loss: 0.3764 196/500 [==========>...................] 
- ETA: 1:16 - loss: 2.0651 - regression_loss: 1.6861 - classification_loss: 0.3790 197/500 [==========>...................] - ETA: 1:16 - loss: 2.0597 - regression_loss: 1.6819 - classification_loss: 0.3778 198/500 [==========>...................] - ETA: 1:15 - loss: 2.0628 - regression_loss: 1.6842 - classification_loss: 0.3786 199/500 [==========>...................] - ETA: 1:15 - loss: 2.0628 - regression_loss: 1.6846 - classification_loss: 0.3782 200/500 [===========>..................] - ETA: 1:15 - loss: 2.0657 - regression_loss: 1.6866 - classification_loss: 0.3790 201/500 [===========>..................] - ETA: 1:15 - loss: 2.0646 - regression_loss: 1.6858 - classification_loss: 0.3788 202/500 [===========>..................] - ETA: 1:14 - loss: 2.0657 - regression_loss: 1.6865 - classification_loss: 0.3792 203/500 [===========>..................] - ETA: 1:14 - loss: 2.0661 - regression_loss: 1.6864 - classification_loss: 0.3796 204/500 [===========>..................] - ETA: 1:14 - loss: 2.0653 - regression_loss: 1.6857 - classification_loss: 0.3795 205/500 [===========>..................] - ETA: 1:14 - loss: 2.0667 - regression_loss: 1.6865 - classification_loss: 0.3802 206/500 [===========>..................] - ETA: 1:13 - loss: 2.0669 - regression_loss: 1.6868 - classification_loss: 0.3802 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0663 - regression_loss: 1.6863 - classification_loss: 0.3800 208/500 [===========>..................] - ETA: 1:13 - loss: 2.0712 - regression_loss: 1.6907 - classification_loss: 0.3805 209/500 [===========>..................] - ETA: 1:13 - loss: 2.0718 - regression_loss: 1.6915 - classification_loss: 0.3803 210/500 [===========>..................] - ETA: 1:12 - loss: 2.0712 - regression_loss: 1.6910 - classification_loss: 0.3802 211/500 [===========>..................] - ETA: 1:12 - loss: 2.0745 - regression_loss: 1.6944 - classification_loss: 0.3801 212/500 [===========>..................] 
- ETA: 1:12 - loss: 2.0728 - regression_loss: 1.6922 - classification_loss: 0.3807 213/500 [===========>..................] - ETA: 1:12 - loss: 2.0702 - regression_loss: 1.6902 - classification_loss: 0.3799 214/500 [===========>..................] - ETA: 1:11 - loss: 2.0707 - regression_loss: 1.6909 - classification_loss: 0.3798 215/500 [===========>..................] - ETA: 1:11 - loss: 2.0702 - regression_loss: 1.6904 - classification_loss: 0.3798 216/500 [===========>..................] - ETA: 1:11 - loss: 2.0719 - regression_loss: 1.6918 - classification_loss: 0.3801 217/500 [============>.................] - ETA: 1:11 - loss: 2.0692 - regression_loss: 1.6897 - classification_loss: 0.3795 218/500 [============>.................] - ETA: 1:10 - loss: 2.0669 - regression_loss: 1.6878 - classification_loss: 0.3790 219/500 [============>.................] - ETA: 1:10 - loss: 2.0689 - regression_loss: 1.6893 - classification_loss: 0.3796 220/500 [============>.................] - ETA: 1:10 - loss: 2.0682 - regression_loss: 1.6889 - classification_loss: 0.3793 221/500 [============>.................] - ETA: 1:09 - loss: 2.0676 - regression_loss: 1.6882 - classification_loss: 0.3794 222/500 [============>.................] - ETA: 1:09 - loss: 2.0660 - regression_loss: 1.6871 - classification_loss: 0.3789 223/500 [============>.................] - ETA: 1:09 - loss: 2.0657 - regression_loss: 1.6866 - classification_loss: 0.3791 224/500 [============>.................] - ETA: 1:09 - loss: 2.0654 - regression_loss: 1.6864 - classification_loss: 0.3791 225/500 [============>.................] - ETA: 1:08 - loss: 2.0652 - regression_loss: 1.6863 - classification_loss: 0.3789 226/500 [============>.................] - ETA: 1:08 - loss: 2.0658 - regression_loss: 1.6874 - classification_loss: 0.3784 227/500 [============>.................] - ETA: 1:08 - loss: 2.0640 - regression_loss: 1.6859 - classification_loss: 0.3781 228/500 [============>.................] 
- ETA: 1:08 - loss: 2.0640 - regression_loss: 1.6858 - classification_loss: 0.3782 229/500 [============>.................] - ETA: 1:07 - loss: 2.0642 - regression_loss: 1.6862 - classification_loss: 0.3780 230/500 [============>.................] - ETA: 1:07 - loss: 2.0637 - regression_loss: 1.6859 - classification_loss: 0.3778 231/500 [============>.................] - ETA: 1:07 - loss: 2.0660 - regression_loss: 1.6879 - classification_loss: 0.3781 232/500 [============>.................] - ETA: 1:07 - loss: 2.0646 - regression_loss: 1.6868 - classification_loss: 0.3778 233/500 [============>.................] - ETA: 1:06 - loss: 2.0644 - regression_loss: 1.6867 - classification_loss: 0.3777 234/500 [=============>................] - ETA: 1:06 - loss: 2.0647 - regression_loss: 1.6867 - classification_loss: 0.3780 235/500 [=============>................] - ETA: 1:06 - loss: 2.0645 - regression_loss: 1.6865 - classification_loss: 0.3781 236/500 [=============>................] - ETA: 1:06 - loss: 2.0651 - regression_loss: 1.6865 - classification_loss: 0.3785 237/500 [=============>................] - ETA: 1:05 - loss: 2.0646 - regression_loss: 1.6864 - classification_loss: 0.3782 238/500 [=============>................] - ETA: 1:05 - loss: 2.0671 - regression_loss: 1.6882 - classification_loss: 0.3789 239/500 [=============>................] - ETA: 1:05 - loss: 2.0663 - regression_loss: 1.6877 - classification_loss: 0.3787 240/500 [=============>................] - ETA: 1:05 - loss: 2.0604 - regression_loss: 1.6826 - classification_loss: 0.3778 241/500 [=============>................] - ETA: 1:04 - loss: 2.0578 - regression_loss: 1.6802 - classification_loss: 0.3776 242/500 [=============>................] - ETA: 1:04 - loss: 2.0577 - regression_loss: 1.6802 - classification_loss: 0.3775 243/500 [=============>................] - ETA: 1:04 - loss: 2.0584 - regression_loss: 1.6809 - classification_loss: 0.3775 244/500 [=============>................] 
- ETA: 1:04 - loss: 2.0542 - regression_loss: 1.6775 - classification_loss: 0.3767 245/500 [=============>................] - ETA: 1:03 - loss: 2.0551 - regression_loss: 1.6784 - classification_loss: 0.3768 246/500 [=============>................] - ETA: 1:03 - loss: 2.0518 - regression_loss: 1.6759 - classification_loss: 0.3759 247/500 [=============>................] - ETA: 1:03 - loss: 2.0509 - regression_loss: 1.6749 - classification_loss: 0.3759 248/500 [=============>................] - ETA: 1:03 - loss: 2.0523 - regression_loss: 1.6758 - classification_loss: 0.3765 249/500 [=============>................] - ETA: 1:02 - loss: 2.0516 - regression_loss: 1.6750 - classification_loss: 0.3766 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0506 - regression_loss: 1.6744 - classification_loss: 0.3762 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0524 - regression_loss: 1.6763 - classification_loss: 0.3761 252/500 [==============>...............] - ETA: 1:02 - loss: 2.0530 - regression_loss: 1.6769 - classification_loss: 0.3761 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0495 - regression_loss: 1.6743 - classification_loss: 0.3752 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0501 - regression_loss: 1.6748 - classification_loss: 0.3752 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0504 - regression_loss: 1.6754 - classification_loss: 0.3750 256/500 [==============>...............] - ETA: 1:01 - loss: 2.0492 - regression_loss: 1.6736 - classification_loss: 0.3756 257/500 [==============>...............] - ETA: 1:00 - loss: 2.0492 - regression_loss: 1.6739 - classification_loss: 0.3753 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0509 - regression_loss: 1.6749 - classification_loss: 0.3760 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0515 - regression_loss: 1.6756 - classification_loss: 0.3759 260/500 [==============>...............] 
- ETA: 1:00 - loss: 2.0522 - regression_loss: 1.6762 - classification_loss: 0.3760 261/500 [==============>...............] - ETA: 59s - loss: 2.0555 - regression_loss: 1.6787 - classification_loss: 0.3768  262/500 [==============>...............] - ETA: 59s - loss: 2.0564 - regression_loss: 1.6793 - classification_loss: 0.3771 263/500 [==============>...............] - ETA: 59s - loss: 2.0595 - regression_loss: 1.6817 - classification_loss: 0.3779 264/500 [==============>...............] - ETA: 59s - loss: 2.0628 - regression_loss: 1.6848 - classification_loss: 0.3780 265/500 [==============>...............] - ETA: 58s - loss: 2.0626 - regression_loss: 1.6847 - classification_loss: 0.3779 266/500 [==============>...............] - ETA: 58s - loss: 2.0613 - regression_loss: 1.6837 - classification_loss: 0.3776 267/500 [===============>..............] - ETA: 58s - loss: 2.0634 - regression_loss: 1.6852 - classification_loss: 0.3782 268/500 [===============>..............] - ETA: 58s - loss: 2.0649 - regression_loss: 1.6864 - classification_loss: 0.3785 269/500 [===============>..............] - ETA: 57s - loss: 2.0649 - regression_loss: 1.6860 - classification_loss: 0.3790 270/500 [===============>..............] - ETA: 57s - loss: 2.0640 - regression_loss: 1.6855 - classification_loss: 0.3785 271/500 [===============>..............] - ETA: 57s - loss: 2.0640 - regression_loss: 1.6857 - classification_loss: 0.3783 272/500 [===============>..............] - ETA: 57s - loss: 2.0648 - regression_loss: 1.6861 - classification_loss: 0.3787 273/500 [===============>..............] - ETA: 56s - loss: 2.0661 - regression_loss: 1.6866 - classification_loss: 0.3795 274/500 [===============>..............] - ETA: 56s - loss: 2.0624 - regression_loss: 1.6836 - classification_loss: 0.3788 275/500 [===============>..............] - ETA: 56s - loss: 2.0621 - regression_loss: 1.6835 - classification_loss: 0.3786 276/500 [===============>..............] 
- ETA: 56s - loss: 2.0609 - regression_loss: 1.6827 - classification_loss: 0.3783 277/500 [===============>..............] - ETA: 55s - loss: 2.0616 - regression_loss: 1.6832 - classification_loss: 0.3783 278/500 [===============>..............] - ETA: 55s - loss: 2.0631 - regression_loss: 1.6846 - classification_loss: 0.3785 279/500 [===============>..............] - ETA: 55s - loss: 2.0632 - regression_loss: 1.6848 - classification_loss: 0.3784 280/500 [===============>..............] - ETA: 55s - loss: 2.0623 - regression_loss: 1.6840 - classification_loss: 0.3783 281/500 [===============>..............] - ETA: 54s - loss: 2.0630 - regression_loss: 1.6843 - classification_loss: 0.3787 282/500 [===============>..............] - ETA: 54s - loss: 2.0616 - regression_loss: 1.6835 - classification_loss: 0.3781 283/500 [===============>..............] - ETA: 54s - loss: 2.0621 - regression_loss: 1.6838 - classification_loss: 0.3783 284/500 [================>.............] - ETA: 54s - loss: 2.0629 - regression_loss: 1.6845 - classification_loss: 0.3784 285/500 [================>.............] - ETA: 53s - loss: 2.0630 - regression_loss: 1.6849 - classification_loss: 0.3781 286/500 [================>.............] - ETA: 53s - loss: 2.0651 - regression_loss: 1.6863 - classification_loss: 0.3788 287/500 [================>.............] - ETA: 53s - loss: 2.0627 - regression_loss: 1.6841 - classification_loss: 0.3786 288/500 [================>.............] - ETA: 53s - loss: 2.0625 - regression_loss: 1.6831 - classification_loss: 0.3794 289/500 [================>.............] - ETA: 52s - loss: 2.0645 - regression_loss: 1.6844 - classification_loss: 0.3801 290/500 [================>.............] - ETA: 52s - loss: 2.0650 - regression_loss: 1.6852 - classification_loss: 0.3798 291/500 [================>.............] - ETA: 52s - loss: 2.0653 - regression_loss: 1.6857 - classification_loss: 0.3796 292/500 [================>.............] 
- ETA: 52s - loss: 2.0641 - regression_loss: 1.6847 - classification_loss: 0.3794 293/500 [================>.............] - ETA: 51s - loss: 2.0612 - regression_loss: 1.6824 - classification_loss: 0.3789 294/500 [================>.............] - ETA: 51s - loss: 2.0601 - regression_loss: 1.6816 - classification_loss: 0.3785 295/500 [================>.............] - ETA: 51s - loss: 2.0577 - regression_loss: 1.6796 - classification_loss: 0.3781 296/500 [================>.............] - ETA: 51s - loss: 2.0583 - regression_loss: 1.6803 - classification_loss: 0.3780 297/500 [================>.............] - ETA: 50s - loss: 2.0572 - regression_loss: 1.6797 - classification_loss: 0.3776 298/500 [================>.............] - ETA: 50s - loss: 2.0579 - regression_loss: 1.6803 - classification_loss: 0.3776 299/500 [================>.............] - ETA: 50s - loss: 2.0581 - regression_loss: 1.6805 - classification_loss: 0.3776 300/500 [=================>............] - ETA: 50s - loss: 2.0616 - regression_loss: 1.6830 - classification_loss: 0.3786 301/500 [=================>............] - ETA: 49s - loss: 2.0592 - regression_loss: 1.6811 - classification_loss: 0.3782 302/500 [=================>............] - ETA: 49s - loss: 2.0594 - regression_loss: 1.6813 - classification_loss: 0.3781 303/500 [=================>............] - ETA: 49s - loss: 2.0561 - regression_loss: 1.6788 - classification_loss: 0.3773 304/500 [=================>............] - ETA: 49s - loss: 2.0564 - regression_loss: 1.6791 - classification_loss: 0.3773 305/500 [=================>............] - ETA: 48s - loss: 2.0561 - regression_loss: 1.6786 - classification_loss: 0.3775 306/500 [=================>............] - ETA: 48s - loss: 2.0535 - regression_loss: 1.6761 - classification_loss: 0.3774 307/500 [=================>............] - ETA: 48s - loss: 2.0532 - regression_loss: 1.6761 - classification_loss: 0.3771 308/500 [=================>............] 
500/500 [==============================] - 125s 251ms/step - loss: 2.0483 - regression_loss: 1.6729 - classification_loss: 0.3755
1172 instances of class plum with average precision: 0.5297
mAP: 0.5297
Epoch 00036: saving model to ./training/snapshots/resnet50_pascal_36.h5
Epoch 37/150
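The epoch summary lines above can be sanity-checked directly: the logged `loss` is the sum of the smooth-L1 `regression_loss` and the focal `classification_loss`, and since this run trains on a single class ("plum"), the reported mAP equals that one class's average precision. A minimal check using the values from the log (the decomposition into these two loss terms is the standard keras-retinanet setup, assumed here):

```python
# Values copied from the epoch-36 summary in the log above.
regression_loss = 1.6729      # smooth-L1 box regression term
classification_loss = 0.3755  # focal classification term
total_loss = 2.0483           # logged combined loss

# Total loss is the sum of the two components (up to display rounding).
assert abs((regression_loss + classification_loss) - total_loss) < 1e-3

# With a single class, mAP is just that class's average precision.
per_class_ap = {"plum": 0.5297}
mAP = sum(per_class_ap.values()) / len(per_class_ap)
assert abs(mAP - 0.5297) < 1e-9
```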
- ETA: 1:29 - loss: 2.0503 - regression_loss: 1.6868 - classification_loss: 0.3635 143/500 [=======>......................] - ETA: 1:29 - loss: 2.0418 - regression_loss: 1.6799 - classification_loss: 0.3619 144/500 [=======>......................] - ETA: 1:29 - loss: 2.0422 - regression_loss: 1.6803 - classification_loss: 0.3619 145/500 [=======>......................] - ETA: 1:29 - loss: 2.0410 - regression_loss: 1.6797 - classification_loss: 0.3613 146/500 [=======>......................] - ETA: 1:28 - loss: 2.0413 - regression_loss: 1.6800 - classification_loss: 0.3613 147/500 [=======>......................] - ETA: 1:28 - loss: 2.0340 - regression_loss: 1.6745 - classification_loss: 0.3596 148/500 [=======>......................] - ETA: 1:28 - loss: 2.0330 - regression_loss: 1.6733 - classification_loss: 0.3597 149/500 [=======>......................] - ETA: 1:28 - loss: 2.0387 - regression_loss: 1.6777 - classification_loss: 0.3610 150/500 [========>.....................] - ETA: 1:27 - loss: 2.0406 - regression_loss: 1.6789 - classification_loss: 0.3617 151/500 [========>.....................] - ETA: 1:27 - loss: 2.0403 - regression_loss: 1.6788 - classification_loss: 0.3614 152/500 [========>.....................] - ETA: 1:27 - loss: 2.0393 - regression_loss: 1.6781 - classification_loss: 0.3612 153/500 [========>.....................] - ETA: 1:27 - loss: 2.0420 - regression_loss: 1.6799 - classification_loss: 0.3621 154/500 [========>.....................] - ETA: 1:26 - loss: 2.0433 - regression_loss: 1.6810 - classification_loss: 0.3623 155/500 [========>.....................] - ETA: 1:26 - loss: 2.0395 - regression_loss: 1.6777 - classification_loss: 0.3618 156/500 [========>.....................] - ETA: 1:26 - loss: 2.0484 - regression_loss: 1.6843 - classification_loss: 0.3641 157/500 [========>.....................] - ETA: 1:26 - loss: 2.0495 - regression_loss: 1.6852 - classification_loss: 0.3643 158/500 [========>.....................] 
- ETA: 1:25 - loss: 2.0498 - regression_loss: 1.6858 - classification_loss: 0.3640 159/500 [========>.....................] - ETA: 1:25 - loss: 2.0517 - regression_loss: 1.6871 - classification_loss: 0.3646 160/500 [========>.....................] - ETA: 1:25 - loss: 2.0518 - regression_loss: 1.6872 - classification_loss: 0.3647 161/500 [========>.....................] - ETA: 1:25 - loss: 2.0548 - regression_loss: 1.6891 - classification_loss: 0.3657 162/500 [========>.....................] - ETA: 1:24 - loss: 2.0564 - regression_loss: 1.6903 - classification_loss: 0.3661 163/500 [========>.....................] - ETA: 1:24 - loss: 2.0523 - regression_loss: 1.6871 - classification_loss: 0.3652 164/500 [========>.....................] - ETA: 1:24 - loss: 2.0525 - regression_loss: 1.6875 - classification_loss: 0.3650 165/500 [========>.....................] - ETA: 1:24 - loss: 2.0509 - regression_loss: 1.6862 - classification_loss: 0.3646 166/500 [========>.....................] - ETA: 1:23 - loss: 2.0475 - regression_loss: 1.6822 - classification_loss: 0.3653 167/500 [=========>....................] - ETA: 1:23 - loss: 2.0467 - regression_loss: 1.6792 - classification_loss: 0.3676 168/500 [=========>....................] - ETA: 1:23 - loss: 2.0488 - regression_loss: 1.6809 - classification_loss: 0.3679 169/500 [=========>....................] - ETA: 1:23 - loss: 2.0526 - regression_loss: 1.6814 - classification_loss: 0.3711 170/500 [=========>....................] - ETA: 1:22 - loss: 2.0521 - regression_loss: 1.6800 - classification_loss: 0.3721 171/500 [=========>....................] - ETA: 1:22 - loss: 2.0510 - regression_loss: 1.6793 - classification_loss: 0.3717 172/500 [=========>....................] - ETA: 1:22 - loss: 2.0491 - regression_loss: 1.6774 - classification_loss: 0.3716 173/500 [=========>....................] - ETA: 1:22 - loss: 2.0493 - regression_loss: 1.6781 - classification_loss: 0.3711 174/500 [=========>....................] 
- ETA: 1:21 - loss: 2.0532 - regression_loss: 1.6817 - classification_loss: 0.3715 175/500 [=========>....................] - ETA: 1:21 - loss: 2.0526 - regression_loss: 1.6818 - classification_loss: 0.3708 176/500 [=========>....................] - ETA: 1:21 - loss: 2.0515 - regression_loss: 1.6809 - classification_loss: 0.3706 177/500 [=========>....................] - ETA: 1:21 - loss: 2.0535 - regression_loss: 1.6822 - classification_loss: 0.3712 178/500 [=========>....................] - ETA: 1:20 - loss: 2.0572 - regression_loss: 1.6847 - classification_loss: 0.3725 179/500 [=========>....................] - ETA: 1:20 - loss: 2.0586 - regression_loss: 1.6857 - classification_loss: 0.3729 180/500 [=========>....................] - ETA: 1:20 - loss: 2.0594 - regression_loss: 1.6866 - classification_loss: 0.3728 181/500 [=========>....................] - ETA: 1:20 - loss: 2.0630 - regression_loss: 1.6893 - classification_loss: 0.3737 182/500 [=========>....................] - ETA: 1:19 - loss: 2.0662 - regression_loss: 1.6923 - classification_loss: 0.3739 183/500 [=========>....................] - ETA: 1:19 - loss: 2.0658 - regression_loss: 1.6922 - classification_loss: 0.3736 184/500 [==========>...................] - ETA: 1:19 - loss: 2.0665 - regression_loss: 1.6926 - classification_loss: 0.3739 185/500 [==========>...................] - ETA: 1:19 - loss: 2.0643 - regression_loss: 1.6910 - classification_loss: 0.3733 186/500 [==========>...................] - ETA: 1:18 - loss: 2.0635 - regression_loss: 1.6907 - classification_loss: 0.3728 187/500 [==========>...................] - ETA: 1:18 - loss: 2.0615 - regression_loss: 1.6894 - classification_loss: 0.3722 188/500 [==========>...................] - ETA: 1:18 - loss: 2.0615 - regression_loss: 1.6891 - classification_loss: 0.3724 189/500 [==========>...................] - ETA: 1:18 - loss: 2.0593 - regression_loss: 1.6869 - classification_loss: 0.3723 190/500 [==========>...................] 
- ETA: 1:17 - loss: 2.0650 - regression_loss: 1.6929 - classification_loss: 0.3721 191/500 [==========>...................] - ETA: 1:17 - loss: 2.0663 - regression_loss: 1.6933 - classification_loss: 0.3731 192/500 [==========>...................] - ETA: 1:17 - loss: 2.0671 - regression_loss: 1.6939 - classification_loss: 0.3732 193/500 [==========>...................] - ETA: 1:17 - loss: 2.0671 - regression_loss: 1.6941 - classification_loss: 0.3730 194/500 [==========>...................] - ETA: 1:16 - loss: 2.0643 - regression_loss: 1.6918 - classification_loss: 0.3725 195/500 [==========>...................] - ETA: 1:16 - loss: 2.0704 - regression_loss: 1.6940 - classification_loss: 0.3765 196/500 [==========>...................] - ETA: 1:16 - loss: 2.0726 - regression_loss: 1.6953 - classification_loss: 0.3772 197/500 [==========>...................] - ETA: 1:16 - loss: 2.0719 - regression_loss: 1.6950 - classification_loss: 0.3770 198/500 [==========>...................] - ETA: 1:15 - loss: 2.0708 - regression_loss: 1.6943 - classification_loss: 0.3765 199/500 [==========>...................] - ETA: 1:15 - loss: 2.0714 - regression_loss: 1.6952 - classification_loss: 0.3763 200/500 [===========>..................] - ETA: 1:15 - loss: 2.0763 - regression_loss: 1.6995 - classification_loss: 0.3768 201/500 [===========>..................] - ETA: 1:15 - loss: 2.0773 - regression_loss: 1.7003 - classification_loss: 0.3769 202/500 [===========>..................] - ETA: 1:14 - loss: 2.0775 - regression_loss: 1.7008 - classification_loss: 0.3768 203/500 [===========>..................] - ETA: 1:14 - loss: 2.0791 - regression_loss: 1.7021 - classification_loss: 0.3769 204/500 [===========>..................] - ETA: 1:14 - loss: 2.0803 - regression_loss: 1.7034 - classification_loss: 0.3769 205/500 [===========>..................] - ETA: 1:14 - loss: 2.0808 - regression_loss: 1.7034 - classification_loss: 0.3774 206/500 [===========>..................] 
- ETA: 1:13 - loss: 2.0778 - regression_loss: 1.7010 - classification_loss: 0.3768 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0763 - regression_loss: 1.7000 - classification_loss: 0.3762 208/500 [===========>..................] - ETA: 1:13 - loss: 2.0716 - regression_loss: 1.6966 - classification_loss: 0.3750 209/500 [===========>..................] - ETA: 1:13 - loss: 2.0708 - regression_loss: 1.6962 - classification_loss: 0.3747 210/500 [===========>..................] - ETA: 1:12 - loss: 2.0694 - regression_loss: 1.6953 - classification_loss: 0.3741 211/500 [===========>..................] - ETA: 1:12 - loss: 2.0706 - regression_loss: 1.6965 - classification_loss: 0.3741 212/500 [===========>..................] - ETA: 1:12 - loss: 2.0706 - regression_loss: 1.6963 - classification_loss: 0.3743 213/500 [===========>..................] - ETA: 1:12 - loss: 2.0679 - regression_loss: 1.6936 - classification_loss: 0.3743 214/500 [===========>..................] - ETA: 1:11 - loss: 2.0652 - regression_loss: 1.6915 - classification_loss: 0.3737 215/500 [===========>..................] - ETA: 1:11 - loss: 2.0653 - regression_loss: 1.6918 - classification_loss: 0.3735 216/500 [===========>..................] - ETA: 1:11 - loss: 2.0656 - regression_loss: 1.6919 - classification_loss: 0.3738 217/500 [============>.................] - ETA: 1:11 - loss: 2.0661 - regression_loss: 1.6921 - classification_loss: 0.3740 218/500 [============>.................] - ETA: 1:10 - loss: 2.0686 - regression_loss: 1.6941 - classification_loss: 0.3745 219/500 [============>.................] - ETA: 1:10 - loss: 2.0666 - regression_loss: 1.6926 - classification_loss: 0.3740 220/500 [============>.................] - ETA: 1:10 - loss: 2.0674 - regression_loss: 1.6935 - classification_loss: 0.3738 221/500 [============>.................] - ETA: 1:10 - loss: 2.0681 - regression_loss: 1.6935 - classification_loss: 0.3746 222/500 [============>.................] 
- ETA: 1:09 - loss: 2.0686 - regression_loss: 1.6942 - classification_loss: 0.3744 223/500 [============>.................] - ETA: 1:09 - loss: 2.0675 - regression_loss: 1.6934 - classification_loss: 0.3741 224/500 [============>.................] - ETA: 1:09 - loss: 2.0662 - regression_loss: 1.6922 - classification_loss: 0.3740 225/500 [============>.................] - ETA: 1:09 - loss: 2.0676 - regression_loss: 1.6933 - classification_loss: 0.3743 226/500 [============>.................] - ETA: 1:08 - loss: 2.0664 - regression_loss: 1.6922 - classification_loss: 0.3741 227/500 [============>.................] - ETA: 1:08 - loss: 2.0669 - regression_loss: 1.6923 - classification_loss: 0.3746 228/500 [============>.................] - ETA: 1:08 - loss: 2.0667 - regression_loss: 1.6925 - classification_loss: 0.3742 229/500 [============>.................] - ETA: 1:08 - loss: 2.0691 - regression_loss: 1.6943 - classification_loss: 0.3748 230/500 [============>.................] - ETA: 1:07 - loss: 2.0687 - regression_loss: 1.6942 - classification_loss: 0.3745 231/500 [============>.................] - ETA: 1:07 - loss: 2.0704 - regression_loss: 1.6958 - classification_loss: 0.3746 232/500 [============>.................] - ETA: 1:07 - loss: 2.0674 - regression_loss: 1.6927 - classification_loss: 0.3746 233/500 [============>.................] - ETA: 1:07 - loss: 2.0668 - regression_loss: 1.6925 - classification_loss: 0.3743 234/500 [=============>................] - ETA: 1:06 - loss: 2.0641 - regression_loss: 1.6901 - classification_loss: 0.3740 235/500 [=============>................] - ETA: 1:06 - loss: 2.0607 - regression_loss: 1.6872 - classification_loss: 0.3735 236/500 [=============>................] - ETA: 1:06 - loss: 2.0595 - regression_loss: 1.6864 - classification_loss: 0.3731 237/500 [=============>................] - ETA: 1:06 - loss: 2.0585 - regression_loss: 1.6857 - classification_loss: 0.3728 238/500 [=============>................] 
- ETA: 1:05 - loss: 2.0545 - regression_loss: 1.6821 - classification_loss: 0.3724 239/500 [=============>................] - ETA: 1:05 - loss: 2.0527 - regression_loss: 1.6806 - classification_loss: 0.3721 240/500 [=============>................] - ETA: 1:05 - loss: 2.0524 - regression_loss: 1.6805 - classification_loss: 0.3719 241/500 [=============>................] - ETA: 1:05 - loss: 2.0524 - regression_loss: 1.6806 - classification_loss: 0.3718 242/500 [=============>................] - ETA: 1:04 - loss: 2.0481 - regression_loss: 1.6773 - classification_loss: 0.3708 243/500 [=============>................] - ETA: 1:04 - loss: 2.0482 - regression_loss: 1.6775 - classification_loss: 0.3707 244/500 [=============>................] - ETA: 1:04 - loss: 2.0489 - regression_loss: 1.6778 - classification_loss: 0.3711 245/500 [=============>................] - ETA: 1:04 - loss: 2.0495 - regression_loss: 1.6780 - classification_loss: 0.3715 246/500 [=============>................] - ETA: 1:03 - loss: 2.0514 - regression_loss: 1.6793 - classification_loss: 0.3721 247/500 [=============>................] - ETA: 1:03 - loss: 2.0541 - regression_loss: 1.6813 - classification_loss: 0.3728 248/500 [=============>................] - ETA: 1:03 - loss: 2.0523 - regression_loss: 1.6792 - classification_loss: 0.3731 249/500 [=============>................] - ETA: 1:03 - loss: 2.0517 - regression_loss: 1.6787 - classification_loss: 0.3729 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0516 - regression_loss: 1.6790 - classification_loss: 0.3726 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0529 - regression_loss: 1.6804 - classification_loss: 0.3725 252/500 [==============>...............] - ETA: 1:02 - loss: 2.0525 - regression_loss: 1.6805 - classification_loss: 0.3720 253/500 [==============>...............] - ETA: 1:02 - loss: 2.0527 - regression_loss: 1.6811 - classification_loss: 0.3716 254/500 [==============>...............] 
- ETA: 1:01 - loss: 2.0504 - regression_loss: 1.6794 - classification_loss: 0.3709 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0507 - regression_loss: 1.6796 - classification_loss: 0.3712 256/500 [==============>...............] - ETA: 1:01 - loss: 2.0475 - regression_loss: 1.6770 - classification_loss: 0.3706 257/500 [==============>...............] - ETA: 1:01 - loss: 2.0483 - regression_loss: 1.6776 - classification_loss: 0.3707 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0489 - regression_loss: 1.6780 - classification_loss: 0.3708 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0492 - regression_loss: 1.6783 - classification_loss: 0.3709 260/500 [==============>...............] - ETA: 1:00 - loss: 2.0510 - regression_loss: 1.6798 - classification_loss: 0.3712 261/500 [==============>...............] - ETA: 1:00 - loss: 2.0486 - regression_loss: 1.6778 - classification_loss: 0.3708 262/500 [==============>...............] - ETA: 59s - loss: 2.0474 - regression_loss: 1.6766 - classification_loss: 0.3708  263/500 [==============>...............] - ETA: 59s - loss: 2.0496 - regression_loss: 1.6781 - classification_loss: 0.3715 264/500 [==============>...............] - ETA: 59s - loss: 2.0505 - regression_loss: 1.6791 - classification_loss: 0.3714 265/500 [==============>...............] - ETA: 59s - loss: 2.0497 - regression_loss: 1.6778 - classification_loss: 0.3719 266/500 [==============>...............] - ETA: 58s - loss: 2.0504 - regression_loss: 1.6785 - classification_loss: 0.3719 267/500 [===============>..............] - ETA: 58s - loss: 2.0489 - regression_loss: 1.6773 - classification_loss: 0.3716 268/500 [===============>..............] - ETA: 58s - loss: 2.0484 - regression_loss: 1.6769 - classification_loss: 0.3715 269/500 [===============>..............] - ETA: 58s - loss: 2.0480 - regression_loss: 1.6769 - classification_loss: 0.3712 270/500 [===============>..............] 
- ETA: 57s - loss: 2.0528 - regression_loss: 1.6795 - classification_loss: 0.3733 271/500 [===============>..............] - ETA: 57s - loss: 2.0494 - regression_loss: 1.6768 - classification_loss: 0.3726 272/500 [===============>..............] - ETA: 57s - loss: 2.0494 - regression_loss: 1.6764 - classification_loss: 0.3729 273/500 [===============>..............] - ETA: 57s - loss: 2.0515 - regression_loss: 1.6784 - classification_loss: 0.3731 274/500 [===============>..............] - ETA: 56s - loss: 2.0500 - regression_loss: 1.6774 - classification_loss: 0.3726 275/500 [===============>..............] - ETA: 56s - loss: 2.0497 - regression_loss: 1.6775 - classification_loss: 0.3722 276/500 [===============>..............] - ETA: 56s - loss: 2.0518 - regression_loss: 1.6790 - classification_loss: 0.3728 277/500 [===============>..............] - ETA: 56s - loss: 2.0509 - regression_loss: 1.6785 - classification_loss: 0.3724 278/500 [===============>..............] - ETA: 55s - loss: 2.0481 - regression_loss: 1.6765 - classification_loss: 0.3717 279/500 [===============>..............] - ETA: 55s - loss: 2.0480 - regression_loss: 1.6767 - classification_loss: 0.3713 280/500 [===============>..............] - ETA: 55s - loss: 2.0493 - regression_loss: 1.6776 - classification_loss: 0.3717 281/500 [===============>..............] - ETA: 55s - loss: 2.0497 - regression_loss: 1.6780 - classification_loss: 0.3718 282/500 [===============>..............] - ETA: 54s - loss: 2.0502 - regression_loss: 1.6782 - classification_loss: 0.3719 283/500 [===============>..............] - ETA: 54s - loss: 2.0521 - regression_loss: 1.6797 - classification_loss: 0.3724 284/500 [================>.............] - ETA: 54s - loss: 2.0533 - regression_loss: 1.6810 - classification_loss: 0.3723 285/500 [================>.............] - ETA: 54s - loss: 2.0524 - regression_loss: 1.6805 - classification_loss: 0.3719 286/500 [================>.............] 
- ETA: 53s - loss: 2.0516 - regression_loss: 1.6797 - classification_loss: 0.3719 287/500 [================>.............] - ETA: 53s - loss: 2.0522 - regression_loss: 1.6803 - classification_loss: 0.3719 288/500 [================>.............] - ETA: 53s - loss: 2.0505 - regression_loss: 1.6789 - classification_loss: 0.3716 289/500 [================>.............] - ETA: 53s - loss: 2.0495 - regression_loss: 1.6780 - classification_loss: 0.3715 290/500 [================>.............] - ETA: 52s - loss: 2.0500 - regression_loss: 1.6784 - classification_loss: 0.3715 291/500 [================>.............] - ETA: 52s - loss: 2.0469 - regression_loss: 1.6761 - classification_loss: 0.3707 292/500 [================>.............] - ETA: 52s - loss: 2.0494 - regression_loss: 1.6784 - classification_loss: 0.3710 293/500 [================>.............] - ETA: 51s - loss: 2.0491 - regression_loss: 1.6780 - classification_loss: 0.3712 294/500 [================>.............] - ETA: 51s - loss: 2.0479 - regression_loss: 1.6769 - classification_loss: 0.3710 295/500 [================>.............] - ETA: 51s - loss: 2.0480 - regression_loss: 1.6766 - classification_loss: 0.3713 296/500 [================>.............] - ETA: 51s - loss: 2.0472 - regression_loss: 1.6761 - classification_loss: 0.3711 297/500 [================>.............] - ETA: 50s - loss: 2.0482 - regression_loss: 1.6770 - classification_loss: 0.3712 298/500 [================>.............] - ETA: 50s - loss: 2.0495 - regression_loss: 1.6781 - classification_loss: 0.3714 299/500 [================>.............] - ETA: 50s - loss: 2.0499 - regression_loss: 1.6783 - classification_loss: 0.3717 300/500 [=================>............] - ETA: 50s - loss: 2.0498 - regression_loss: 1.6783 - classification_loss: 0.3715 301/500 [=================>............] - ETA: 49s - loss: 2.0501 - regression_loss: 1.6784 - classification_loss: 0.3716 302/500 [=================>............] 
- ETA: 49s - loss: 2.0507 - regression_loss: 1.6788 - classification_loss: 0.3718 303/500 [=================>............] - ETA: 49s - loss: 2.0507 - regression_loss: 1.6788 - classification_loss: 0.3719 304/500 [=================>............] - ETA: 49s - loss: 2.0508 - regression_loss: 1.6786 - classification_loss: 0.3722 305/500 [=================>............] - ETA: 48s - loss: 2.0511 - regression_loss: 1.6787 - classification_loss: 0.3724 306/500 [=================>............] - ETA: 48s - loss: 2.0495 - regression_loss: 1.6776 - classification_loss: 0.3720 307/500 [=================>............] - ETA: 48s - loss: 2.0488 - regression_loss: 1.6772 - classification_loss: 0.3716 308/500 [=================>............] - ETA: 48s - loss: 2.0482 - regression_loss: 1.6769 - classification_loss: 0.3713 309/500 [=================>............] - ETA: 47s - loss: 2.0448 - regression_loss: 1.6741 - classification_loss: 0.3707 310/500 [=================>............] - ETA: 47s - loss: 2.0453 - regression_loss: 1.6747 - classification_loss: 0.3705 311/500 [=================>............] - ETA: 47s - loss: 2.0443 - regression_loss: 1.6740 - classification_loss: 0.3703 312/500 [=================>............] - ETA: 47s - loss: 2.0443 - regression_loss: 1.6740 - classification_loss: 0.3703 313/500 [=================>............] - ETA: 46s - loss: 2.0451 - regression_loss: 1.6748 - classification_loss: 0.3703 314/500 [=================>............] - ETA: 46s - loss: 2.0422 - regression_loss: 1.6722 - classification_loss: 0.3701 315/500 [=================>............] - ETA: 46s - loss: 2.0400 - regression_loss: 1.6705 - classification_loss: 0.3696 316/500 [=================>............] - ETA: 46s - loss: 2.0405 - regression_loss: 1.6710 - classification_loss: 0.3696 317/500 [==================>...........] - ETA: 45s - loss: 2.0402 - regression_loss: 1.6709 - classification_loss: 0.3694 318/500 [==================>...........] 
- ETA: 45s - loss: 2.0372 - regression_loss: 1.6685 - classification_loss: 0.3688 319/500 [==================>...........] - ETA: 45s - loss: 2.0360 - regression_loss: 1.6676 - classification_loss: 0.3684 320/500 [==================>...........] - ETA: 45s - loss: 2.0396 - regression_loss: 1.6706 - classification_loss: 0.3690 321/500 [==================>...........] - ETA: 44s - loss: 2.0410 - regression_loss: 1.6717 - classification_loss: 0.3693 322/500 [==================>...........] - ETA: 44s - loss: 2.0418 - regression_loss: 1.6722 - classification_loss: 0.3696 323/500 [==================>...........] - ETA: 44s - loss: 2.0411 - regression_loss: 1.6716 - classification_loss: 0.3696 324/500 [==================>...........] - ETA: 44s - loss: 2.0402 - regression_loss: 1.6708 - classification_loss: 0.3694 325/500 [==================>...........] - ETA: 43s - loss: 2.0405 - regression_loss: 1.6713 - classification_loss: 0.3693 326/500 [==================>...........] - ETA: 43s - loss: 2.0427 - regression_loss: 1.6729 - classification_loss: 0.3698 327/500 [==================>...........] - ETA: 43s - loss: 2.0432 - regression_loss: 1.6732 - classification_loss: 0.3700 328/500 [==================>...........] - ETA: 43s - loss: 2.0442 - regression_loss: 1.6742 - classification_loss: 0.3701 329/500 [==================>...........] - ETA: 42s - loss: 2.0470 - regression_loss: 1.6766 - classification_loss: 0.3704 330/500 [==================>...........] - ETA: 42s - loss: 2.0462 - regression_loss: 1.6760 - classification_loss: 0.3702 331/500 [==================>...........] - ETA: 42s - loss: 2.0463 - regression_loss: 1.6762 - classification_loss: 0.3701 332/500 [==================>...........] - ETA: 42s - loss: 2.0467 - regression_loss: 1.6765 - classification_loss: 0.3702 333/500 [==================>...........] - ETA: 41s - loss: 2.0471 - regression_loss: 1.6769 - classification_loss: 0.3702 334/500 [===================>..........] 
- ETA: 41s - loss: 2.0457 - regression_loss: 1.6760 - classification_loss: 0.3697 335/500 [===================>..........] - ETA: 41s - loss: 2.0454 - regression_loss: 1.6759 - classification_loss: 0.3695 336/500 [===================>..........] - ETA: 41s - loss: 2.0464 - regression_loss: 1.6769 - classification_loss: 0.3695 337/500 [===================>..........] - ETA: 40s - loss: 2.0470 - regression_loss: 1.6775 - classification_loss: 0.3695 338/500 [===================>..........] - ETA: 40s - loss: 2.0464 - regression_loss: 1.6772 - classification_loss: 0.3692 339/500 [===================>..........] - ETA: 40s - loss: 2.0465 - regression_loss: 1.6772 - classification_loss: 0.3693 340/500 [===================>..........] - ETA: 40s - loss: 2.0467 - regression_loss: 1.6775 - classification_loss: 0.3693 341/500 [===================>..........] - ETA: 39s - loss: 2.0473 - regression_loss: 1.6779 - classification_loss: 0.3694 342/500 [===================>..........] - ETA: 39s - loss: 2.0465 - regression_loss: 1.6772 - classification_loss: 0.3693 343/500 [===================>..........] - ETA: 39s - loss: 2.0471 - regression_loss: 1.6775 - classification_loss: 0.3695 344/500 [===================>..........] - ETA: 39s - loss: 2.0477 - regression_loss: 1.6782 - classification_loss: 0.3695 345/500 [===================>..........] - ETA: 38s - loss: 2.0457 - regression_loss: 1.6766 - classification_loss: 0.3692 346/500 [===================>..........] - ETA: 38s - loss: 2.0449 - regression_loss: 1.6760 - classification_loss: 0.3689 347/500 [===================>..........] - ETA: 38s - loss: 2.0449 - regression_loss: 1.6759 - classification_loss: 0.3690 348/500 [===================>..........] - ETA: 38s - loss: 2.0448 - regression_loss: 1.6759 - classification_loss: 0.3689 349/500 [===================>..........] - ETA: 37s - loss: 2.0446 - regression_loss: 1.6757 - classification_loss: 0.3689 350/500 [====================>.........] 
- ETA: 37s - loss: 2.0451 - regression_loss: 1.6763 - classification_loss: 0.3688
[per-batch progress updates for steps 351-499 of epoch 37 omitted; loss hovered around 2.04-2.05]
500/500 [==============================] - 125s 251ms/step - loss: 2.0514 - regression_loss: 1.6760 - classification_loss: 0.3754
1172 instances of class plum with average precision: 0.5335
mAP: 0.5335
Epoch 00037: saving model to ./training/snapshots/resnet50_pascal_37.h5
Epoch 38/150
[per-batch progress updates for steps 1-8 of epoch 38 omitted]
9/500 [..............................]
- ETA: 2:00 - loss: 2.1456 - regression_loss: 1.7232 - classification_loss: 0.4224
[per-batch progress updates for steps 10-184 of epoch 38 omitted; loss settled from ~2.25 toward ~2.03]
185/500 [==========>...................]
- ETA: 1:18 - loss: 2.0299 - regression_loss: 1.6594 - classification_loss: 0.3705 186/500 [==========>...................] - ETA: 1:18 - loss: 2.0316 - regression_loss: 1.6608 - classification_loss: 0.3708 187/500 [==========>...................] - ETA: 1:18 - loss: 2.0323 - regression_loss: 1.6614 - classification_loss: 0.3709 188/500 [==========>...................] - ETA: 1:17 - loss: 2.0350 - regression_loss: 1.6640 - classification_loss: 0.3709 189/500 [==========>...................] - ETA: 1:17 - loss: 2.0339 - regression_loss: 1.6633 - classification_loss: 0.3706 190/500 [==========>...................] - ETA: 1:17 - loss: 2.0298 - regression_loss: 1.6598 - classification_loss: 0.3700 191/500 [==========>...................] - ETA: 1:17 - loss: 2.0314 - regression_loss: 1.6611 - classification_loss: 0.3704 192/500 [==========>...................] - ETA: 1:17 - loss: 2.0311 - regression_loss: 1.6613 - classification_loss: 0.3698 193/500 [==========>...................] - ETA: 1:16 - loss: 2.0334 - regression_loss: 1.6631 - classification_loss: 0.3703 194/500 [==========>...................] - ETA: 1:16 - loss: 2.0310 - regression_loss: 1.6610 - classification_loss: 0.3700 195/500 [==========>...................] - ETA: 1:16 - loss: 2.0312 - regression_loss: 1.6608 - classification_loss: 0.3704 196/500 [==========>...................] - ETA: 1:16 - loss: 2.0335 - regression_loss: 1.6628 - classification_loss: 0.3707 197/500 [==========>...................] - ETA: 1:15 - loss: 2.0337 - regression_loss: 1.6629 - classification_loss: 0.3708 198/500 [==========>...................] - ETA: 1:15 - loss: 2.0349 - regression_loss: 1.6638 - classification_loss: 0.3710 199/500 [==========>...................] - ETA: 1:15 - loss: 2.0333 - regression_loss: 1.6629 - classification_loss: 0.3704 200/500 [===========>..................] - ETA: 1:15 - loss: 2.0333 - regression_loss: 1.6630 - classification_loss: 0.3703 201/500 [===========>..................] 
- ETA: 1:14 - loss: 2.0342 - regression_loss: 1.6643 - classification_loss: 0.3700 202/500 [===========>..................] - ETA: 1:14 - loss: 2.0359 - regression_loss: 1.6656 - classification_loss: 0.3704 203/500 [===========>..................] - ETA: 1:14 - loss: 2.0362 - regression_loss: 1.6660 - classification_loss: 0.3702 204/500 [===========>..................] - ETA: 1:14 - loss: 2.0355 - regression_loss: 1.6656 - classification_loss: 0.3699 205/500 [===========>..................] - ETA: 1:13 - loss: 2.0363 - regression_loss: 1.6658 - classification_loss: 0.3705 206/500 [===========>..................] - ETA: 1:13 - loss: 2.0389 - regression_loss: 1.6674 - classification_loss: 0.3715 207/500 [===========>..................] - ETA: 1:13 - loss: 2.0383 - regression_loss: 1.6671 - classification_loss: 0.3712 208/500 [===========>..................] - ETA: 1:13 - loss: 2.0342 - regression_loss: 1.6637 - classification_loss: 0.3705 209/500 [===========>..................] - ETA: 1:12 - loss: 2.0340 - regression_loss: 1.6637 - classification_loss: 0.3703 210/500 [===========>..................] - ETA: 1:12 - loss: 2.0328 - regression_loss: 1.6627 - classification_loss: 0.3701 211/500 [===========>..................] - ETA: 1:12 - loss: 2.0325 - regression_loss: 1.6621 - classification_loss: 0.3705 212/500 [===========>..................] - ETA: 1:12 - loss: 2.0347 - regression_loss: 1.6633 - classification_loss: 0.3714 213/500 [===========>..................] - ETA: 1:11 - loss: 2.0386 - regression_loss: 1.6671 - classification_loss: 0.3715 214/500 [===========>..................] - ETA: 1:11 - loss: 2.0432 - regression_loss: 1.6707 - classification_loss: 0.3724 215/500 [===========>..................] - ETA: 1:11 - loss: 2.0444 - regression_loss: 1.6720 - classification_loss: 0.3725 216/500 [===========>..................] - ETA: 1:11 - loss: 2.0472 - regression_loss: 1.6736 - classification_loss: 0.3735 217/500 [============>.................] 
- ETA: 1:10 - loss: 2.0474 - regression_loss: 1.6734 - classification_loss: 0.3740 218/500 [============>.................] - ETA: 1:10 - loss: 2.0476 - regression_loss: 1.6736 - classification_loss: 0.3740 219/500 [============>.................] - ETA: 1:10 - loss: 2.0466 - regression_loss: 1.6731 - classification_loss: 0.3736 220/500 [============>.................] - ETA: 1:10 - loss: 2.0464 - regression_loss: 1.6731 - classification_loss: 0.3733 221/500 [============>.................] - ETA: 1:09 - loss: 2.0481 - regression_loss: 1.6747 - classification_loss: 0.3734 222/500 [============>.................] - ETA: 1:09 - loss: 2.0483 - regression_loss: 1.6749 - classification_loss: 0.3734 223/500 [============>.................] - ETA: 1:09 - loss: 2.0435 - regression_loss: 1.6710 - classification_loss: 0.3725 224/500 [============>.................] - ETA: 1:09 - loss: 2.0413 - regression_loss: 1.6697 - classification_loss: 0.3716 225/500 [============>.................] - ETA: 1:08 - loss: 2.0451 - regression_loss: 1.6724 - classification_loss: 0.3726 226/500 [============>.................] - ETA: 1:08 - loss: 2.0472 - regression_loss: 1.6728 - classification_loss: 0.3744 227/500 [============>.................] - ETA: 1:08 - loss: 2.0476 - regression_loss: 1.6735 - classification_loss: 0.3741 228/500 [============>.................] - ETA: 1:08 - loss: 2.0489 - regression_loss: 1.6746 - classification_loss: 0.3744 229/500 [============>.................] - ETA: 1:07 - loss: 2.0515 - regression_loss: 1.6764 - classification_loss: 0.3750 230/500 [============>.................] - ETA: 1:07 - loss: 2.0513 - regression_loss: 1.6766 - classification_loss: 0.3748 231/500 [============>.................] - ETA: 1:07 - loss: 2.0542 - regression_loss: 1.6787 - classification_loss: 0.3755 232/500 [============>.................] - ETA: 1:07 - loss: 2.0561 - regression_loss: 1.6800 - classification_loss: 0.3761 233/500 [============>.................] 
- ETA: 1:06 - loss: 2.0544 - regression_loss: 1.6787 - classification_loss: 0.3757 234/500 [=============>................] - ETA: 1:06 - loss: 2.0532 - regression_loss: 1.6778 - classification_loss: 0.3754 235/500 [=============>................] - ETA: 1:06 - loss: 2.0496 - regression_loss: 1.6749 - classification_loss: 0.3747 236/500 [=============>................] - ETA: 1:06 - loss: 2.0489 - regression_loss: 1.6743 - classification_loss: 0.3746 237/500 [=============>................] - ETA: 1:05 - loss: 2.0459 - regression_loss: 1.6717 - classification_loss: 0.3741 238/500 [=============>................] - ETA: 1:05 - loss: 2.0465 - regression_loss: 1.6721 - classification_loss: 0.3744 239/500 [=============>................] - ETA: 1:05 - loss: 2.0480 - regression_loss: 1.6733 - classification_loss: 0.3747 240/500 [=============>................] - ETA: 1:05 - loss: 2.0448 - regression_loss: 1.6710 - classification_loss: 0.3738 241/500 [=============>................] - ETA: 1:04 - loss: 2.0447 - regression_loss: 1.6711 - classification_loss: 0.3737 242/500 [=============>................] - ETA: 1:04 - loss: 2.0451 - regression_loss: 1.6714 - classification_loss: 0.3736 243/500 [=============>................] - ETA: 1:04 - loss: 2.0461 - regression_loss: 1.6724 - classification_loss: 0.3738 244/500 [=============>................] - ETA: 1:04 - loss: 2.0463 - regression_loss: 1.6728 - classification_loss: 0.3735 245/500 [=============>................] - ETA: 1:03 - loss: 2.0446 - regression_loss: 1.6714 - classification_loss: 0.3731 246/500 [=============>................] - ETA: 1:03 - loss: 2.0449 - regression_loss: 1.6717 - classification_loss: 0.3732 247/500 [=============>................] - ETA: 1:03 - loss: 2.0470 - regression_loss: 1.6719 - classification_loss: 0.3751 248/500 [=============>................] - ETA: 1:03 - loss: 2.0480 - regression_loss: 1.6727 - classification_loss: 0.3753 249/500 [=============>................] 
- ETA: 1:02 - loss: 2.0451 - regression_loss: 1.6702 - classification_loss: 0.3749 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0469 - regression_loss: 1.6716 - classification_loss: 0.3753 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0490 - regression_loss: 1.6732 - classification_loss: 0.3758 252/500 [==============>...............] - ETA: 1:02 - loss: 2.0491 - regression_loss: 1.6736 - classification_loss: 0.3755 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0485 - regression_loss: 1.6731 - classification_loss: 0.3754 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0475 - regression_loss: 1.6714 - classification_loss: 0.3761 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0484 - regression_loss: 1.6723 - classification_loss: 0.3762 256/500 [==============>...............] - ETA: 1:01 - loss: 2.0497 - regression_loss: 1.6730 - classification_loss: 0.3767 257/500 [==============>...............] - ETA: 1:00 - loss: 2.0507 - regression_loss: 1.6737 - classification_loss: 0.3770 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0517 - regression_loss: 1.6745 - classification_loss: 0.3772 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0516 - regression_loss: 1.6748 - classification_loss: 0.3768 260/500 [==============>...............] - ETA: 1:00 - loss: 2.0527 - regression_loss: 1.6758 - classification_loss: 0.3769 261/500 [==============>...............] - ETA: 59s - loss: 2.0478 - regression_loss: 1.6718 - classification_loss: 0.3760  262/500 [==============>...............] - ETA: 59s - loss: 2.0480 - regression_loss: 1.6720 - classification_loss: 0.3759 263/500 [==============>...............] - ETA: 59s - loss: 2.0485 - regression_loss: 1.6726 - classification_loss: 0.3758 264/500 [==============>...............] - ETA: 59s - loss: 2.0478 - regression_loss: 1.6723 - classification_loss: 0.3756 265/500 [==============>...............] 
- ETA: 58s - loss: 2.0485 - regression_loss: 1.6729 - classification_loss: 0.3755 266/500 [==============>...............] - ETA: 58s - loss: 2.0489 - regression_loss: 1.6733 - classification_loss: 0.3755 267/500 [===============>..............] - ETA: 58s - loss: 2.0504 - regression_loss: 1.6749 - classification_loss: 0.3755 268/500 [===============>..............] - ETA: 58s - loss: 2.0478 - regression_loss: 1.6729 - classification_loss: 0.3749 269/500 [===============>..............] - ETA: 57s - loss: 2.0487 - regression_loss: 1.6735 - classification_loss: 0.3752 270/500 [===============>..............] - ETA: 57s - loss: 2.0492 - regression_loss: 1.6742 - classification_loss: 0.3749 271/500 [===============>..............] - ETA: 57s - loss: 2.0484 - regression_loss: 1.6737 - classification_loss: 0.3747 272/500 [===============>..............] - ETA: 57s - loss: 2.0449 - regression_loss: 1.6707 - classification_loss: 0.3742 273/500 [===============>..............] - ETA: 56s - loss: 2.0420 - regression_loss: 1.6685 - classification_loss: 0.3735 274/500 [===============>..............] - ETA: 56s - loss: 2.0408 - regression_loss: 1.6670 - classification_loss: 0.3738 275/500 [===============>..............] - ETA: 56s - loss: 2.0416 - regression_loss: 1.6677 - classification_loss: 0.3738 276/500 [===============>..............] - ETA: 56s - loss: 2.0410 - regression_loss: 1.6674 - classification_loss: 0.3736 277/500 [===============>..............] - ETA: 55s - loss: 2.0434 - regression_loss: 1.6695 - classification_loss: 0.3739 278/500 [===============>..............] - ETA: 55s - loss: 2.0432 - regression_loss: 1.6694 - classification_loss: 0.3738 279/500 [===============>..............] - ETA: 55s - loss: 2.0420 - regression_loss: 1.6684 - classification_loss: 0.3735 280/500 [===============>..............] - ETA: 55s - loss: 2.0418 - regression_loss: 1.6686 - classification_loss: 0.3732 281/500 [===============>..............] 
- ETA: 54s - loss: 2.0419 - regression_loss: 1.6688 - classification_loss: 0.3730 282/500 [===============>..............] - ETA: 54s - loss: 2.0419 - regression_loss: 1.6689 - classification_loss: 0.3729 283/500 [===============>..............] - ETA: 54s - loss: 2.0418 - regression_loss: 1.6689 - classification_loss: 0.3729 284/500 [================>.............] - ETA: 54s - loss: 2.0442 - regression_loss: 1.6709 - classification_loss: 0.3733 285/500 [================>.............] - ETA: 53s - loss: 2.0420 - regression_loss: 1.6692 - classification_loss: 0.3728 286/500 [================>.............] - ETA: 53s - loss: 2.0433 - regression_loss: 1.6703 - classification_loss: 0.3730 287/500 [================>.............] - ETA: 53s - loss: 2.0405 - regression_loss: 1.6680 - classification_loss: 0.3725 288/500 [================>.............] - ETA: 53s - loss: 2.0379 - regression_loss: 1.6659 - classification_loss: 0.3721 289/500 [================>.............] - ETA: 52s - loss: 2.0387 - regression_loss: 1.6666 - classification_loss: 0.3722 290/500 [================>.............] - ETA: 52s - loss: 2.0391 - regression_loss: 1.6668 - classification_loss: 0.3723 291/500 [================>.............] - ETA: 52s - loss: 2.0391 - regression_loss: 1.6666 - classification_loss: 0.3725 292/500 [================>.............] - ETA: 52s - loss: 2.0345 - regression_loss: 1.6629 - classification_loss: 0.3716 293/500 [================>.............] - ETA: 51s - loss: 2.0345 - regression_loss: 1.6632 - classification_loss: 0.3714 294/500 [================>.............] - ETA: 51s - loss: 2.0319 - regression_loss: 1.6610 - classification_loss: 0.3709 295/500 [================>.............] - ETA: 51s - loss: 2.0306 - regression_loss: 1.6598 - classification_loss: 0.3708 296/500 [================>.............] - ETA: 51s - loss: 2.0302 - regression_loss: 1.6594 - classification_loss: 0.3707 297/500 [================>.............] 
- ETA: 50s - loss: 2.0283 - regression_loss: 1.6579 - classification_loss: 0.3704 298/500 [================>.............] - ETA: 50s - loss: 2.0340 - regression_loss: 1.6631 - classification_loss: 0.3710 299/500 [================>.............] - ETA: 50s - loss: 2.0364 - regression_loss: 1.6649 - classification_loss: 0.3715 300/500 [=================>............] - ETA: 50s - loss: 2.0370 - regression_loss: 1.6653 - classification_loss: 0.3717 301/500 [=================>............] - ETA: 49s - loss: 2.0379 - regression_loss: 1.6664 - classification_loss: 0.3715 302/500 [=================>............] - ETA: 49s - loss: 2.0377 - regression_loss: 1.6666 - classification_loss: 0.3712 303/500 [=================>............] - ETA: 49s - loss: 2.0391 - regression_loss: 1.6675 - classification_loss: 0.3716 304/500 [=================>............] - ETA: 49s - loss: 2.0397 - regression_loss: 1.6678 - classification_loss: 0.3719 305/500 [=================>............] - ETA: 48s - loss: 2.0395 - regression_loss: 1.6678 - classification_loss: 0.3718 306/500 [=================>............] - ETA: 48s - loss: 2.0386 - regression_loss: 1.6669 - classification_loss: 0.3717 307/500 [=================>............] - ETA: 48s - loss: 2.0402 - regression_loss: 1.6682 - classification_loss: 0.3719 308/500 [=================>............] - ETA: 48s - loss: 2.0390 - regression_loss: 1.6673 - classification_loss: 0.3717 309/500 [=================>............] - ETA: 47s - loss: 2.0380 - regression_loss: 1.6666 - classification_loss: 0.3714 310/500 [=================>............] - ETA: 47s - loss: 2.0372 - regression_loss: 1.6661 - classification_loss: 0.3711 311/500 [=================>............] - ETA: 47s - loss: 2.0376 - regression_loss: 1.6665 - classification_loss: 0.3711 312/500 [=================>............] - ETA: 47s - loss: 2.0367 - regression_loss: 1.6658 - classification_loss: 0.3709 313/500 [=================>............] 
- ETA: 46s - loss: 2.0342 - regression_loss: 1.6638 - classification_loss: 0.3704 314/500 [=================>............] - ETA: 46s - loss: 2.0352 - regression_loss: 1.6645 - classification_loss: 0.3707 315/500 [=================>............] - ETA: 46s - loss: 2.0350 - regression_loss: 1.6644 - classification_loss: 0.3706 316/500 [=================>............] - ETA: 46s - loss: 2.0339 - regression_loss: 1.6634 - classification_loss: 0.3705 317/500 [==================>...........] - ETA: 45s - loss: 2.0338 - regression_loss: 1.6637 - classification_loss: 0.3701 318/500 [==================>...........] - ETA: 45s - loss: 2.0340 - regression_loss: 1.6637 - classification_loss: 0.3704 319/500 [==================>...........] - ETA: 45s - loss: 2.0354 - regression_loss: 1.6646 - classification_loss: 0.3708 320/500 [==================>...........] - ETA: 45s - loss: 2.0345 - regression_loss: 1.6639 - classification_loss: 0.3706 321/500 [==================>...........] - ETA: 44s - loss: 2.0340 - regression_loss: 1.6636 - classification_loss: 0.3704 322/500 [==================>...........] - ETA: 44s - loss: 2.0335 - regression_loss: 1.6629 - classification_loss: 0.3706 323/500 [==================>...........] - ETA: 44s - loss: 2.0321 - regression_loss: 1.6618 - classification_loss: 0.3703 324/500 [==================>...........] - ETA: 44s - loss: 2.0310 - regression_loss: 1.6610 - classification_loss: 0.3700 325/500 [==================>...........] - ETA: 43s - loss: 2.0325 - regression_loss: 1.6623 - classification_loss: 0.3701 326/500 [==================>...........] - ETA: 43s - loss: 2.0330 - regression_loss: 1.6629 - classification_loss: 0.3701 327/500 [==================>...........] - ETA: 43s - loss: 2.0348 - regression_loss: 1.6639 - classification_loss: 0.3709 328/500 [==================>...........] - ETA: 43s - loss: 2.0349 - regression_loss: 1.6635 - classification_loss: 0.3714 329/500 [==================>...........] 
- ETA: 42s - loss: 2.0355 - regression_loss: 1.6639 - classification_loss: 0.3716 330/500 [==================>...........] - ETA: 42s - loss: 2.0364 - regression_loss: 1.6646 - classification_loss: 0.3718 331/500 [==================>...........] - ETA: 42s - loss: 2.0366 - regression_loss: 1.6648 - classification_loss: 0.3718 332/500 [==================>...........] - ETA: 42s - loss: 2.0359 - regression_loss: 1.6643 - classification_loss: 0.3716 333/500 [==================>...........] - ETA: 41s - loss: 2.0344 - regression_loss: 1.6630 - classification_loss: 0.3714 334/500 [===================>..........] - ETA: 41s - loss: 2.0336 - regression_loss: 1.6625 - classification_loss: 0.3712 335/500 [===================>..........] - ETA: 41s - loss: 2.0337 - regression_loss: 1.6626 - classification_loss: 0.3711 336/500 [===================>..........] - ETA: 41s - loss: 2.0324 - regression_loss: 1.6615 - classification_loss: 0.3710 337/500 [===================>..........] - ETA: 40s - loss: 2.0331 - regression_loss: 1.6618 - classification_loss: 0.3713 338/500 [===================>..........] - ETA: 40s - loss: 2.0340 - regression_loss: 1.6625 - classification_loss: 0.3715 339/500 [===================>..........] - ETA: 40s - loss: 2.0353 - regression_loss: 1.6636 - classification_loss: 0.3717 340/500 [===================>..........] - ETA: 40s - loss: 2.0328 - regression_loss: 1.6613 - classification_loss: 0.3715 341/500 [===================>..........] - ETA: 39s - loss: 2.0316 - regression_loss: 1.6603 - classification_loss: 0.3713 342/500 [===================>..........] - ETA: 39s - loss: 2.0311 - regression_loss: 1.6599 - classification_loss: 0.3712 343/500 [===================>..........] - ETA: 39s - loss: 2.0318 - regression_loss: 1.6607 - classification_loss: 0.3711 344/500 [===================>..........] - ETA: 39s - loss: 2.0323 - regression_loss: 1.6611 - classification_loss: 0.3712 345/500 [===================>..........] 
- ETA: 38s - loss: 2.0323 - regression_loss: 1.6612 - classification_loss: 0.3711 346/500 [===================>..........] - ETA: 38s - loss: 2.0317 - regression_loss: 1.6609 - classification_loss: 0.3708 347/500 [===================>..........] - ETA: 38s - loss: 2.0313 - regression_loss: 1.6607 - classification_loss: 0.3706 348/500 [===================>..........] - ETA: 38s - loss: 2.0348 - regression_loss: 1.6632 - classification_loss: 0.3716 349/500 [===================>..........] - ETA: 37s - loss: 2.0364 - regression_loss: 1.6644 - classification_loss: 0.3720 350/500 [====================>.........] - ETA: 37s - loss: 2.0368 - regression_loss: 1.6649 - classification_loss: 0.3719 351/500 [====================>.........] - ETA: 37s - loss: 2.0384 - regression_loss: 1.6662 - classification_loss: 0.3722 352/500 [====================>.........] - ETA: 37s - loss: 2.0384 - regression_loss: 1.6661 - classification_loss: 0.3722 353/500 [====================>.........] - ETA: 36s - loss: 2.0373 - regression_loss: 1.6653 - classification_loss: 0.3720 354/500 [====================>.........] - ETA: 36s - loss: 2.0376 - regression_loss: 1.6655 - classification_loss: 0.3722 355/500 [====================>.........] - ETA: 36s - loss: 2.0346 - regression_loss: 1.6631 - classification_loss: 0.3715 356/500 [====================>.........] - ETA: 36s - loss: 2.0329 - regression_loss: 1.6618 - classification_loss: 0.3712 357/500 [====================>.........] - ETA: 35s - loss: 2.0332 - regression_loss: 1.6621 - classification_loss: 0.3711 358/500 [====================>.........] - ETA: 35s - loss: 2.0323 - regression_loss: 1.6614 - classification_loss: 0.3709 359/500 [====================>.........] - ETA: 35s - loss: 2.0321 - regression_loss: 1.6613 - classification_loss: 0.3707 360/500 [====================>.........] - ETA: 35s - loss: 2.0318 - regression_loss: 1.6613 - classification_loss: 0.3705 361/500 [====================>.........] 
- ETA: 34s - loss: 2.0308 - regression_loss: 1.6606 - classification_loss: 0.3702 362/500 [====================>.........] - ETA: 34s - loss: 2.0308 - regression_loss: 1.6609 - classification_loss: 0.3700 363/500 [====================>.........] - ETA: 34s - loss: 2.0319 - regression_loss: 1.6619 - classification_loss: 0.3700 364/500 [====================>.........] - ETA: 34s - loss: 2.0314 - regression_loss: 1.6616 - classification_loss: 0.3697 365/500 [====================>.........] - ETA: 33s - loss: 2.0307 - regression_loss: 1.6612 - classification_loss: 0.3695 366/500 [====================>.........] - ETA: 33s - loss: 2.0295 - regression_loss: 1.6604 - classification_loss: 0.3692 367/500 [=====================>........] - ETA: 33s - loss: 2.0296 - regression_loss: 1.6600 - classification_loss: 0.3696 368/500 [=====================>........] - ETA: 33s - loss: 2.0284 - regression_loss: 1.6590 - classification_loss: 0.3694 369/500 [=====================>........] - ETA: 32s - loss: 2.0257 - regression_loss: 1.6569 - classification_loss: 0.3688 370/500 [=====================>........] - ETA: 32s - loss: 2.0263 - regression_loss: 1.6574 - classification_loss: 0.3689 371/500 [=====================>........] - ETA: 32s - loss: 2.0255 - regression_loss: 1.6567 - classification_loss: 0.3689 372/500 [=====================>........] - ETA: 32s - loss: 2.0258 - regression_loss: 1.6569 - classification_loss: 0.3689 373/500 [=====================>........] - ETA: 31s - loss: 2.0265 - regression_loss: 1.6569 - classification_loss: 0.3696 374/500 [=====================>........] - ETA: 31s - loss: 2.0255 - regression_loss: 1.6563 - classification_loss: 0.3692 375/500 [=====================>........] - ETA: 31s - loss: 2.0268 - regression_loss: 1.6575 - classification_loss: 0.3693 376/500 [=====================>........] - ETA: 31s - loss: 2.0268 - regression_loss: 1.6576 - classification_loss: 0.3692 377/500 [=====================>........] 
- ETA: 30s - loss: 2.0273 - regression_loss: 1.6582 - classification_loss: 0.3691 378/500 [=====================>........] - ETA: 30s - loss: 2.0277 - regression_loss: 1.6587 - classification_loss: 0.3690 379/500 [=====================>........] - ETA: 30s - loss: 2.0264 - regression_loss: 1.6576 - classification_loss: 0.3687 380/500 [=====================>........] - ETA: 30s - loss: 2.0265 - regression_loss: 1.6578 - classification_loss: 0.3687 381/500 [=====================>........] - ETA: 29s - loss: 2.0263 - regression_loss: 1.6577 - classification_loss: 0.3686 382/500 [=====================>........] - ETA: 29s - loss: 2.0251 - regression_loss: 1.6566 - classification_loss: 0.3685 383/500 [=====================>........] - ETA: 29s - loss: 2.0264 - regression_loss: 1.6576 - classification_loss: 0.3688 384/500 [======================>.......] - ETA: 29s - loss: 2.0280 - regression_loss: 1.6582 - classification_loss: 0.3698 385/500 [======================>.......] - ETA: 28s - loss: 2.0268 - regression_loss: 1.6573 - classification_loss: 0.3695 386/500 [======================>.......] - ETA: 28s - loss: 2.0253 - regression_loss: 1.6562 - classification_loss: 0.3691 387/500 [======================>.......] - ETA: 28s - loss: 2.0266 - regression_loss: 1.6563 - classification_loss: 0.3703 388/500 [======================>.......] - ETA: 28s - loss: 2.0269 - regression_loss: 1.6565 - classification_loss: 0.3704 389/500 [======================>.......] - ETA: 27s - loss: 2.0276 - regression_loss: 1.6572 - classification_loss: 0.3704 390/500 [======================>.......] - ETA: 27s - loss: 2.0248 - regression_loss: 1.6549 - classification_loss: 0.3699 391/500 [======================>.......] - ETA: 27s - loss: 2.0253 - regression_loss: 1.6551 - classification_loss: 0.3702 392/500 [======================>.......] - ETA: 27s - loss: 2.0253 - regression_loss: 1.6551 - classification_loss: 0.3702 393/500 [======================>.......] 
- ETA: 26s - loss: 2.0242 - regression_loss: 1.6541 - classification_loss: 0.3700
[per-batch progress output for epoch 38, steps 394–499, trimmed]
500/500 [==============================] - 125s 251ms/step - loss: 2.0323 - regression_loss: 1.6628 - classification_loss: 0.3696
1172 instances of class plum with average precision: 0.5238
mAP: 0.5238
Epoch 00038: saving model to ./training/snapshots/resnet50_pascal_38.h5
Epoch 39/150
[per-batch progress output for epoch 39, steps 1–227, trimmed; running loss ≈ 2.00 (regression ≈ 1.64, classification ≈ 0.35)]
228/500 [============>.................]
- ETA: 1:07 - loss: 2.0031 - regression_loss: 1.6472 - classification_loss: 0.3559 229/500 [============>.................] - ETA: 1:07 - loss: 2.0041 - regression_loss: 1.6479 - classification_loss: 0.3562 230/500 [============>.................] - ETA: 1:07 - loss: 2.0013 - regression_loss: 1.6454 - classification_loss: 0.3559 231/500 [============>.................] - ETA: 1:06 - loss: 1.9991 - regression_loss: 1.6436 - classification_loss: 0.3555 232/500 [============>.................] - ETA: 1:06 - loss: 1.9993 - regression_loss: 1.6438 - classification_loss: 0.3555 233/500 [============>.................] - ETA: 1:06 - loss: 1.9994 - regression_loss: 1.6437 - classification_loss: 0.3557 234/500 [=============>................] - ETA: 1:06 - loss: 1.9978 - regression_loss: 1.6424 - classification_loss: 0.3554 235/500 [=============>................] - ETA: 1:05 - loss: 1.9975 - regression_loss: 1.6422 - classification_loss: 0.3553 236/500 [=============>................] - ETA: 1:05 - loss: 1.9968 - regression_loss: 1.6417 - classification_loss: 0.3550 237/500 [=============>................] - ETA: 1:05 - loss: 2.0006 - regression_loss: 1.6446 - classification_loss: 0.3560 238/500 [=============>................] - ETA: 1:05 - loss: 2.0007 - regression_loss: 1.6448 - classification_loss: 0.3559 239/500 [=============>................] - ETA: 1:05 - loss: 2.0004 - regression_loss: 1.6422 - classification_loss: 0.3582 240/500 [=============>................] - ETA: 1:04 - loss: 1.9997 - regression_loss: 1.6416 - classification_loss: 0.3582 241/500 [=============>................] - ETA: 1:04 - loss: 2.0023 - regression_loss: 1.6434 - classification_loss: 0.3590 242/500 [=============>................] - ETA: 1:04 - loss: 2.0028 - regression_loss: 1.6438 - classification_loss: 0.3590 243/500 [=============>................] - ETA: 1:04 - loss: 2.0022 - regression_loss: 1.6434 - classification_loss: 0.3589 244/500 [=============>................] 
- ETA: 1:03 - loss: 2.0069 - regression_loss: 1.6469 - classification_loss: 0.3600 245/500 [=============>................] - ETA: 1:03 - loss: 2.0085 - regression_loss: 1.6485 - classification_loss: 0.3600 246/500 [=============>................] - ETA: 1:03 - loss: 2.0075 - regression_loss: 1.6478 - classification_loss: 0.3597 247/500 [=============>................] - ETA: 1:03 - loss: 2.0099 - regression_loss: 1.6494 - classification_loss: 0.3605 248/500 [=============>................] - ETA: 1:02 - loss: 2.0065 - regression_loss: 1.6465 - classification_loss: 0.3600 249/500 [=============>................] - ETA: 1:02 - loss: 2.0068 - regression_loss: 1.6466 - classification_loss: 0.3603 250/500 [==============>...............] - ETA: 1:02 - loss: 2.0082 - regression_loss: 1.6477 - classification_loss: 0.3605 251/500 [==============>...............] - ETA: 1:02 - loss: 2.0100 - regression_loss: 1.6492 - classification_loss: 0.3608 252/500 [==============>...............] - ETA: 1:01 - loss: 2.0103 - regression_loss: 1.6492 - classification_loss: 0.3611 253/500 [==============>...............] - ETA: 1:01 - loss: 2.0131 - regression_loss: 1.6514 - classification_loss: 0.3617 254/500 [==============>...............] - ETA: 1:01 - loss: 2.0132 - regression_loss: 1.6517 - classification_loss: 0.3615 255/500 [==============>...............] - ETA: 1:01 - loss: 2.0112 - regression_loss: 1.6499 - classification_loss: 0.3613 256/500 [==============>...............] - ETA: 1:00 - loss: 2.0112 - regression_loss: 1.6500 - classification_loss: 0.3613 257/500 [==============>...............] - ETA: 1:00 - loss: 2.0102 - regression_loss: 1.6490 - classification_loss: 0.3611 258/500 [==============>...............] - ETA: 1:00 - loss: 2.0142 - regression_loss: 1.6523 - classification_loss: 0.3620 259/500 [==============>...............] - ETA: 1:00 - loss: 2.0142 - regression_loss: 1.6520 - classification_loss: 0.3622 260/500 [==============>...............] 
- ETA: 59s - loss: 2.0115 - regression_loss: 1.6499 - classification_loss: 0.3616  261/500 [==============>...............] - ETA: 59s - loss: 2.0123 - regression_loss: 1.6506 - classification_loss: 0.3617 262/500 [==============>...............] - ETA: 59s - loss: 2.0113 - regression_loss: 1.6498 - classification_loss: 0.3615 263/500 [==============>...............] - ETA: 59s - loss: 2.0142 - regression_loss: 1.6525 - classification_loss: 0.3617 264/500 [==============>...............] - ETA: 58s - loss: 2.0110 - regression_loss: 1.6498 - classification_loss: 0.3612 265/500 [==============>...............] - ETA: 58s - loss: 2.0147 - regression_loss: 1.6534 - classification_loss: 0.3613 266/500 [==============>...............] - ETA: 58s - loss: 2.0153 - regression_loss: 1.6539 - classification_loss: 0.3614 267/500 [===============>..............] - ETA: 58s - loss: 2.0130 - regression_loss: 1.6522 - classification_loss: 0.3608 268/500 [===============>..............] - ETA: 57s - loss: 2.0140 - regression_loss: 1.6531 - classification_loss: 0.3609 269/500 [===============>..............] - ETA: 57s - loss: 2.0130 - regression_loss: 1.6523 - classification_loss: 0.3607 270/500 [===============>..............] - ETA: 57s - loss: 2.0128 - regression_loss: 1.6522 - classification_loss: 0.3606 271/500 [===============>..............] - ETA: 57s - loss: 2.0141 - regression_loss: 1.6536 - classification_loss: 0.3604 272/500 [===============>..............] - ETA: 56s - loss: 2.0120 - regression_loss: 1.6516 - classification_loss: 0.3604 273/500 [===============>..............] - ETA: 56s - loss: 2.0142 - regression_loss: 1.6532 - classification_loss: 0.3610 274/500 [===============>..............] - ETA: 56s - loss: 2.0148 - regression_loss: 1.6535 - classification_loss: 0.3613 275/500 [===============>..............] - ETA: 56s - loss: 2.0139 - regression_loss: 1.6529 - classification_loss: 0.3611 276/500 [===============>..............] 
- ETA: 55s - loss: 2.0128 - regression_loss: 1.6521 - classification_loss: 0.3608 277/500 [===============>..............] - ETA: 55s - loss: 2.0120 - regression_loss: 1.6502 - classification_loss: 0.3617 278/500 [===============>..............] - ETA: 55s - loss: 2.0121 - regression_loss: 1.6501 - classification_loss: 0.3620 279/500 [===============>..............] - ETA: 55s - loss: 2.0097 - regression_loss: 1.6484 - classification_loss: 0.3613 280/500 [===============>..............] - ETA: 54s - loss: 2.0110 - regression_loss: 1.6495 - classification_loss: 0.3615 281/500 [===============>..............] - ETA: 54s - loss: 2.0098 - regression_loss: 1.6487 - classification_loss: 0.3612 282/500 [===============>..............] - ETA: 54s - loss: 2.0125 - regression_loss: 1.6511 - classification_loss: 0.3614 283/500 [===============>..............] - ETA: 54s - loss: 2.0117 - regression_loss: 1.6504 - classification_loss: 0.3613 284/500 [================>.............] - ETA: 53s - loss: 2.0086 - regression_loss: 1.6480 - classification_loss: 0.3606 285/500 [================>.............] - ETA: 53s - loss: 2.0072 - regression_loss: 1.6469 - classification_loss: 0.3603 286/500 [================>.............] - ETA: 53s - loss: 2.0068 - regression_loss: 1.6465 - classification_loss: 0.3603 287/500 [================>.............] - ETA: 53s - loss: 2.0076 - regression_loss: 1.6471 - classification_loss: 0.3605 288/500 [================>.............] - ETA: 52s - loss: 2.0076 - regression_loss: 1.6470 - classification_loss: 0.3606 289/500 [================>.............] - ETA: 52s - loss: 2.0066 - regression_loss: 1.6463 - classification_loss: 0.3604 290/500 [================>.............] - ETA: 52s - loss: 2.0076 - regression_loss: 1.6471 - classification_loss: 0.3604 291/500 [================>.............] - ETA: 52s - loss: 2.0086 - regression_loss: 1.6477 - classification_loss: 0.3609 292/500 [================>.............] 
- ETA: 51s - loss: 2.0067 - regression_loss: 1.6463 - classification_loss: 0.3604 293/500 [================>.............] - ETA: 51s - loss: 2.0071 - regression_loss: 1.6466 - classification_loss: 0.3606 294/500 [================>.............] - ETA: 51s - loss: 2.0071 - regression_loss: 1.6465 - classification_loss: 0.3605 295/500 [================>.............] - ETA: 51s - loss: 2.0070 - regression_loss: 1.6464 - classification_loss: 0.3606 296/500 [================>.............] - ETA: 50s - loss: 2.0077 - regression_loss: 1.6472 - classification_loss: 0.3605 297/500 [================>.............] - ETA: 50s - loss: 2.0062 - regression_loss: 1.6461 - classification_loss: 0.3601 298/500 [================>.............] - ETA: 50s - loss: 2.0062 - regression_loss: 1.6461 - classification_loss: 0.3601 299/500 [================>.............] - ETA: 50s - loss: 2.0074 - regression_loss: 1.6472 - classification_loss: 0.3602 300/500 [=================>............] - ETA: 49s - loss: 2.0073 - regression_loss: 1.6472 - classification_loss: 0.3601 301/500 [=================>............] - ETA: 49s - loss: 2.0089 - regression_loss: 1.6487 - classification_loss: 0.3602 302/500 [=================>............] - ETA: 49s - loss: 2.0099 - regression_loss: 1.6495 - classification_loss: 0.3604 303/500 [=================>............] - ETA: 49s - loss: 2.0087 - regression_loss: 1.6486 - classification_loss: 0.3601 304/500 [=================>............] - ETA: 48s - loss: 2.0094 - regression_loss: 1.6493 - classification_loss: 0.3602 305/500 [=================>............] - ETA: 48s - loss: 2.0099 - regression_loss: 1.6497 - classification_loss: 0.3601 306/500 [=================>............] - ETA: 48s - loss: 2.0068 - regression_loss: 1.6472 - classification_loss: 0.3596 307/500 [=================>............] - ETA: 48s - loss: 2.0075 - regression_loss: 1.6479 - classification_loss: 0.3595 308/500 [=================>............] 
- ETA: 47s - loss: 2.0087 - regression_loss: 1.6489 - classification_loss: 0.3598 309/500 [=================>............] - ETA: 47s - loss: 2.0083 - regression_loss: 1.6488 - classification_loss: 0.3595 310/500 [=================>............] - ETA: 47s - loss: 2.0080 - regression_loss: 1.6486 - classification_loss: 0.3594 311/500 [=================>............] - ETA: 47s - loss: 2.0082 - regression_loss: 1.6489 - classification_loss: 0.3593 312/500 [=================>............] - ETA: 46s - loss: 2.0091 - regression_loss: 1.6497 - classification_loss: 0.3593 313/500 [=================>............] - ETA: 46s - loss: 2.0077 - regression_loss: 1.6488 - classification_loss: 0.3589 314/500 [=================>............] - ETA: 46s - loss: 2.0081 - regression_loss: 1.6492 - classification_loss: 0.3589 315/500 [=================>............] - ETA: 46s - loss: 2.0089 - regression_loss: 1.6497 - classification_loss: 0.3592 316/500 [=================>............] - ETA: 45s - loss: 2.0104 - regression_loss: 1.6507 - classification_loss: 0.3596 317/500 [==================>...........] - ETA: 45s - loss: 2.0111 - regression_loss: 1.6512 - classification_loss: 0.3599 318/500 [==================>...........] - ETA: 45s - loss: 2.0122 - regression_loss: 1.6521 - classification_loss: 0.3601 319/500 [==================>...........] - ETA: 45s - loss: 2.0111 - regression_loss: 1.6514 - classification_loss: 0.3597 320/500 [==================>...........] - ETA: 44s - loss: 2.0108 - regression_loss: 1.6511 - classification_loss: 0.3598 321/500 [==================>...........] - ETA: 44s - loss: 2.0101 - regression_loss: 1.6504 - classification_loss: 0.3597 322/500 [==================>...........] - ETA: 44s - loss: 2.0102 - regression_loss: 1.6507 - classification_loss: 0.3595 323/500 [==================>...........] - ETA: 44s - loss: 2.0097 - regression_loss: 1.6505 - classification_loss: 0.3592 324/500 [==================>...........] 
- ETA: 43s - loss: 2.0108 - regression_loss: 1.6512 - classification_loss: 0.3595 325/500 [==================>...........] - ETA: 43s - loss: 2.0132 - regression_loss: 1.6529 - classification_loss: 0.3603 326/500 [==================>...........] - ETA: 43s - loss: 2.0132 - regression_loss: 1.6529 - classification_loss: 0.3603 327/500 [==================>...........] - ETA: 43s - loss: 2.0147 - regression_loss: 1.6541 - classification_loss: 0.3606 328/500 [==================>...........] - ETA: 42s - loss: 2.0164 - regression_loss: 1.6556 - classification_loss: 0.3608 329/500 [==================>...........] - ETA: 42s - loss: 2.0155 - regression_loss: 1.6549 - classification_loss: 0.3606 330/500 [==================>...........] - ETA: 42s - loss: 2.0165 - regression_loss: 1.6556 - classification_loss: 0.3609 331/500 [==================>...........] - ETA: 42s - loss: 2.0160 - regression_loss: 1.6553 - classification_loss: 0.3607 332/500 [==================>...........] - ETA: 41s - loss: 2.0159 - regression_loss: 1.6553 - classification_loss: 0.3606 333/500 [==================>...........] - ETA: 41s - loss: 2.0146 - regression_loss: 1.6543 - classification_loss: 0.3603 334/500 [===================>..........] - ETA: 41s - loss: 2.0144 - regression_loss: 1.6542 - classification_loss: 0.3602 335/500 [===================>..........] - ETA: 41s - loss: 2.0135 - regression_loss: 1.6531 - classification_loss: 0.3604 336/500 [===================>..........] - ETA: 40s - loss: 2.0134 - regression_loss: 1.6531 - classification_loss: 0.3602 337/500 [===================>..........] - ETA: 40s - loss: 2.0108 - regression_loss: 1.6512 - classification_loss: 0.3596 338/500 [===================>..........] - ETA: 40s - loss: 2.0122 - regression_loss: 1.6526 - classification_loss: 0.3597 339/500 [===================>..........] - ETA: 40s - loss: 2.0113 - regression_loss: 1.6519 - classification_loss: 0.3593 340/500 [===================>..........] 
- ETA: 39s - loss: 2.0105 - regression_loss: 1.6510 - classification_loss: 0.3595 341/500 [===================>..........] - ETA: 39s - loss: 2.0088 - regression_loss: 1.6499 - classification_loss: 0.3589 342/500 [===================>..........] - ETA: 39s - loss: 2.0086 - regression_loss: 1.6491 - classification_loss: 0.3595 343/500 [===================>..........] - ETA: 39s - loss: 2.0098 - regression_loss: 1.6503 - classification_loss: 0.3595 344/500 [===================>..........] - ETA: 38s - loss: 2.0113 - regression_loss: 1.6514 - classification_loss: 0.3599 345/500 [===================>..........] - ETA: 38s - loss: 2.0132 - regression_loss: 1.6531 - classification_loss: 0.3601 346/500 [===================>..........] - ETA: 38s - loss: 2.0132 - regression_loss: 1.6533 - classification_loss: 0.3599 347/500 [===================>..........] - ETA: 38s - loss: 2.0129 - regression_loss: 1.6531 - classification_loss: 0.3598 348/500 [===================>..........] - ETA: 38s - loss: 2.0130 - regression_loss: 1.6534 - classification_loss: 0.3596 349/500 [===================>..........] - ETA: 37s - loss: 2.0115 - regression_loss: 1.6522 - classification_loss: 0.3593 350/500 [====================>.........] - ETA: 37s - loss: 2.0092 - regression_loss: 1.6504 - classification_loss: 0.3589 351/500 [====================>.........] - ETA: 37s - loss: 2.0098 - regression_loss: 1.6512 - classification_loss: 0.3586 352/500 [====================>.........] - ETA: 36s - loss: 2.0103 - regression_loss: 1.6512 - classification_loss: 0.3591 353/500 [====================>.........] - ETA: 36s - loss: 2.0101 - regression_loss: 1.6512 - classification_loss: 0.3589 354/500 [====================>.........] - ETA: 36s - loss: 2.0104 - regression_loss: 1.6520 - classification_loss: 0.3584 355/500 [====================>.........] - ETA: 36s - loss: 2.0083 - regression_loss: 1.6500 - classification_loss: 0.3583 356/500 [====================>.........] 
- ETA: 36s - loss: 2.0077 - regression_loss: 1.6498 - classification_loss: 0.3579 357/500 [====================>.........] - ETA: 35s - loss: 2.0084 - regression_loss: 1.6505 - classification_loss: 0.3579 358/500 [====================>.........] - ETA: 35s - loss: 2.0077 - regression_loss: 1.6501 - classification_loss: 0.3576 359/500 [====================>.........] - ETA: 35s - loss: 2.0078 - regression_loss: 1.6503 - classification_loss: 0.3575 360/500 [====================>.........] - ETA: 34s - loss: 2.0085 - regression_loss: 1.6509 - classification_loss: 0.3576 361/500 [====================>.........] - ETA: 34s - loss: 2.0088 - regression_loss: 1.6513 - classification_loss: 0.3575 362/500 [====================>.........] - ETA: 34s - loss: 2.0101 - regression_loss: 1.6523 - classification_loss: 0.3578 363/500 [====================>.........] - ETA: 34s - loss: 2.0099 - regression_loss: 1.6522 - classification_loss: 0.3576 364/500 [====================>.........] - ETA: 33s - loss: 2.0084 - regression_loss: 1.6512 - classification_loss: 0.3572 365/500 [====================>.........] - ETA: 33s - loss: 2.0089 - regression_loss: 1.6514 - classification_loss: 0.3575 366/500 [====================>.........] - ETA: 33s - loss: 2.0099 - regression_loss: 1.6519 - classification_loss: 0.3580 367/500 [=====================>........] - ETA: 33s - loss: 2.0083 - regression_loss: 1.6506 - classification_loss: 0.3576 368/500 [=====================>........] - ETA: 32s - loss: 2.0064 - regression_loss: 1.6491 - classification_loss: 0.3573 369/500 [=====================>........] - ETA: 32s - loss: 2.0060 - regression_loss: 1.6488 - classification_loss: 0.3572 370/500 [=====================>........] - ETA: 32s - loss: 2.0067 - regression_loss: 1.6495 - classification_loss: 0.3572 371/500 [=====================>........] - ETA: 32s - loss: 2.0070 - regression_loss: 1.6498 - classification_loss: 0.3572 372/500 [=====================>........] 
- ETA: 31s - loss: 2.0067 - regression_loss: 1.6496 - classification_loss: 0.3571 373/500 [=====================>........] - ETA: 31s - loss: 2.0072 - regression_loss: 1.6493 - classification_loss: 0.3579 374/500 [=====================>........] - ETA: 31s - loss: 2.0068 - regression_loss: 1.6490 - classification_loss: 0.3578 375/500 [=====================>........] - ETA: 31s - loss: 2.0066 - regression_loss: 1.6490 - classification_loss: 0.3577 376/500 [=====================>........] - ETA: 30s - loss: 2.0069 - regression_loss: 1.6492 - classification_loss: 0.3577 377/500 [=====================>........] - ETA: 30s - loss: 2.0067 - regression_loss: 1.6491 - classification_loss: 0.3576 378/500 [=====================>........] - ETA: 30s - loss: 2.0083 - regression_loss: 1.6502 - classification_loss: 0.3580 379/500 [=====================>........] - ETA: 30s - loss: 2.0067 - regression_loss: 1.6490 - classification_loss: 0.3577 380/500 [=====================>........] - ETA: 29s - loss: 2.0051 - regression_loss: 1.6473 - classification_loss: 0.3577 381/500 [=====================>........] - ETA: 29s - loss: 2.0068 - regression_loss: 1.6485 - classification_loss: 0.3583 382/500 [=====================>........] - ETA: 29s - loss: 2.0076 - regression_loss: 1.6493 - classification_loss: 0.3583 383/500 [=====================>........] - ETA: 29s - loss: 2.0065 - regression_loss: 1.6485 - classification_loss: 0.3580 384/500 [======================>.......] - ETA: 28s - loss: 2.0066 - regression_loss: 1.6484 - classification_loss: 0.3582 385/500 [======================>.......] - ETA: 28s - loss: 2.0053 - regression_loss: 1.6474 - classification_loss: 0.3579 386/500 [======================>.......] - ETA: 28s - loss: 2.0056 - regression_loss: 1.6476 - classification_loss: 0.3580 387/500 [======================>.......] - ETA: 28s - loss: 2.0047 - regression_loss: 1.6470 - classification_loss: 0.3578 388/500 [======================>.......] 
- ETA: 27s - loss: 2.0048 - regression_loss: 1.6471 - classification_loss: 0.3577 389/500 [======================>.......] - ETA: 27s - loss: 2.0040 - regression_loss: 1.6465 - classification_loss: 0.3575 390/500 [======================>.......] - ETA: 27s - loss: 2.0046 - regression_loss: 1.6471 - classification_loss: 0.3575 391/500 [======================>.......] - ETA: 27s - loss: 2.0048 - regression_loss: 1.6473 - classification_loss: 0.3575 392/500 [======================>.......] - ETA: 26s - loss: 2.0076 - regression_loss: 1.6497 - classification_loss: 0.3579 393/500 [======================>.......] - ETA: 26s - loss: 2.0075 - regression_loss: 1.6496 - classification_loss: 0.3579 394/500 [======================>.......] - ETA: 26s - loss: 2.0077 - regression_loss: 1.6499 - classification_loss: 0.3578 395/500 [======================>.......] - ETA: 26s - loss: 2.0075 - regression_loss: 1.6498 - classification_loss: 0.3577 396/500 [======================>.......] - ETA: 25s - loss: 2.0082 - regression_loss: 1.6504 - classification_loss: 0.3579 397/500 [======================>.......] - ETA: 25s - loss: 2.0095 - regression_loss: 1.6514 - classification_loss: 0.3581 398/500 [======================>.......] - ETA: 25s - loss: 2.0092 - regression_loss: 1.6513 - classification_loss: 0.3579 399/500 [======================>.......] - ETA: 25s - loss: 2.0089 - regression_loss: 1.6513 - classification_loss: 0.3577 400/500 [=======================>......] - ETA: 24s - loss: 2.0070 - regression_loss: 1.6496 - classification_loss: 0.3574 401/500 [=======================>......] - ETA: 24s - loss: 2.0047 - regression_loss: 1.6478 - classification_loss: 0.3569 402/500 [=======================>......] - ETA: 24s - loss: 2.0019 - regression_loss: 1.6456 - classification_loss: 0.3563 403/500 [=======================>......] - ETA: 24s - loss: 2.0020 - regression_loss: 1.6457 - classification_loss: 0.3563 404/500 [=======================>......] 
- ETA: 23s - loss: 2.0022 - regression_loss: 1.6459 - classification_loss: 0.3563 405/500 [=======================>......] - ETA: 23s - loss: 2.0006 - regression_loss: 1.6448 - classification_loss: 0.3558 406/500 [=======================>......] - ETA: 23s - loss: 2.0009 - regression_loss: 1.6451 - classification_loss: 0.3559 407/500 [=======================>......] - ETA: 23s - loss: 2.0033 - regression_loss: 1.6466 - classification_loss: 0.3567 408/500 [=======================>......] - ETA: 22s - loss: 2.0035 - regression_loss: 1.6468 - classification_loss: 0.3567 409/500 [=======================>......] - ETA: 22s - loss: 2.0036 - regression_loss: 1.6466 - classification_loss: 0.3570 410/500 [=======================>......] - ETA: 22s - loss: 2.0049 - regression_loss: 1.6479 - classification_loss: 0.3570 411/500 [=======================>......] - ETA: 22s - loss: 2.0048 - regression_loss: 1.6479 - classification_loss: 0.3569 412/500 [=======================>......] - ETA: 22s - loss: 2.0057 - regression_loss: 1.6485 - classification_loss: 0.3572 413/500 [=======================>......] - ETA: 21s - loss: 2.0055 - regression_loss: 1.6482 - classification_loss: 0.3573 414/500 [=======================>......] - ETA: 21s - loss: 2.0030 - regression_loss: 1.6458 - classification_loss: 0.3572 415/500 [=======================>......] - ETA: 21s - loss: 2.0030 - regression_loss: 1.6459 - classification_loss: 0.3571 416/500 [=======================>......] - ETA: 21s - loss: 2.0025 - regression_loss: 1.6456 - classification_loss: 0.3569 417/500 [========================>.....] - ETA: 20s - loss: 2.0024 - regression_loss: 1.6457 - classification_loss: 0.3567 418/500 [========================>.....] - ETA: 20s - loss: 2.0020 - regression_loss: 1.6454 - classification_loss: 0.3566 419/500 [========================>.....] - ETA: 20s - loss: 2.0021 - regression_loss: 1.6456 - classification_loss: 0.3565 420/500 [========================>.....] 
- ETA: 20s - loss: 2.0000 - regression_loss: 1.6438 - classification_loss: 0.3562 421/500 [========================>.....] - ETA: 19s - loss: 1.9999 - regression_loss: 1.6437 - classification_loss: 0.3562 422/500 [========================>.....] - ETA: 19s - loss: 1.9992 - regression_loss: 1.6429 - classification_loss: 0.3564 423/500 [========================>.....] - ETA: 19s - loss: 1.9996 - regression_loss: 1.6430 - classification_loss: 0.3567 424/500 [========================>.....] - ETA: 19s - loss: 1.9996 - regression_loss: 1.6428 - classification_loss: 0.3568 425/500 [========================>.....] - ETA: 18s - loss: 2.0007 - regression_loss: 1.6437 - classification_loss: 0.3569 426/500 [========================>.....] - ETA: 18s - loss: 2.0002 - regression_loss: 1.6434 - classification_loss: 0.3568 427/500 [========================>.....] - ETA: 18s - loss: 2.0004 - regression_loss: 1.6435 - classification_loss: 0.3569 428/500 [========================>.....] - ETA: 18s - loss: 1.9989 - regression_loss: 1.6420 - classification_loss: 0.3568 429/500 [========================>.....] - ETA: 17s - loss: 1.9992 - regression_loss: 1.6425 - classification_loss: 0.3568 430/500 [========================>.....] - ETA: 17s - loss: 1.9979 - regression_loss: 1.6411 - classification_loss: 0.3568 431/500 [========================>.....] - ETA: 17s - loss: 1.9979 - regression_loss: 1.6413 - classification_loss: 0.3566 432/500 [========================>.....] - ETA: 17s - loss: 1.9987 - regression_loss: 1.6419 - classification_loss: 0.3568 433/500 [========================>.....] - ETA: 16s - loss: 1.9999 - regression_loss: 1.6427 - classification_loss: 0.3572 434/500 [=========================>....] - ETA: 16s - loss: 1.9992 - regression_loss: 1.6423 - classification_loss: 0.3569 435/500 [=========================>....] - ETA: 16s - loss: 1.9996 - regression_loss: 1.6428 - classification_loss: 0.3568 436/500 [=========================>....] 
[per-batch progress updates for batches 437-499 of epoch 39 omitted; loss held steady around 1.997-2.000]
500/500 [==============================] - 125s 250ms/step - loss: 1.9987 - regression_loss: 1.6435 - classification_loss: 0.3552
1172 instances of class plum with average precision: 0.4902
mAP: 0.4902
Epoch 00039: saving model to ./training/snapshots/resnet50_pascal_39.h5
Epoch 40/150
[per-batch progress updates for batches 1-270 of epoch 40 omitted; loss settled from an initial 2.45 to roughly 2.02 by batch 270]
- ETA: 57s - loss: 2.0250 - regression_loss: 1.6530 - classification_loss: 0.3719 271/500 [===============>..............] - ETA: 57s - loss: 2.0258 - regression_loss: 1.6539 - classification_loss: 0.3719 272/500 [===============>..............] - ETA: 57s - loss: 2.0266 - regression_loss: 1.6546 - classification_loss: 0.3720 273/500 [===============>..............] - ETA: 56s - loss: 2.0274 - regression_loss: 1.6547 - classification_loss: 0.3727 274/500 [===============>..............] - ETA: 56s - loss: 2.0245 - regression_loss: 1.6522 - classification_loss: 0.3724 275/500 [===============>..............] - ETA: 56s - loss: 2.0240 - regression_loss: 1.6519 - classification_loss: 0.3721 276/500 [===============>..............] - ETA: 56s - loss: 2.0246 - regression_loss: 1.6526 - classification_loss: 0.3720 277/500 [===============>..............] - ETA: 55s - loss: 2.0245 - regression_loss: 1.6525 - classification_loss: 0.3720 278/500 [===============>..............] - ETA: 55s - loss: 2.0241 - regression_loss: 1.6521 - classification_loss: 0.3720 279/500 [===============>..............] - ETA: 55s - loss: 2.0244 - regression_loss: 1.6523 - classification_loss: 0.3721 280/500 [===============>..............] - ETA: 55s - loss: 2.0248 - regression_loss: 1.6529 - classification_loss: 0.3719 281/500 [===============>..............] - ETA: 54s - loss: 2.0253 - regression_loss: 1.6532 - classification_loss: 0.3720 282/500 [===============>..............] - ETA: 54s - loss: 2.0241 - regression_loss: 1.6524 - classification_loss: 0.3717 283/500 [===============>..............] - ETA: 54s - loss: 2.0243 - regression_loss: 1.6526 - classification_loss: 0.3716 284/500 [================>.............] - ETA: 54s - loss: 2.0222 - regression_loss: 1.6507 - classification_loss: 0.3715 285/500 [================>.............] - ETA: 53s - loss: 2.0222 - regression_loss: 1.6511 - classification_loss: 0.3711 286/500 [================>.............] 
- ETA: 53s - loss: 2.0205 - regression_loss: 1.6499 - classification_loss: 0.3706 287/500 [================>.............] - ETA: 53s - loss: 2.0161 - regression_loss: 1.6464 - classification_loss: 0.3697 288/500 [================>.............] - ETA: 53s - loss: 2.0165 - regression_loss: 1.6468 - classification_loss: 0.3697 289/500 [================>.............] - ETA: 52s - loss: 2.0151 - regression_loss: 1.6459 - classification_loss: 0.3692 290/500 [================>.............] - ETA: 52s - loss: 2.0137 - regression_loss: 1.6448 - classification_loss: 0.3689 291/500 [================>.............] - ETA: 52s - loss: 2.0141 - regression_loss: 1.6452 - classification_loss: 0.3689 292/500 [================>.............] - ETA: 52s - loss: 2.0133 - regression_loss: 1.6446 - classification_loss: 0.3687 293/500 [================>.............] - ETA: 51s - loss: 2.0128 - regression_loss: 1.6443 - classification_loss: 0.3686 294/500 [================>.............] - ETA: 51s - loss: 2.0170 - regression_loss: 1.6463 - classification_loss: 0.3707 295/500 [================>.............] - ETA: 51s - loss: 2.0144 - regression_loss: 1.6441 - classification_loss: 0.3702 296/500 [================>.............] - ETA: 51s - loss: 2.0159 - regression_loss: 1.6453 - classification_loss: 0.3706 297/500 [================>.............] - ETA: 50s - loss: 2.0169 - regression_loss: 1.6462 - classification_loss: 0.3707 298/500 [================>.............] - ETA: 50s - loss: 2.0175 - regression_loss: 1.6467 - classification_loss: 0.3708 299/500 [================>.............] - ETA: 50s - loss: 2.0160 - regression_loss: 1.6458 - classification_loss: 0.3702 300/500 [=================>............] - ETA: 50s - loss: 2.0143 - regression_loss: 1.6446 - classification_loss: 0.3697 301/500 [=================>............] - ETA: 49s - loss: 2.0118 - regression_loss: 1.6429 - classification_loss: 0.3689 302/500 [=================>............] 
- ETA: 49s - loss: 2.0127 - regression_loss: 1.6437 - classification_loss: 0.3690 303/500 [=================>............] - ETA: 49s - loss: 2.0136 - regression_loss: 1.6448 - classification_loss: 0.3688 304/500 [=================>............] - ETA: 49s - loss: 2.0139 - regression_loss: 1.6451 - classification_loss: 0.3688 305/500 [=================>............] - ETA: 48s - loss: 2.0167 - regression_loss: 1.6468 - classification_loss: 0.3698 306/500 [=================>............] - ETA: 48s - loss: 2.0143 - regression_loss: 1.6447 - classification_loss: 0.3696 307/500 [=================>............] - ETA: 48s - loss: 2.0151 - regression_loss: 1.6455 - classification_loss: 0.3696 308/500 [=================>............] - ETA: 48s - loss: 2.0156 - regression_loss: 1.6459 - classification_loss: 0.3697 309/500 [=================>............] - ETA: 47s - loss: 2.0158 - regression_loss: 1.6462 - classification_loss: 0.3696 310/500 [=================>............] - ETA: 47s - loss: 2.0165 - regression_loss: 1.6468 - classification_loss: 0.3697 311/500 [=================>............] - ETA: 47s - loss: 2.0167 - regression_loss: 1.6471 - classification_loss: 0.3695 312/500 [=================>............] - ETA: 47s - loss: 2.0160 - regression_loss: 1.6469 - classification_loss: 0.3690 313/500 [=================>............] - ETA: 46s - loss: 2.0157 - regression_loss: 1.6468 - classification_loss: 0.3689 314/500 [=================>............] - ETA: 46s - loss: 2.0161 - regression_loss: 1.6474 - classification_loss: 0.3687 315/500 [=================>............] - ETA: 46s - loss: 2.0157 - regression_loss: 1.6472 - classification_loss: 0.3684 316/500 [=================>............] - ETA: 46s - loss: 2.0127 - regression_loss: 1.6447 - classification_loss: 0.3679 317/500 [==================>...........] - ETA: 45s - loss: 2.0139 - regression_loss: 1.6460 - classification_loss: 0.3679 318/500 [==================>...........] 
- ETA: 45s - loss: 2.0168 - regression_loss: 1.6478 - classification_loss: 0.3689 319/500 [==================>...........] - ETA: 45s - loss: 2.0161 - regression_loss: 1.6473 - classification_loss: 0.3688 320/500 [==================>...........] - ETA: 45s - loss: 2.0153 - regression_loss: 1.6459 - classification_loss: 0.3693 321/500 [==================>...........] - ETA: 44s - loss: 2.0165 - regression_loss: 1.6466 - classification_loss: 0.3699 322/500 [==================>...........] - ETA: 44s - loss: 2.0159 - regression_loss: 1.6464 - classification_loss: 0.3696 323/500 [==================>...........] - ETA: 44s - loss: 2.0156 - regression_loss: 1.6463 - classification_loss: 0.3693 324/500 [==================>...........] - ETA: 44s - loss: 2.0156 - regression_loss: 1.6464 - classification_loss: 0.3692 325/500 [==================>...........] - ETA: 43s - loss: 2.0171 - regression_loss: 1.6478 - classification_loss: 0.3693 326/500 [==================>...........] - ETA: 43s - loss: 2.0182 - regression_loss: 1.6485 - classification_loss: 0.3696 327/500 [==================>...........] - ETA: 43s - loss: 2.0171 - regression_loss: 1.6476 - classification_loss: 0.3694 328/500 [==================>...........] - ETA: 43s - loss: 2.0161 - regression_loss: 1.6468 - classification_loss: 0.3694 329/500 [==================>...........] - ETA: 42s - loss: 2.0170 - regression_loss: 1.6473 - classification_loss: 0.3697 330/500 [==================>...........] - ETA: 42s - loss: 2.0163 - regression_loss: 1.6467 - classification_loss: 0.3696 331/500 [==================>...........] - ETA: 42s - loss: 2.0186 - regression_loss: 1.6483 - classification_loss: 0.3703 332/500 [==================>...........] - ETA: 42s - loss: 2.0182 - regression_loss: 1.6480 - classification_loss: 0.3702 333/500 [==================>...........] - ETA: 41s - loss: 2.0169 - regression_loss: 1.6468 - classification_loss: 0.3701 334/500 [===================>..........] 
- ETA: 41s - loss: 2.0172 - regression_loss: 1.6472 - classification_loss: 0.3700 335/500 [===================>..........] - ETA: 41s - loss: 2.0168 - regression_loss: 1.6468 - classification_loss: 0.3700 336/500 [===================>..........] - ETA: 41s - loss: 2.0219 - regression_loss: 1.6473 - classification_loss: 0.3745 337/500 [===================>..........] - ETA: 40s - loss: 2.0229 - regression_loss: 1.6485 - classification_loss: 0.3745 338/500 [===================>..........] - ETA: 40s - loss: 2.0234 - regression_loss: 1.6487 - classification_loss: 0.3747 339/500 [===================>..........] - ETA: 40s - loss: 2.0248 - regression_loss: 1.6495 - classification_loss: 0.3753 340/500 [===================>..........] - ETA: 40s - loss: 2.0258 - regression_loss: 1.6497 - classification_loss: 0.3761 341/500 [===================>..........] - ETA: 39s - loss: 2.0279 - regression_loss: 1.6488 - classification_loss: 0.3792 342/500 [===================>..........] - ETA: 39s - loss: 2.0282 - regression_loss: 1.6490 - classification_loss: 0.3792 343/500 [===================>..........] - ETA: 39s - loss: 2.0275 - regression_loss: 1.6484 - classification_loss: 0.3791 344/500 [===================>..........] - ETA: 39s - loss: 2.0282 - regression_loss: 1.6489 - classification_loss: 0.3793 345/500 [===================>..........] - ETA: 38s - loss: 2.0279 - regression_loss: 1.6488 - classification_loss: 0.3791 346/500 [===================>..........] - ETA: 38s - loss: 2.0271 - regression_loss: 1.6481 - classification_loss: 0.3790 347/500 [===================>..........] - ETA: 38s - loss: 2.0317 - regression_loss: 1.6521 - classification_loss: 0.3796 348/500 [===================>..........] - ETA: 38s - loss: 2.0318 - regression_loss: 1.6515 - classification_loss: 0.3803 349/500 [===================>..........] - ETA: 37s - loss: 2.0324 - regression_loss: 1.6520 - classification_loss: 0.3804 350/500 [====================>.........] 
- ETA: 37s - loss: 2.0314 - regression_loss: 1.6510 - classification_loss: 0.3804 351/500 [====================>.........] - ETA: 37s - loss: 2.0314 - regression_loss: 1.6509 - classification_loss: 0.3805 352/500 [====================>.........] - ETA: 37s - loss: 2.0324 - regression_loss: 1.6516 - classification_loss: 0.3809 353/500 [====================>.........] - ETA: 36s - loss: 2.0322 - regression_loss: 1.6515 - classification_loss: 0.3808 354/500 [====================>.........] - ETA: 36s - loss: 2.0327 - regression_loss: 1.6519 - classification_loss: 0.3808 355/500 [====================>.........] - ETA: 36s - loss: 2.0330 - regression_loss: 1.6522 - classification_loss: 0.3808 356/500 [====================>.........] - ETA: 36s - loss: 2.0319 - regression_loss: 1.6516 - classification_loss: 0.3803 357/500 [====================>.........] - ETA: 35s - loss: 2.0330 - regression_loss: 1.6526 - classification_loss: 0.3804 358/500 [====================>.........] - ETA: 35s - loss: 2.0322 - regression_loss: 1.6521 - classification_loss: 0.3801 359/500 [====================>.........] - ETA: 35s - loss: 2.0319 - regression_loss: 1.6520 - classification_loss: 0.3798 360/500 [====================>.........] - ETA: 35s - loss: 2.0336 - regression_loss: 1.6536 - classification_loss: 0.3801 361/500 [====================>.........] - ETA: 34s - loss: 2.0340 - regression_loss: 1.6540 - classification_loss: 0.3800 362/500 [====================>.........] - ETA: 34s - loss: 2.0332 - regression_loss: 1.6531 - classification_loss: 0.3801 363/500 [====================>.........] - ETA: 34s - loss: 2.0335 - regression_loss: 1.6534 - classification_loss: 0.3801 364/500 [====================>.........] - ETA: 34s - loss: 2.0310 - regression_loss: 1.6514 - classification_loss: 0.3796 365/500 [====================>.........] - ETA: 33s - loss: 2.0310 - regression_loss: 1.6517 - classification_loss: 0.3794 366/500 [====================>.........] 
- ETA: 33s - loss: 2.0299 - regression_loss: 1.6509 - classification_loss: 0.3790 367/500 [=====================>........] - ETA: 33s - loss: 2.0292 - regression_loss: 1.6505 - classification_loss: 0.3787 368/500 [=====================>........] - ETA: 33s - loss: 2.0285 - regression_loss: 1.6500 - classification_loss: 0.3784 369/500 [=====================>........] - ETA: 32s - loss: 2.0281 - regression_loss: 1.6499 - classification_loss: 0.3783 370/500 [=====================>........] - ETA: 32s - loss: 2.0308 - regression_loss: 1.6519 - classification_loss: 0.3790 371/500 [=====================>........] - ETA: 32s - loss: 2.0312 - regression_loss: 1.6520 - classification_loss: 0.3792 372/500 [=====================>........] - ETA: 32s - loss: 2.0311 - regression_loss: 1.6517 - classification_loss: 0.3794 373/500 [=====================>........] - ETA: 31s - loss: 2.0314 - regression_loss: 1.6522 - classification_loss: 0.3792 374/500 [=====================>........] - ETA: 31s - loss: 2.0339 - regression_loss: 1.6543 - classification_loss: 0.3796 375/500 [=====================>........] - ETA: 31s - loss: 2.0330 - regression_loss: 1.6536 - classification_loss: 0.3793 376/500 [=====================>........] - ETA: 31s - loss: 2.0324 - regression_loss: 1.6534 - classification_loss: 0.3790 377/500 [=====================>........] - ETA: 30s - loss: 2.0328 - regression_loss: 1.6538 - classification_loss: 0.3791 378/500 [=====================>........] - ETA: 30s - loss: 2.0323 - regression_loss: 1.6534 - classification_loss: 0.3789 379/500 [=====================>........] - ETA: 30s - loss: 2.0307 - regression_loss: 1.6522 - classification_loss: 0.3785 380/500 [=====================>........] - ETA: 30s - loss: 2.0308 - regression_loss: 1.6523 - classification_loss: 0.3785 381/500 [=====================>........] - ETA: 29s - loss: 2.0322 - regression_loss: 1.6536 - classification_loss: 0.3786 382/500 [=====================>........] 
- ETA: 29s - loss: 2.0330 - regression_loss: 1.6543 - classification_loss: 0.3787 383/500 [=====================>........] - ETA: 29s - loss: 2.0332 - regression_loss: 1.6546 - classification_loss: 0.3787 384/500 [======================>.......] - ETA: 29s - loss: 2.0344 - regression_loss: 1.6556 - classification_loss: 0.3788 385/500 [======================>.......] - ETA: 28s - loss: 2.0355 - regression_loss: 1.6568 - classification_loss: 0.3787 386/500 [======================>.......] - ETA: 28s - loss: 2.0374 - regression_loss: 1.6584 - classification_loss: 0.3790 387/500 [======================>.......] - ETA: 28s - loss: 2.0380 - regression_loss: 1.6586 - classification_loss: 0.3794 388/500 [======================>.......] - ETA: 28s - loss: 2.0392 - regression_loss: 1.6596 - classification_loss: 0.3796 389/500 [======================>.......] - ETA: 27s - loss: 2.0396 - regression_loss: 1.6601 - classification_loss: 0.3795 390/500 [======================>.......] - ETA: 27s - loss: 2.0394 - regression_loss: 1.6602 - classification_loss: 0.3792 391/500 [======================>.......] - ETA: 27s - loss: 2.0382 - regression_loss: 1.6589 - classification_loss: 0.3793 392/500 [======================>.......] - ETA: 27s - loss: 2.0356 - regression_loss: 1.6569 - classification_loss: 0.3788 393/500 [======================>.......] - ETA: 26s - loss: 2.0357 - regression_loss: 1.6571 - classification_loss: 0.3786 394/500 [======================>.......] - ETA: 26s - loss: 2.0332 - regression_loss: 1.6553 - classification_loss: 0.3779 395/500 [======================>.......] - ETA: 26s - loss: 2.0329 - regression_loss: 1.6552 - classification_loss: 0.3777 396/500 [======================>.......] - ETA: 26s - loss: 2.0337 - regression_loss: 1.6558 - classification_loss: 0.3779 397/500 [======================>.......] - ETA: 25s - loss: 2.0331 - regression_loss: 1.6555 - classification_loss: 0.3776 398/500 [======================>.......] 
- ETA: 25s - loss: 2.0337 - regression_loss: 1.6560 - classification_loss: 0.3777 399/500 [======================>.......] - ETA: 25s - loss: 2.0341 - regression_loss: 1.6566 - classification_loss: 0.3775 400/500 [=======================>......] - ETA: 25s - loss: 2.0336 - regression_loss: 1.6564 - classification_loss: 0.3773 401/500 [=======================>......] - ETA: 24s - loss: 2.0349 - regression_loss: 1.6571 - classification_loss: 0.3778 402/500 [=======================>......] - ETA: 24s - loss: 2.0355 - regression_loss: 1.6574 - classification_loss: 0.3781 403/500 [=======================>......] - ETA: 24s - loss: 2.0364 - regression_loss: 1.6582 - classification_loss: 0.3782 404/500 [=======================>......] - ETA: 24s - loss: 2.0359 - regression_loss: 1.6579 - classification_loss: 0.3780 405/500 [=======================>......] - ETA: 23s - loss: 2.0365 - regression_loss: 1.6587 - classification_loss: 0.3778 406/500 [=======================>......] - ETA: 23s - loss: 2.0364 - regression_loss: 1.6587 - classification_loss: 0.3777 407/500 [=======================>......] - ETA: 23s - loss: 2.0365 - regression_loss: 1.6587 - classification_loss: 0.3778 408/500 [=======================>......] - ETA: 23s - loss: 2.0358 - regression_loss: 1.6583 - classification_loss: 0.3775 409/500 [=======================>......] - ETA: 22s - loss: 2.0370 - regression_loss: 1.6597 - classification_loss: 0.3772 410/500 [=======================>......] - ETA: 22s - loss: 2.0360 - regression_loss: 1.6590 - classification_loss: 0.3770 411/500 [=======================>......] - ETA: 22s - loss: 2.0344 - regression_loss: 1.6578 - classification_loss: 0.3766 412/500 [=======================>......] - ETA: 22s - loss: 2.0335 - regression_loss: 1.6571 - classification_loss: 0.3764 413/500 [=======================>......] - ETA: 21s - loss: 2.0320 - regression_loss: 1.6558 - classification_loss: 0.3762 414/500 [=======================>......] 
- ETA: 21s - loss: 2.0326 - regression_loss: 1.6565 - classification_loss: 0.3761 415/500 [=======================>......] - ETA: 21s - loss: 2.0322 - regression_loss: 1.6562 - classification_loss: 0.3760 416/500 [=======================>......] - ETA: 21s - loss: 2.0330 - regression_loss: 1.6570 - classification_loss: 0.3761 417/500 [========================>.....] - ETA: 20s - loss: 2.0342 - regression_loss: 1.6578 - classification_loss: 0.3764 418/500 [========================>.....] - ETA: 20s - loss: 2.0354 - regression_loss: 1.6588 - classification_loss: 0.3766 419/500 [========================>.....] - ETA: 20s - loss: 2.0358 - regression_loss: 1.6594 - classification_loss: 0.3765 420/500 [========================>.....] - ETA: 20s - loss: 2.0355 - regression_loss: 1.6592 - classification_loss: 0.3763 421/500 [========================>.....] - ETA: 19s - loss: 2.0351 - regression_loss: 1.6589 - classification_loss: 0.3762 422/500 [========================>.....] - ETA: 19s - loss: 2.0322 - regression_loss: 1.6565 - classification_loss: 0.3757 423/500 [========================>.....] - ETA: 19s - loss: 2.0324 - regression_loss: 1.6568 - classification_loss: 0.3756 424/500 [========================>.....] - ETA: 18s - loss: 2.0315 - regression_loss: 1.6562 - classification_loss: 0.3752 425/500 [========================>.....] - ETA: 18s - loss: 2.0309 - regression_loss: 1.6558 - classification_loss: 0.3751 426/500 [========================>.....] - ETA: 18s - loss: 2.0310 - regression_loss: 1.6559 - classification_loss: 0.3751 427/500 [========================>.....] - ETA: 18s - loss: 2.0312 - regression_loss: 1.6560 - classification_loss: 0.3752 428/500 [========================>.....] - ETA: 18s - loss: 2.0302 - regression_loss: 1.6548 - classification_loss: 0.3754 429/500 [========================>.....] - ETA: 17s - loss: 2.0297 - regression_loss: 1.6545 - classification_loss: 0.3753 430/500 [========================>.....] 
- ETA: 17s - loss: 2.0306 - regression_loss: 1.6553 - classification_loss: 0.3753 431/500 [========================>.....] - ETA: 17s - loss: 2.0295 - regression_loss: 1.6545 - classification_loss: 0.3750 432/500 [========================>.....] - ETA: 17s - loss: 2.0288 - regression_loss: 1.6539 - classification_loss: 0.3748 433/500 [========================>.....] - ETA: 16s - loss: 2.0289 - regression_loss: 1.6540 - classification_loss: 0.3749 434/500 [=========================>....] - ETA: 16s - loss: 2.0285 - regression_loss: 1.6538 - classification_loss: 0.3747 435/500 [=========================>....] - ETA: 16s - loss: 2.0285 - regression_loss: 1.6538 - classification_loss: 0.3747 436/500 [=========================>....] - ETA: 16s - loss: 2.0290 - regression_loss: 1.6543 - classification_loss: 0.3748 437/500 [=========================>....] - ETA: 15s - loss: 2.0289 - regression_loss: 1.6542 - classification_loss: 0.3746 438/500 [=========================>....] - ETA: 15s - loss: 2.0297 - regression_loss: 1.6548 - classification_loss: 0.3748 439/500 [=========================>....] - ETA: 15s - loss: 2.0303 - regression_loss: 1.6555 - classification_loss: 0.3748 440/500 [=========================>....] - ETA: 15s - loss: 2.0314 - regression_loss: 1.6563 - classification_loss: 0.3751 441/500 [=========================>....] - ETA: 14s - loss: 2.0317 - regression_loss: 1.6566 - classification_loss: 0.3750 442/500 [=========================>....] - ETA: 14s - loss: 2.0296 - regression_loss: 1.6550 - classification_loss: 0.3745 443/500 [=========================>....] - ETA: 14s - loss: 2.0292 - regression_loss: 1.6549 - classification_loss: 0.3743 444/500 [=========================>....] - ETA: 14s - loss: 2.0306 - regression_loss: 1.6559 - classification_loss: 0.3747 445/500 [=========================>....] - ETA: 13s - loss: 2.0309 - regression_loss: 1.6562 - classification_loss: 0.3747 446/500 [=========================>....] 
- ETA: 13s - loss: 2.0326 - regression_loss: 1.6564 - classification_loss: 0.3762 447/500 [=========================>....] - ETA: 13s - loss: 2.0330 - regression_loss: 1.6566 - classification_loss: 0.3764 448/500 [=========================>....] - ETA: 13s - loss: 2.0328 - regression_loss: 1.6565 - classification_loss: 0.3763 449/500 [=========================>....] - ETA: 12s - loss: 2.0321 - regression_loss: 1.6560 - classification_loss: 0.3761 450/500 [==========================>...] - ETA: 12s - loss: 2.0318 - regression_loss: 1.6556 - classification_loss: 0.3762 451/500 [==========================>...] - ETA: 12s - loss: 2.0313 - regression_loss: 1.6553 - classification_loss: 0.3760 452/500 [==========================>...] - ETA: 12s - loss: 2.0296 - regression_loss: 1.6539 - classification_loss: 0.3757 453/500 [==========================>...] - ETA: 11s - loss: 2.0287 - regression_loss: 1.6532 - classification_loss: 0.3755 454/500 [==========================>...] - ETA: 11s - loss: 2.0284 - regression_loss: 1.6527 - classification_loss: 0.3757 455/500 [==========================>...] - ETA: 11s - loss: 2.0289 - regression_loss: 1.6531 - classification_loss: 0.3758 456/500 [==========================>...] - ETA: 11s - loss: 2.0283 - regression_loss: 1.6526 - classification_loss: 0.3757 457/500 [==========================>...] - ETA: 10s - loss: 2.0280 - regression_loss: 1.6523 - classification_loss: 0.3756 458/500 [==========================>...] - ETA: 10s - loss: 2.0284 - regression_loss: 1.6526 - classification_loss: 0.3758 459/500 [==========================>...] - ETA: 10s - loss: 2.0293 - regression_loss: 1.6533 - classification_loss: 0.3760 460/500 [==========================>...] - ETA: 10s - loss: 2.0305 - regression_loss: 1.6542 - classification_loss: 0.3764 461/500 [==========================>...] - ETA: 9s - loss: 2.0307 - regression_loss: 1.6541 - classification_loss: 0.3767  462/500 [==========================>...] 
- ETA: 9s - loss: 2.0302 - regression_loss: 1.6537 - classification_loss: 0.3765 463/500 [==========================>...] - ETA: 9s - loss: 2.0306 - regression_loss: 1.6542 - classification_loss: 0.3764 464/500 [==========================>...] - ETA: 9s - loss: 2.0303 - regression_loss: 1.6540 - classification_loss: 0.3763 465/500 [==========================>...] - ETA: 8s - loss: 2.0300 - regression_loss: 1.6540 - classification_loss: 0.3761 466/500 [==========================>...] - ETA: 8s - loss: 2.0288 - regression_loss: 1.6531 - classification_loss: 0.3758 467/500 [===========================>..] - ETA: 8s - loss: 2.0286 - regression_loss: 1.6529 - classification_loss: 0.3758 468/500 [===========================>..] - ETA: 8s - loss: 2.0289 - regression_loss: 1.6529 - classification_loss: 0.3760 469/500 [===========================>..] - ETA: 7s - loss: 2.0288 - regression_loss: 1.6527 - classification_loss: 0.3761 470/500 [===========================>..] - ETA: 7s - loss: 2.0284 - regression_loss: 1.6525 - classification_loss: 0.3759 471/500 [===========================>..] - ETA: 7s - loss: 2.0274 - regression_loss: 1.6517 - classification_loss: 0.3757 472/500 [===========================>..] - ETA: 7s - loss: 2.0269 - regression_loss: 1.6512 - classification_loss: 0.3756 473/500 [===========================>..] - ETA: 6s - loss: 2.0273 - regression_loss: 1.6517 - classification_loss: 0.3756 474/500 [===========================>..] - ETA: 6s - loss: 2.0297 - regression_loss: 1.6537 - classification_loss: 0.3760 475/500 [===========================>..] - ETA: 6s - loss: 2.0293 - regression_loss: 1.6535 - classification_loss: 0.3758 476/500 [===========================>..] - ETA: 6s - loss: 2.0290 - regression_loss: 1.6533 - classification_loss: 0.3757 477/500 [===========================>..] - ETA: 5s - loss: 2.0292 - regression_loss: 1.6536 - classification_loss: 0.3756 478/500 [===========================>..] 
- ETA: 5s - loss: 2.0287 - regression_loss: 1.6532 - classification_loss: 0.3755 479/500 [===========================>..] - ETA: 5s - loss: 2.0293 - regression_loss: 1.6539 - classification_loss: 0.3753 480/500 [===========================>..] - ETA: 5s - loss: 2.0293 - regression_loss: 1.6541 - classification_loss: 0.3753 481/500 [===========================>..] - ETA: 4s - loss: 2.0300 - regression_loss: 1.6545 - classification_loss: 0.3754 482/500 [===========================>..] - ETA: 4s - loss: 2.0305 - regression_loss: 1.6550 - classification_loss: 0.3755 483/500 [===========================>..] - ETA: 4s - loss: 2.0307 - regression_loss: 1.6553 - classification_loss: 0.3754 484/500 [============================>.] - ETA: 4s - loss: 2.0305 - regression_loss: 1.6553 - classification_loss: 0.3752 485/500 [============================>.] - ETA: 3s - loss: 2.0313 - regression_loss: 1.6561 - classification_loss: 0.3752 486/500 [============================>.] - ETA: 3s - loss: 2.0303 - regression_loss: 1.6552 - classification_loss: 0.3751 487/500 [============================>.] - ETA: 3s - loss: 2.0283 - regression_loss: 1.6534 - classification_loss: 0.3749 488/500 [============================>.] - ETA: 3s - loss: 2.0272 - regression_loss: 1.6523 - classification_loss: 0.3749 489/500 [============================>.] - ETA: 2s - loss: 2.0270 - regression_loss: 1.6521 - classification_loss: 0.3748 490/500 [============================>.] - ETA: 2s - loss: 2.0260 - regression_loss: 1.6512 - classification_loss: 0.3748 491/500 [============================>.] - ETA: 2s - loss: 2.0267 - regression_loss: 1.6518 - classification_loss: 0.3749 492/500 [============================>.] - ETA: 2s - loss: 2.0263 - regression_loss: 1.6515 - classification_loss: 0.3748 493/500 [============================>.] - ETA: 1s - loss: 2.0258 - regression_loss: 1.6514 - classification_loss: 0.3745 494/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 2.0322 - regression_loss: 1.6567 - classification_loss: 0.3754
1172 instances of class plum with average precision: 0.5389
mAP: 0.5389
Epoch 00040: saving model to ./training/snapshots/resnet50_pascal_40.h5
Epoch 41/150
9/500 [..............................]
- ETA: 2:03 - loss: 2.0252 - regression_loss: 1.6610 - classification_loss: 0.3642 10/500 [..............................] - ETA: 2:03 - loss: 2.0399 - regression_loss: 1.6656 - classification_loss: 0.3743 11/500 [..............................] - ETA: 2:02 - loss: 1.9904 - regression_loss: 1.6069 - classification_loss: 0.3835 12/500 [..............................] - ETA: 2:01 - loss: 2.0030 - regression_loss: 1.6206 - classification_loss: 0.3824 13/500 [..............................] - ETA: 2:00 - loss: 2.0554 - regression_loss: 1.6567 - classification_loss: 0.3986 14/500 [..............................] - ETA: 2:01 - loss: 2.0834 - regression_loss: 1.6818 - classification_loss: 0.4016 15/500 [..............................] - ETA: 2:00 - loss: 2.0901 - regression_loss: 1.6907 - classification_loss: 0.3993 16/500 [..............................] - ETA: 2:00 - loss: 2.0960 - regression_loss: 1.6927 - classification_loss: 0.4033 17/500 [>.............................] - ETA: 2:00 - loss: 2.1000 - regression_loss: 1.7012 - classification_loss: 0.3988 18/500 [>.............................] - ETA: 2:00 - loss: 2.0949 - regression_loss: 1.6976 - classification_loss: 0.3973 19/500 [>.............................] - ETA: 2:00 - loss: 2.0786 - regression_loss: 1.6864 - classification_loss: 0.3922 20/500 [>.............................] - ETA: 2:00 - loss: 2.0853 - regression_loss: 1.6963 - classification_loss: 0.3890 21/500 [>.............................] - ETA: 2:00 - loss: 2.0884 - regression_loss: 1.6991 - classification_loss: 0.3894 22/500 [>.............................] - ETA: 1:59 - loss: 2.0672 - regression_loss: 1.6817 - classification_loss: 0.3856 23/500 [>.............................] - ETA: 1:59 - loss: 2.0534 - regression_loss: 1.6651 - classification_loss: 0.3883 24/500 [>.............................] - ETA: 1:59 - loss: 2.0295 - regression_loss: 1.6461 - classification_loss: 0.3834 25/500 [>.............................] 
- ETA: 1:59 - loss: 2.0134 - regression_loss: 1.6331 - classification_loss: 0.3804 26/500 [>.............................] - ETA: 1:59 - loss: 2.0317 - regression_loss: 1.6458 - classification_loss: 0.3859 27/500 [>.............................] - ETA: 1:58 - loss: 2.0442 - regression_loss: 1.6559 - classification_loss: 0.3883 28/500 [>.............................] - ETA: 1:58 - loss: 2.0429 - regression_loss: 1.6568 - classification_loss: 0.3862 29/500 [>.............................] - ETA: 1:58 - loss: 2.0350 - regression_loss: 1.6508 - classification_loss: 0.3842 30/500 [>.............................] - ETA: 1:57 - loss: 2.0349 - regression_loss: 1.6513 - classification_loss: 0.3836 31/500 [>.............................] - ETA: 1:57 - loss: 2.0297 - regression_loss: 1.6510 - classification_loss: 0.3788 32/500 [>.............................] - ETA: 1:57 - loss: 2.0455 - regression_loss: 1.6656 - classification_loss: 0.3798 33/500 [>.............................] - ETA: 1:57 - loss: 2.0258 - regression_loss: 1.6489 - classification_loss: 0.3769 34/500 [=>............................] - ETA: 1:57 - loss: 2.0380 - regression_loss: 1.6605 - classification_loss: 0.3775 35/500 [=>............................] - ETA: 1:56 - loss: 2.0554 - regression_loss: 1.6718 - classification_loss: 0.3835 36/500 [=>............................] - ETA: 1:56 - loss: 2.0805 - regression_loss: 1.6958 - classification_loss: 0.3847 37/500 [=>............................] - ETA: 1:56 - loss: 2.0917 - regression_loss: 1.7043 - classification_loss: 0.3874 38/500 [=>............................] - ETA: 1:56 - loss: 2.0909 - regression_loss: 1.7019 - classification_loss: 0.3890 39/500 [=>............................] - ETA: 1:55 - loss: 2.0856 - regression_loss: 1.6969 - classification_loss: 0.3887 40/500 [=>............................] - ETA: 1:55 - loss: 2.0835 - regression_loss: 1.6977 - classification_loss: 0.3858 41/500 [=>............................] 
- ETA: 1:55 - loss: 2.0739 - regression_loss: 1.6909 - classification_loss: 0.3830 42/500 [=>............................] - ETA: 1:55 - loss: 2.0846 - regression_loss: 1.6986 - classification_loss: 0.3859 43/500 [=>............................] - ETA: 1:55 - loss: 2.0772 - regression_loss: 1.6920 - classification_loss: 0.3852 44/500 [=>............................] - ETA: 1:54 - loss: 2.0745 - regression_loss: 1.6873 - classification_loss: 0.3872 45/500 [=>............................] - ETA: 1:54 - loss: 2.0719 - regression_loss: 1.6850 - classification_loss: 0.3869 46/500 [=>............................] - ETA: 1:54 - loss: 2.0550 - regression_loss: 1.6715 - classification_loss: 0.3835 47/500 [=>............................] - ETA: 1:53 - loss: 2.0584 - regression_loss: 1.6740 - classification_loss: 0.3844 48/500 [=>............................] - ETA: 1:53 - loss: 2.0571 - regression_loss: 1.6740 - classification_loss: 0.3831 49/500 [=>............................] - ETA: 1:53 - loss: 2.0480 - regression_loss: 1.6663 - classification_loss: 0.3817 50/500 [==>...........................] - ETA: 1:53 - loss: 2.0478 - regression_loss: 1.6669 - classification_loss: 0.3809 51/500 [==>...........................] - ETA: 1:52 - loss: 2.0556 - regression_loss: 1.6734 - classification_loss: 0.3821 52/500 [==>...........................] - ETA: 1:52 - loss: 2.0856 - regression_loss: 1.6808 - classification_loss: 0.4048 53/500 [==>...........................] - ETA: 1:52 - loss: 2.0721 - regression_loss: 1.6705 - classification_loss: 0.4016 54/500 [==>...........................] - ETA: 1:51 - loss: 2.0736 - regression_loss: 1.6727 - classification_loss: 0.4009 55/500 [==>...........................] - ETA: 1:50 - loss: 2.0747 - regression_loss: 1.6743 - classification_loss: 0.4004 56/500 [==>...........................] - ETA: 1:50 - loss: 2.0792 - regression_loss: 1.6791 - classification_loss: 0.4001 57/500 [==>...........................] 
- ETA: 1:49 - loss: 2.0790 - regression_loss: 1.6780 - classification_loss: 0.4011 58/500 [==>...........................] - ETA: 1:49 - loss: 2.0695 - regression_loss: 1.6716 - classification_loss: 0.3979 59/500 [==>...........................] - ETA: 1:48 - loss: 2.0783 - regression_loss: 1.6782 - classification_loss: 0.4001 60/500 [==>...........................] - ETA: 1:48 - loss: 2.0737 - regression_loss: 1.6750 - classification_loss: 0.3987 61/500 [==>...........................] - ETA: 1:48 - loss: 2.0767 - regression_loss: 1.6772 - classification_loss: 0.3995 62/500 [==>...........................] - ETA: 1:47 - loss: 2.0595 - regression_loss: 1.6641 - classification_loss: 0.3955 63/500 [==>...........................] - ETA: 1:47 - loss: 2.0608 - regression_loss: 1.6655 - classification_loss: 0.3953 64/500 [==>...........................] - ETA: 1:46 - loss: 2.0602 - regression_loss: 1.6657 - classification_loss: 0.3944 65/500 [==>...........................] - ETA: 1:46 - loss: 2.0463 - regression_loss: 1.6551 - classification_loss: 0.3911 66/500 [==>...........................] - ETA: 1:45 - loss: 2.0548 - regression_loss: 1.6616 - classification_loss: 0.3932 67/500 [===>..........................] - ETA: 1:45 - loss: 2.0560 - regression_loss: 1.6635 - classification_loss: 0.3924 68/500 [===>..........................] - ETA: 1:45 - loss: 2.0569 - regression_loss: 1.6648 - classification_loss: 0.3921 69/500 [===>..........................] - ETA: 1:44 - loss: 2.0624 - regression_loss: 1.6692 - classification_loss: 0.3932 70/500 [===>..........................] - ETA: 1:44 - loss: 2.0584 - regression_loss: 1.6666 - classification_loss: 0.3918 71/500 [===>..........................] - ETA: 1:43 - loss: 2.0595 - regression_loss: 1.6671 - classification_loss: 0.3924 72/500 [===>..........................] - ETA: 1:43 - loss: 2.0520 - regression_loss: 1.6611 - classification_loss: 0.3908 73/500 [===>..........................] 
- ETA: 1:42 - loss: 2.0495 - regression_loss: 1.6584 - classification_loss: 0.3911 74/500 [===>..........................] - ETA: 1:42 - loss: 2.0504 - regression_loss: 1.6596 - classification_loss: 0.3908 75/500 [===>..........................] - ETA: 1:42 - loss: 2.0424 - regression_loss: 1.6506 - classification_loss: 0.3918 76/500 [===>..........................] - ETA: 1:41 - loss: 2.0322 - regression_loss: 1.6417 - classification_loss: 0.3905 77/500 [===>..........................] - ETA: 1:41 - loss: 2.0377 - regression_loss: 1.6460 - classification_loss: 0.3917 78/500 [===>..........................] - ETA: 1:41 - loss: 2.0399 - regression_loss: 1.6481 - classification_loss: 0.3918 79/500 [===>..........................] - ETA: 1:40 - loss: 2.0363 - regression_loss: 1.6463 - classification_loss: 0.3900 80/500 [===>..........................] - ETA: 1:40 - loss: 2.0382 - regression_loss: 1.6482 - classification_loss: 0.3900 81/500 [===>..........................] - ETA: 1:40 - loss: 2.0346 - regression_loss: 1.6467 - classification_loss: 0.3879 82/500 [===>..........................] - ETA: 1:39 - loss: 2.0367 - regression_loss: 1.6498 - classification_loss: 0.3870 83/500 [===>..........................] - ETA: 1:39 - loss: 2.0381 - regression_loss: 1.6509 - classification_loss: 0.3872 84/500 [====>.........................] - ETA: 1:39 - loss: 2.0320 - regression_loss: 1.6451 - classification_loss: 0.3869 85/500 [====>.........................] - ETA: 1:38 - loss: 2.0365 - regression_loss: 1.6484 - classification_loss: 0.3882 86/500 [====>.........................] - ETA: 1:38 - loss: 2.0305 - regression_loss: 1.6440 - classification_loss: 0.3865 87/500 [====>.........................] - ETA: 1:38 - loss: 2.0294 - regression_loss: 1.6435 - classification_loss: 0.3858 88/500 [====>.........................] - ETA: 1:37 - loss: 2.0243 - regression_loss: 1.6393 - classification_loss: 0.3850 89/500 [====>.........................] 
- ETA: 1:37 - loss: 2.0312 - regression_loss: 1.6461 - classification_loss: 0.3850 90/500 [====>.........................] - ETA: 1:36 - loss: 2.0262 - regression_loss: 1.6429 - classification_loss: 0.3833 91/500 [====>.........................] - ETA: 1:36 - loss: 2.0238 - regression_loss: 1.6417 - classification_loss: 0.3822 92/500 [====>.........................] - ETA: 1:36 - loss: 2.0239 - regression_loss: 1.6421 - classification_loss: 0.3819 93/500 [====>.........................] - ETA: 1:35 - loss: 2.0330 - regression_loss: 1.6504 - classification_loss: 0.3826 94/500 [====>.........................] - ETA: 1:35 - loss: 2.0358 - regression_loss: 1.6528 - classification_loss: 0.3830 95/500 [====>.........................] - ETA: 1:35 - loss: 2.0300 - regression_loss: 1.6475 - classification_loss: 0.3825 96/500 [====>.........................] - ETA: 1:35 - loss: 2.0245 - regression_loss: 1.6431 - classification_loss: 0.3814 97/500 [====>.........................] - ETA: 1:34 - loss: 2.0265 - regression_loss: 1.6455 - classification_loss: 0.3810 98/500 [====>.........................] - ETA: 1:34 - loss: 2.0276 - regression_loss: 1.6469 - classification_loss: 0.3807 99/500 [====>.........................] - ETA: 1:34 - loss: 2.0320 - regression_loss: 1.6506 - classification_loss: 0.3815 100/500 [=====>........................] - ETA: 1:33 - loss: 2.0327 - regression_loss: 1.6518 - classification_loss: 0.3809 101/500 [=====>........................] - ETA: 1:33 - loss: 2.0229 - regression_loss: 1.6436 - classification_loss: 0.3793 102/500 [=====>........................] - ETA: 1:33 - loss: 2.0131 - regression_loss: 1.6356 - classification_loss: 0.3775 103/500 [=====>........................] - ETA: 1:32 - loss: 2.0078 - regression_loss: 1.6313 - classification_loss: 0.3765 104/500 [=====>........................] - ETA: 1:32 - loss: 2.0119 - regression_loss: 1.6355 - classification_loss: 0.3764 105/500 [=====>........................] 
- ETA: 1:32 - loss: 2.0224 - regression_loss: 1.6449 - classification_loss: 0.3775 106/500 [=====>........................] - ETA: 1:31 - loss: 2.0155 - regression_loss: 1.6396 - classification_loss: 0.3758 107/500 [=====>........................] - ETA: 1:31 - loss: 2.0151 - regression_loss: 1.6396 - classification_loss: 0.3756 108/500 [=====>........................] - ETA: 1:31 - loss: 2.0156 - regression_loss: 1.6408 - classification_loss: 0.3749 109/500 [=====>........................] - ETA: 1:31 - loss: 2.0138 - regression_loss: 1.6398 - classification_loss: 0.3740 110/500 [=====>........................] - ETA: 1:30 - loss: 2.0121 - regression_loss: 1.6381 - classification_loss: 0.3740 111/500 [=====>........................] - ETA: 1:30 - loss: 1.9999 - regression_loss: 1.6282 - classification_loss: 0.3717 112/500 [=====>........................] - ETA: 1:30 - loss: 1.9996 - regression_loss: 1.6288 - classification_loss: 0.3708 113/500 [=====>........................] - ETA: 1:29 - loss: 2.0055 - regression_loss: 1.6340 - classification_loss: 0.3716 114/500 [=====>........................] - ETA: 1:29 - loss: 2.0049 - regression_loss: 1.6339 - classification_loss: 0.3710 115/500 [=====>........................] - ETA: 1:29 - loss: 2.0028 - regression_loss: 1.6323 - classification_loss: 0.3704 116/500 [=====>........................] - ETA: 1:29 - loss: 1.9933 - regression_loss: 1.6253 - classification_loss: 0.3679 117/500 [======>.......................] - ETA: 1:28 - loss: 1.9946 - regression_loss: 1.6267 - classification_loss: 0.3680 118/500 [======>.......................] - ETA: 1:28 - loss: 1.9977 - regression_loss: 1.6286 - classification_loss: 0.3691 119/500 [======>.......................] - ETA: 1:28 - loss: 1.9942 - regression_loss: 1.6260 - classification_loss: 0.3682 120/500 [======>.......................] - ETA: 1:28 - loss: 1.9858 - regression_loss: 1.6187 - classification_loss: 0.3672 121/500 [======>.......................] 
- ETA: 1:27 - loss: 1.9894 - regression_loss: 1.6220 - classification_loss: 0.3674 122/500 [======>.......................] - ETA: 1:27 - loss: 1.9844 - regression_loss: 1.6174 - classification_loss: 0.3671 123/500 [======>.......................] - ETA: 1:27 - loss: 1.9879 - regression_loss: 1.6199 - classification_loss: 0.3680 124/500 [======>.......................] - ETA: 1:26 - loss: 1.9924 - regression_loss: 1.6229 - classification_loss: 0.3695 125/500 [======>.......................] - ETA: 1:26 - loss: 1.9963 - regression_loss: 1.6258 - classification_loss: 0.3705 126/500 [======>.......................] - ETA: 1:26 - loss: 1.9928 - regression_loss: 1.6233 - classification_loss: 0.3695 127/500 [======>.......................] - ETA: 1:26 - loss: 1.9904 - regression_loss: 1.6215 - classification_loss: 0.3690 128/500 [======>.......................] - ETA: 1:25 - loss: 1.9938 - regression_loss: 1.6243 - classification_loss: 0.3695 129/500 [======>.......................] - ETA: 1:25 - loss: 1.9925 - regression_loss: 1.6238 - classification_loss: 0.3687 130/500 [======>.......................] - ETA: 1:25 - loss: 1.9940 - regression_loss: 1.6248 - classification_loss: 0.3692 131/500 [======>.......................] - ETA: 1:25 - loss: 1.9923 - regression_loss: 1.6237 - classification_loss: 0.3687 132/500 [======>.......................] - ETA: 1:25 - loss: 1.9934 - regression_loss: 1.6250 - classification_loss: 0.3684 133/500 [======>.......................] - ETA: 1:24 - loss: 1.9955 - regression_loss: 1.6267 - classification_loss: 0.3688 134/500 [=======>......................] - ETA: 1:24 - loss: 1.9976 - regression_loss: 1.6286 - classification_loss: 0.3691 135/500 [=======>......................] - ETA: 1:24 - loss: 1.9937 - regression_loss: 1.6260 - classification_loss: 0.3677 136/500 [=======>......................] - ETA: 1:24 - loss: 1.9962 - regression_loss: 1.6285 - classification_loss: 0.3676 137/500 [=======>......................] 
- ETA: 1:24 - loss: 1.9965 - regression_loss: 1.6292 - classification_loss: 0.3673 138/500 [=======>......................] - ETA: 1:23 - loss: 1.9987 - regression_loss: 1.6309 - classification_loss: 0.3678 139/500 [=======>......................] - ETA: 1:23 - loss: 1.9977 - regression_loss: 1.6302 - classification_loss: 0.3676 140/500 [=======>......................] - ETA: 1:23 - loss: 1.9946 - regression_loss: 1.6274 - classification_loss: 0.3672 141/500 [=======>......................] - ETA: 1:23 - loss: 1.9939 - regression_loss: 1.6273 - classification_loss: 0.3667 142/500 [=======>......................] - ETA: 1:23 - loss: 1.9880 - regression_loss: 1.6216 - classification_loss: 0.3665 143/500 [=======>......................] - ETA: 1:23 - loss: 1.9878 - regression_loss: 1.6218 - classification_loss: 0.3660 144/500 [=======>......................] - ETA: 1:22 - loss: 1.9875 - regression_loss: 1.6216 - classification_loss: 0.3659 145/500 [=======>......................] - ETA: 1:22 - loss: 1.9869 - regression_loss: 1.6214 - classification_loss: 0.3655 146/500 [=======>......................] - ETA: 1:22 - loss: 1.9874 - regression_loss: 1.6222 - classification_loss: 0.3652 147/500 [=======>......................] - ETA: 1:22 - loss: 1.9910 - regression_loss: 1.6251 - classification_loss: 0.3660 148/500 [=======>......................] - ETA: 1:22 - loss: 1.9915 - regression_loss: 1.6249 - classification_loss: 0.3666 149/500 [=======>......................] - ETA: 1:21 - loss: 1.9908 - regression_loss: 1.6243 - classification_loss: 0.3664 150/500 [========>.....................] - ETA: 1:21 - loss: 1.9879 - regression_loss: 1.6216 - classification_loss: 0.3663 151/500 [========>.....................] - ETA: 1:21 - loss: 1.9848 - regression_loss: 1.6194 - classification_loss: 0.3653 152/500 [========>.....................] - ETA: 1:21 - loss: 1.9868 - regression_loss: 1.6209 - classification_loss: 0.3658 153/500 [========>.....................] 
- ETA: 1:21 - loss: 1.9820 - regression_loss: 1.6171 - classification_loss: 0.3649 154/500 [========>.....................] - ETA: 1:20 - loss: 1.9803 - regression_loss: 1.6163 - classification_loss: 0.3640 155/500 [========>.....................] - ETA: 1:20 - loss: 1.9817 - regression_loss: 1.6178 - classification_loss: 0.3639 156/500 [========>.....................] - ETA: 1:20 - loss: 1.9818 - regression_loss: 1.6179 - classification_loss: 0.3639 157/500 [========>.....................] - ETA: 1:20 - loss: 1.9830 - regression_loss: 1.6186 - classification_loss: 0.3644 158/500 [========>.....................] - ETA: 1:20 - loss: 1.9818 - regression_loss: 1.6177 - classification_loss: 0.3640 159/500 [========>.....................] - ETA: 1:19 - loss: 1.9780 - regression_loss: 1.6149 - classification_loss: 0.3631 160/500 [========>.....................] - ETA: 1:19 - loss: 1.9796 - regression_loss: 1.6164 - classification_loss: 0.3631 161/500 [========>.....................] - ETA: 1:19 - loss: 1.9788 - regression_loss: 1.6161 - classification_loss: 0.3626 162/500 [========>.....................] - ETA: 1:19 - loss: 1.9799 - regression_loss: 1.6175 - classification_loss: 0.3624 163/500 [========>.....................] - ETA: 1:19 - loss: 1.9831 - regression_loss: 1.6197 - classification_loss: 0.3634 164/500 [========>.....................] - ETA: 1:18 - loss: 1.9824 - regression_loss: 1.6196 - classification_loss: 0.3628 165/500 [========>.....................] - ETA: 1:18 - loss: 1.9842 - regression_loss: 1.6211 - classification_loss: 0.3630 166/500 [========>.....................] - ETA: 1:18 - loss: 1.9816 - regression_loss: 1.6192 - classification_loss: 0.3624 167/500 [=========>....................] - ETA: 1:18 - loss: 1.9830 - regression_loss: 1.6204 - classification_loss: 0.3626 168/500 [=========>....................] - ETA: 1:18 - loss: 1.9821 - regression_loss: 1.6198 - classification_loss: 0.3622 169/500 [=========>....................] 
- ETA: 1:17 - loss: 1.9795 - regression_loss: 1.6179 - classification_loss: 0.3616 170/500 [=========>....................] - ETA: 1:17 - loss: 1.9776 - regression_loss: 1.6153 - classification_loss: 0.3623 171/500 [=========>....................] - ETA: 1:17 - loss: 1.9792 - regression_loss: 1.6167 - classification_loss: 0.3625 172/500 [=========>....................] - ETA: 1:17 - loss: 1.9761 - regression_loss: 1.6135 - classification_loss: 0.3626 173/500 [=========>....................] - ETA: 1:17 - loss: 1.9799 - regression_loss: 1.6173 - classification_loss: 0.3626 174/500 [=========>....................] - ETA: 1:16 - loss: 1.9849 - regression_loss: 1.6210 - classification_loss: 0.3640 175/500 [=========>....................] - ETA: 1:16 - loss: 1.9862 - regression_loss: 1.6220 - classification_loss: 0.3642 176/500 [=========>....................] - ETA: 1:16 - loss: 1.9875 - regression_loss: 1.6232 - classification_loss: 0.3643 177/500 [=========>....................] - ETA: 1:16 - loss: 1.9913 - regression_loss: 1.6266 - classification_loss: 0.3647 178/500 [=========>....................] - ETA: 1:16 - loss: 1.9903 - regression_loss: 1.6261 - classification_loss: 0.3642 179/500 [=========>....................] - ETA: 1:15 - loss: 1.9882 - regression_loss: 1.6245 - classification_loss: 0.3637 180/500 [=========>....................] - ETA: 1:15 - loss: 1.9879 - regression_loss: 1.6242 - classification_loss: 0.3637 181/500 [=========>....................] - ETA: 1:15 - loss: 1.9888 - regression_loss: 1.6250 - classification_loss: 0.3638 182/500 [=========>....................] - ETA: 1:15 - loss: 1.9877 - regression_loss: 1.6233 - classification_loss: 0.3643 183/500 [=========>....................] - ETA: 1:14 - loss: 1.9824 - regression_loss: 1.6193 - classification_loss: 0.3632 184/500 [==========>...................] - ETA: 1:14 - loss: 1.9845 - regression_loss: 1.6213 - classification_loss: 0.3632 185/500 [==========>...................] 
- ETA: 1:14 - loss: 1.9865 - regression_loss: 1.6231 - classification_loss: 0.3633 186/500 [==========>...................] - ETA: 1:14 - loss: 1.9862 - regression_loss: 1.6232 - classification_loss: 0.3630 187/500 [==========>...................] - ETA: 1:14 - loss: 1.9872 - regression_loss: 1.6242 - classification_loss: 0.3630 188/500 [==========>...................] - ETA: 1:13 - loss: 1.9875 - regression_loss: 1.6245 - classification_loss: 0.3630 189/500 [==========>...................] - ETA: 1:13 - loss: 1.9900 - regression_loss: 1.6266 - classification_loss: 0.3634 190/500 [==========>...................] - ETA: 1:13 - loss: 1.9927 - regression_loss: 1.6291 - classification_loss: 0.3636 191/500 [==========>...................] - ETA: 1:13 - loss: 1.9937 - regression_loss: 1.6300 - classification_loss: 0.3637 192/500 [==========>...................] - ETA: 1:13 - loss: 1.9942 - regression_loss: 1.6294 - classification_loss: 0.3648 193/500 [==========>...................] - ETA: 1:12 - loss: 1.9953 - regression_loss: 1.6304 - classification_loss: 0.3649 194/500 [==========>...................] - ETA: 1:12 - loss: 1.9949 - regression_loss: 1.6294 - classification_loss: 0.3655 195/500 [==========>...................] - ETA: 1:12 - loss: 2.0002 - regression_loss: 1.6332 - classification_loss: 0.3671 196/500 [==========>...................] - ETA: 1:12 - loss: 2.0013 - regression_loss: 1.6341 - classification_loss: 0.3672 197/500 [==========>...................] - ETA: 1:11 - loss: 2.0023 - regression_loss: 1.6349 - classification_loss: 0.3674 198/500 [==========>...................] - ETA: 1:11 - loss: 2.0033 - regression_loss: 1.6358 - classification_loss: 0.3675 199/500 [==========>...................] - ETA: 1:11 - loss: 2.0031 - regression_loss: 1.6359 - classification_loss: 0.3672 200/500 [===========>..................] - ETA: 1:11 - loss: 2.0025 - regression_loss: 1.6360 - classification_loss: 0.3665 201/500 [===========>..................] 
- ETA: 1:11 - loss: 2.0044 - regression_loss: 1.6377 - classification_loss: 0.3667 202/500 [===========>..................] - ETA: 1:10 - loss: 2.0042 - regression_loss: 1.6373 - classification_loss: 0.3668 203/500 [===========>..................] - ETA: 1:10 - loss: 2.0088 - regression_loss: 1.6412 - classification_loss: 0.3676 204/500 [===========>..................] - ETA: 1:10 - loss: 2.0113 - regression_loss: 1.6429 - classification_loss: 0.3684 205/500 [===========>..................] - ETA: 1:10 - loss: 2.0108 - regression_loss: 1.6423 - classification_loss: 0.3685 206/500 [===========>..................] - ETA: 1:09 - loss: 2.0147 - regression_loss: 1.6455 - classification_loss: 0.3692 207/500 [===========>..................] - ETA: 1:09 - loss: 2.0149 - regression_loss: 1.6459 - classification_loss: 0.3691 208/500 [===========>..................] - ETA: 1:09 - loss: 2.0168 - regression_loss: 1.6475 - classification_loss: 0.3694 209/500 [===========>..................] - ETA: 1:09 - loss: 2.0164 - regression_loss: 1.6470 - classification_loss: 0.3693 210/500 [===========>..................] - ETA: 1:09 - loss: 2.0124 - regression_loss: 1.6441 - classification_loss: 0.3683 211/500 [===========>..................] - ETA: 1:08 - loss: 2.0132 - regression_loss: 1.6447 - classification_loss: 0.3685 212/500 [===========>..................] - ETA: 1:08 - loss: 2.0140 - regression_loss: 1.6453 - classification_loss: 0.3687 213/500 [===========>..................] - ETA: 1:08 - loss: 2.0155 - regression_loss: 1.6466 - classification_loss: 0.3689 214/500 [===========>..................] - ETA: 1:08 - loss: 2.0147 - regression_loss: 1.6461 - classification_loss: 0.3686 215/500 [===========>..................] - ETA: 1:07 - loss: 2.0178 - regression_loss: 1.6476 - classification_loss: 0.3703 216/500 [===========>..................] - ETA: 1:07 - loss: 2.0215 - regression_loss: 1.6496 - classification_loss: 0.3719 217/500 [============>.................] 
- ETA: 1:07 - loss: 2.0205 - regression_loss: 1.6489 - classification_loss: 0.3716 218/500 [============>.................] - ETA: 1:07 - loss: 2.0164 - regression_loss: 1.6453 - classification_loss: 0.3711 219/500 [============>.................] - ETA: 1:07 - loss: 2.0168 - regression_loss: 1.6446 - classification_loss: 0.3723 220/500 [============>.................] - ETA: 1:06 - loss: 2.0184 - regression_loss: 1.6454 - classification_loss: 0.3730 221/500 [============>.................] - ETA: 1:06 - loss: 2.0187 - regression_loss: 1.6456 - classification_loss: 0.3731 222/500 [============>.................] - ETA: 1:06 - loss: 2.0202 - regression_loss: 1.6464 - classification_loss: 0.3738 223/500 [============>.................] - ETA: 1:06 - loss: 2.0133 - regression_loss: 1.6407 - classification_loss: 0.3726 224/500 [============>.................] - ETA: 1:05 - loss: 2.0135 - regression_loss: 1.6406 - classification_loss: 0.3729 225/500 [============>.................] - ETA: 1:05 - loss: 2.0145 - regression_loss: 1.6415 - classification_loss: 0.3730 226/500 [============>.................] - ETA: 1:05 - loss: 2.0141 - regression_loss: 1.6407 - classification_loss: 0.3734 227/500 [============>.................] - ETA: 1:05 - loss: 2.0129 - regression_loss: 1.6399 - classification_loss: 0.3729 228/500 [============>.................] - ETA: 1:05 - loss: 2.0133 - regression_loss: 1.6401 - classification_loss: 0.3731 229/500 [============>.................] - ETA: 1:04 - loss: 2.0147 - regression_loss: 1.6419 - classification_loss: 0.3728 230/500 [============>.................] - ETA: 1:04 - loss: 2.0116 - regression_loss: 1.6397 - classification_loss: 0.3720 231/500 [============>.................] - ETA: 1:04 - loss: 2.0106 - regression_loss: 1.6388 - classification_loss: 0.3718 232/500 [============>.................] - ETA: 1:04 - loss: 2.0099 - regression_loss: 1.6381 - classification_loss: 0.3718 233/500 [============>.................] 
- ETA: 1:03 - loss: 2.0125 - regression_loss: 1.6379 - classification_loss: 0.3746 234/500 [=============>................] - ETA: 1:03 - loss: 2.0137 - regression_loss: 1.6392 - classification_loss: 0.3745 235/500 [=============>................] - ETA: 1:03 - loss: 2.0139 - regression_loss: 1.6398 - classification_loss: 0.3741 236/500 [=============>................] - ETA: 1:03 - loss: 2.0117 - regression_loss: 1.6381 - classification_loss: 0.3736 237/500 [=============>................] - ETA: 1:03 - loss: 2.0107 - regression_loss: 1.6376 - classification_loss: 0.3731 238/500 [=============>................] - ETA: 1:02 - loss: 2.0130 - regression_loss: 1.6391 - classification_loss: 0.3739 239/500 [=============>................] - ETA: 1:02 - loss: 2.0132 - regression_loss: 1.6396 - classification_loss: 0.3736 240/500 [=============>................] - ETA: 1:02 - loss: 2.0119 - regression_loss: 1.6388 - classification_loss: 0.3731 241/500 [=============>................] - ETA: 1:02 - loss: 2.0114 - regression_loss: 1.6384 - classification_loss: 0.3730 242/500 [=============>................] - ETA: 1:01 - loss: 2.0093 - regression_loss: 1.6365 - classification_loss: 0.3728 243/500 [=============>................] - ETA: 1:01 - loss: 2.0089 - regression_loss: 1.6363 - classification_loss: 0.3725 244/500 [=============>................] - ETA: 1:01 - loss: 2.0093 - regression_loss: 1.6371 - classification_loss: 0.3722 245/500 [=============>................] - ETA: 1:01 - loss: 2.0092 - regression_loss: 1.6373 - classification_loss: 0.3719 246/500 [=============>................] - ETA: 1:00 - loss: 2.0081 - regression_loss: 1.6365 - classification_loss: 0.3715 247/500 [=============>................] - ETA: 1:00 - loss: 2.0072 - regression_loss: 1.6359 - classification_loss: 0.3713 248/500 [=============>................] - ETA: 1:00 - loss: 2.0082 - regression_loss: 1.6371 - classification_loss: 0.3710 249/500 [=============>................] 
[per-batch progress output for epoch 41, steps 249-499, elided]
500/500 [==============================] - 122s 245ms/step - loss: 2.0172 - regression_loss: 1.6492 - classification_loss: 0.3680
1172 instances of class plum with average precision: 0.5300
mAP: 0.5300
Epoch 00041: saving model to ./training/snapshots/resnet50_pascal_41.h5
Epoch 42/150
[per-batch progress output for epoch 42, steps 1-84, elided]
- ETA: 1:44 - loss: 1.9385 - regression_loss: 1.5876 - classification_loss: 0.3509 85/500 [====>.........................] - ETA: 1:44 - loss: 1.9406 - regression_loss: 1.5902 - classification_loss: 0.3504 86/500 [====>.........................] - ETA: 1:44 - loss: 1.9398 - regression_loss: 1.5882 - classification_loss: 0.3516 87/500 [====>.........................] - ETA: 1:43 - loss: 1.9463 - regression_loss: 1.5934 - classification_loss: 0.3529 88/500 [====>.........................] - ETA: 1:43 - loss: 1.9445 - regression_loss: 1.5923 - classification_loss: 0.3521 89/500 [====>.........................] - ETA: 1:43 - loss: 1.9393 - regression_loss: 1.5882 - classification_loss: 0.3511 90/500 [====>.........................] - ETA: 1:43 - loss: 1.9283 - regression_loss: 1.5796 - classification_loss: 0.3487 91/500 [====>.........................] - ETA: 1:43 - loss: 1.9238 - regression_loss: 1.5760 - classification_loss: 0.3478 92/500 [====>.........................] - ETA: 1:42 - loss: 1.9278 - regression_loss: 1.5785 - classification_loss: 0.3493 93/500 [====>.........................] - ETA: 1:42 - loss: 1.9311 - regression_loss: 1.5808 - classification_loss: 0.3503 94/500 [====>.........................] - ETA: 1:41 - loss: 1.9303 - regression_loss: 1.5795 - classification_loss: 0.3508 95/500 [====>.........................] - ETA: 1:41 - loss: 1.9614 - regression_loss: 1.5786 - classification_loss: 0.3828 96/500 [====>.........................] - ETA: 1:41 - loss: 1.9538 - regression_loss: 1.5730 - classification_loss: 0.3808 97/500 [====>.........................] - ETA: 1:40 - loss: 1.9537 - regression_loss: 1.5734 - classification_loss: 0.3802 98/500 [====>.........................] - ETA: 1:40 - loss: 1.9508 - regression_loss: 1.5716 - classification_loss: 0.3792 99/500 [====>.........................] - ETA: 1:40 - loss: 1.9501 - regression_loss: 1.5709 - classification_loss: 0.3792 100/500 [=====>........................] 
- ETA: 1:40 - loss: 1.9458 - regression_loss: 1.5672 - classification_loss: 0.3786 101/500 [=====>........................] - ETA: 1:39 - loss: 1.9497 - regression_loss: 1.5716 - classification_loss: 0.3781 102/500 [=====>........................] - ETA: 1:39 - loss: 1.9478 - regression_loss: 1.5708 - classification_loss: 0.3769 103/500 [=====>........................] - ETA: 1:39 - loss: 1.9411 - regression_loss: 1.5642 - classification_loss: 0.3769 104/500 [=====>........................] - ETA: 1:39 - loss: 1.9418 - regression_loss: 1.5654 - classification_loss: 0.3764 105/500 [=====>........................] - ETA: 1:38 - loss: 1.9372 - regression_loss: 1.5623 - classification_loss: 0.3749 106/500 [=====>........................] - ETA: 1:38 - loss: 1.9350 - regression_loss: 1.5604 - classification_loss: 0.3746 107/500 [=====>........................] - ETA: 1:38 - loss: 1.9394 - regression_loss: 1.5648 - classification_loss: 0.3745 108/500 [=====>........................] - ETA: 1:37 - loss: 1.9363 - regression_loss: 1.5632 - classification_loss: 0.3731 109/500 [=====>........................] - ETA: 1:37 - loss: 1.9428 - regression_loss: 1.5687 - classification_loss: 0.3741 110/500 [=====>........................] - ETA: 1:37 - loss: 1.9431 - regression_loss: 1.5692 - classification_loss: 0.3739 111/500 [=====>........................] - ETA: 1:37 - loss: 1.9420 - regression_loss: 1.5691 - classification_loss: 0.3729 112/500 [=====>........................] - ETA: 1:36 - loss: 1.9455 - regression_loss: 1.5723 - classification_loss: 0.3732 113/500 [=====>........................] - ETA: 1:36 - loss: 1.9517 - regression_loss: 1.5779 - classification_loss: 0.3737 114/500 [=====>........................] - ETA: 1:36 - loss: 1.9513 - regression_loss: 1.5779 - classification_loss: 0.3734 115/500 [=====>........................] - ETA: 1:36 - loss: 1.9535 - regression_loss: 1.5802 - classification_loss: 0.3733 116/500 [=====>........................] 
- ETA: 1:35 - loss: 1.9559 - regression_loss: 1.5821 - classification_loss: 0.3739 117/500 [======>.......................] - ETA: 1:35 - loss: 1.9596 - regression_loss: 1.5860 - classification_loss: 0.3736 118/500 [======>.......................] - ETA: 1:35 - loss: 1.9619 - regression_loss: 1.5884 - classification_loss: 0.3734 119/500 [======>.......................] - ETA: 1:35 - loss: 1.9631 - regression_loss: 1.5897 - classification_loss: 0.3734 120/500 [======>.......................] - ETA: 1:34 - loss: 1.9612 - regression_loss: 1.5880 - classification_loss: 0.3732 121/500 [======>.......................] - ETA: 1:34 - loss: 1.9614 - regression_loss: 1.5884 - classification_loss: 0.3730 122/500 [======>.......................] - ETA: 1:34 - loss: 1.9591 - regression_loss: 1.5871 - classification_loss: 0.3720 123/500 [======>.......................] - ETA: 1:34 - loss: 1.9604 - regression_loss: 1.5878 - classification_loss: 0.3725 124/500 [======>.......................] - ETA: 1:33 - loss: 1.9613 - regression_loss: 1.5886 - classification_loss: 0.3726 125/500 [======>.......................] - ETA: 1:33 - loss: 1.9576 - regression_loss: 1.5862 - classification_loss: 0.3715 126/500 [======>.......................] - ETA: 1:33 - loss: 1.9525 - regression_loss: 1.5815 - classification_loss: 0.3710 127/500 [======>.......................] - ETA: 1:33 - loss: 1.9509 - regression_loss: 1.5808 - classification_loss: 0.3700 128/500 [======>.......................] - ETA: 1:32 - loss: 1.9443 - regression_loss: 1.5754 - classification_loss: 0.3689 129/500 [======>.......................] - ETA: 1:32 - loss: 1.9391 - regression_loss: 1.5714 - classification_loss: 0.3677 130/500 [======>.......................] - ETA: 1:32 - loss: 1.9308 - regression_loss: 1.5650 - classification_loss: 0.3659 131/500 [======>.......................] - ETA: 1:32 - loss: 1.9266 - regression_loss: 1.5623 - classification_loss: 0.3643 132/500 [======>.......................] 
- ETA: 1:31 - loss: 1.9230 - regression_loss: 1.5591 - classification_loss: 0.3639 133/500 [======>.......................] - ETA: 1:31 - loss: 1.9227 - regression_loss: 1.5593 - classification_loss: 0.3634 134/500 [=======>......................] - ETA: 1:31 - loss: 1.9186 - regression_loss: 1.5562 - classification_loss: 0.3623 135/500 [=======>......................] - ETA: 1:31 - loss: 1.9182 - regression_loss: 1.5564 - classification_loss: 0.3619 136/500 [=======>......................] - ETA: 1:30 - loss: 1.9203 - regression_loss: 1.5579 - classification_loss: 0.3624 137/500 [=======>......................] - ETA: 1:30 - loss: 1.9142 - regression_loss: 1.5528 - classification_loss: 0.3614 138/500 [=======>......................] - ETA: 1:30 - loss: 1.9157 - regression_loss: 1.5546 - classification_loss: 0.3611 139/500 [=======>......................] - ETA: 1:30 - loss: 1.9151 - regression_loss: 1.5545 - classification_loss: 0.3606 140/500 [=======>......................] - ETA: 1:29 - loss: 1.9127 - regression_loss: 1.5531 - classification_loss: 0.3597 141/500 [=======>......................] - ETA: 1:29 - loss: 1.9144 - regression_loss: 1.5545 - classification_loss: 0.3599 142/500 [=======>......................] - ETA: 1:29 - loss: 1.9129 - regression_loss: 1.5537 - classification_loss: 0.3592 143/500 [=======>......................] - ETA: 1:29 - loss: 1.9134 - regression_loss: 1.5543 - classification_loss: 0.3591 144/500 [=======>......................] - ETA: 1:28 - loss: 1.9130 - regression_loss: 1.5537 - classification_loss: 0.3593 145/500 [=======>......................] - ETA: 1:28 - loss: 1.9087 - regression_loss: 1.5498 - classification_loss: 0.3589 146/500 [=======>......................] - ETA: 1:28 - loss: 1.9098 - regression_loss: 1.5501 - classification_loss: 0.3597 147/500 [=======>......................] - ETA: 1:28 - loss: 1.9037 - regression_loss: 1.5450 - classification_loss: 0.3587 148/500 [=======>......................] 
- ETA: 1:27 - loss: 1.9050 - regression_loss: 1.5467 - classification_loss: 0.3583 149/500 [=======>......................] - ETA: 1:27 - loss: 1.9048 - regression_loss: 1.5468 - classification_loss: 0.3580 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9022 - regression_loss: 1.5450 - classification_loss: 0.3571 151/500 [========>.....................] - ETA: 1:27 - loss: 1.9014 - regression_loss: 1.5449 - classification_loss: 0.3565 152/500 [========>.....................] - ETA: 1:26 - loss: 1.9106 - regression_loss: 1.5528 - classification_loss: 0.3578 153/500 [========>.....................] - ETA: 1:26 - loss: 1.9123 - regression_loss: 1.5545 - classification_loss: 0.3578 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9075 - regression_loss: 1.5509 - classification_loss: 0.3566 155/500 [========>.....................] - ETA: 1:26 - loss: 1.9054 - regression_loss: 1.5483 - classification_loss: 0.3571 156/500 [========>.....................] - ETA: 1:25 - loss: 1.9042 - regression_loss: 1.5478 - classification_loss: 0.3564 157/500 [========>.....................] - ETA: 1:25 - loss: 1.9039 - regression_loss: 1.5478 - classification_loss: 0.3560 158/500 [========>.....................] - ETA: 1:25 - loss: 1.9079 - regression_loss: 1.5513 - classification_loss: 0.3566 159/500 [========>.....................] - ETA: 1:25 - loss: 1.9065 - regression_loss: 1.5508 - classification_loss: 0.3557 160/500 [========>.....................] - ETA: 1:24 - loss: 1.9067 - regression_loss: 1.5511 - classification_loss: 0.3557 161/500 [========>.....................] - ETA: 1:24 - loss: 1.9019 - regression_loss: 1.5473 - classification_loss: 0.3546 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9027 - regression_loss: 1.5481 - classification_loss: 0.3546 163/500 [========>.....................] - ETA: 1:24 - loss: 1.9046 - regression_loss: 1.5499 - classification_loss: 0.3548 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.9073 - regression_loss: 1.5523 - classification_loss: 0.3550 165/500 [========>.....................] - ETA: 1:25 - loss: 1.9076 - regression_loss: 1.5529 - classification_loss: 0.3547 166/500 [========>.....................] - ETA: 1:24 - loss: 1.9107 - regression_loss: 1.5553 - classification_loss: 0.3554 167/500 [=========>....................] - ETA: 1:24 - loss: 1.9152 - regression_loss: 1.5592 - classification_loss: 0.3560 168/500 [=========>....................] - ETA: 1:24 - loss: 1.9177 - regression_loss: 1.5613 - classification_loss: 0.3563 169/500 [=========>....................] - ETA: 1:24 - loss: 1.9168 - regression_loss: 1.5606 - classification_loss: 0.3563 170/500 [=========>....................] - ETA: 1:23 - loss: 1.9197 - regression_loss: 1.5632 - classification_loss: 0.3565 171/500 [=========>....................] - ETA: 1:23 - loss: 1.9245 - regression_loss: 1.5668 - classification_loss: 0.3577 172/500 [=========>....................] - ETA: 1:23 - loss: 1.9270 - regression_loss: 1.5695 - classification_loss: 0.3575 173/500 [=========>....................] - ETA: 1:23 - loss: 1.9263 - regression_loss: 1.5691 - classification_loss: 0.3572 174/500 [=========>....................] - ETA: 1:22 - loss: 1.9283 - regression_loss: 1.5708 - classification_loss: 0.3576 175/500 [=========>....................] - ETA: 1:22 - loss: 1.9289 - regression_loss: 1.5715 - classification_loss: 0.3574 176/500 [=========>....................] - ETA: 1:22 - loss: 1.9292 - regression_loss: 1.5714 - classification_loss: 0.3579 177/500 [=========>....................] - ETA: 1:21 - loss: 1.9294 - regression_loss: 1.5718 - classification_loss: 0.3576 178/500 [=========>....................] - ETA: 1:21 - loss: 1.9296 - regression_loss: 1.5722 - classification_loss: 0.3574 179/500 [=========>....................] - ETA: 1:21 - loss: 1.9298 - regression_loss: 1.5725 - classification_loss: 0.3573 180/500 [=========>....................] 
- ETA: 1:21 - loss: 1.9305 - regression_loss: 1.5734 - classification_loss: 0.3572 181/500 [=========>....................] - ETA: 1:20 - loss: 1.9291 - regression_loss: 1.5723 - classification_loss: 0.3568 182/500 [=========>....................] - ETA: 1:20 - loss: 1.9309 - regression_loss: 1.5740 - classification_loss: 0.3569 183/500 [=========>....................] - ETA: 1:20 - loss: 1.9300 - regression_loss: 1.5733 - classification_loss: 0.3567 184/500 [==========>...................] - ETA: 1:20 - loss: 1.9286 - regression_loss: 1.5718 - classification_loss: 0.3567 185/500 [==========>...................] - ETA: 1:19 - loss: 1.9285 - regression_loss: 1.5719 - classification_loss: 0.3566 186/500 [==========>...................] - ETA: 1:19 - loss: 1.9289 - regression_loss: 1.5726 - classification_loss: 0.3563 187/500 [==========>...................] - ETA: 1:19 - loss: 1.9302 - regression_loss: 1.5739 - classification_loss: 0.3563 188/500 [==========>...................] - ETA: 1:19 - loss: 1.9313 - regression_loss: 1.5747 - classification_loss: 0.3566 189/500 [==========>...................] - ETA: 1:18 - loss: 1.9309 - regression_loss: 1.5748 - classification_loss: 0.3561 190/500 [==========>...................] - ETA: 1:18 - loss: 1.9300 - regression_loss: 1.5742 - classification_loss: 0.3559 191/500 [==========>...................] - ETA: 1:18 - loss: 1.9307 - regression_loss: 1.5751 - classification_loss: 0.3556 192/500 [==========>...................] - ETA: 1:18 - loss: 1.9340 - regression_loss: 1.5785 - classification_loss: 0.3555 193/500 [==========>...................] - ETA: 1:17 - loss: 1.9349 - regression_loss: 1.5794 - classification_loss: 0.3555 194/500 [==========>...................] - ETA: 1:17 - loss: 1.9351 - regression_loss: 1.5792 - classification_loss: 0.3558 195/500 [==========>...................] - ETA: 1:17 - loss: 1.9287 - regression_loss: 1.5732 - classification_loss: 0.3554 196/500 [==========>...................] 
- ETA: 1:17 - loss: 1.9269 - regression_loss: 1.5719 - classification_loss: 0.3550 197/500 [==========>...................] - ETA: 1:16 - loss: 1.9266 - regression_loss: 1.5717 - classification_loss: 0.3549 198/500 [==========>...................] - ETA: 1:16 - loss: 1.9234 - regression_loss: 1.5693 - classification_loss: 0.3540 199/500 [==========>...................] - ETA: 1:16 - loss: 1.9228 - regression_loss: 1.5695 - classification_loss: 0.3534 200/500 [===========>..................] - ETA: 1:15 - loss: 1.9240 - regression_loss: 1.5699 - classification_loss: 0.3541 201/500 [===========>..................] - ETA: 1:15 - loss: 1.9210 - regression_loss: 1.5673 - classification_loss: 0.3537 202/500 [===========>..................] - ETA: 1:15 - loss: 1.9200 - regression_loss: 1.5667 - classification_loss: 0.3533 203/500 [===========>..................] - ETA: 1:15 - loss: 1.9222 - regression_loss: 1.5689 - classification_loss: 0.3533 204/500 [===========>..................] - ETA: 1:15 - loss: 1.9235 - regression_loss: 1.5699 - classification_loss: 0.3536 205/500 [===========>..................] - ETA: 1:14 - loss: 1.9293 - regression_loss: 1.5749 - classification_loss: 0.3544 206/500 [===========>..................] - ETA: 1:14 - loss: 1.9310 - regression_loss: 1.5763 - classification_loss: 0.3548 207/500 [===========>..................] - ETA: 1:14 - loss: 1.9336 - regression_loss: 1.5782 - classification_loss: 0.3554 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9327 - regression_loss: 1.5775 - classification_loss: 0.3552 209/500 [===========>..................] - ETA: 1:13 - loss: 1.9356 - regression_loss: 1.5801 - classification_loss: 0.3555 210/500 [===========>..................] - ETA: 1:13 - loss: 1.9376 - regression_loss: 1.5815 - classification_loss: 0.3562 211/500 [===========>..................] - ETA: 1:13 - loss: 1.9380 - regression_loss: 1.5819 - classification_loss: 0.3562 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.9386 - regression_loss: 1.5827 - classification_loss: 0.3559 213/500 [===========>..................] - ETA: 1:12 - loss: 1.9410 - regression_loss: 1.5852 - classification_loss: 0.3559 214/500 [===========>..................] - ETA: 1:12 - loss: 1.9391 - regression_loss: 1.5834 - classification_loss: 0.3557 215/500 [===========>..................] - ETA: 1:12 - loss: 1.9353 - regression_loss: 1.5804 - classification_loss: 0.3549 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9341 - regression_loss: 1.5790 - classification_loss: 0.3550 217/500 [============>.................] - ETA: 1:11 - loss: 1.9350 - regression_loss: 1.5799 - classification_loss: 0.3551 218/500 [============>.................] - ETA: 1:11 - loss: 1.9357 - regression_loss: 1.5803 - classification_loss: 0.3554 219/500 [============>.................] - ETA: 1:11 - loss: 1.9351 - regression_loss: 1.5799 - classification_loss: 0.3552 220/500 [============>.................] - ETA: 1:10 - loss: 1.9358 - regression_loss: 1.5807 - classification_loss: 0.3551 221/500 [============>.................] - ETA: 1:10 - loss: 1.9360 - regression_loss: 1.5811 - classification_loss: 0.3550 222/500 [============>.................] - ETA: 1:10 - loss: 1.9390 - regression_loss: 1.5833 - classification_loss: 0.3556 223/500 [============>.................] - ETA: 1:10 - loss: 1.9399 - regression_loss: 1.5841 - classification_loss: 0.3558 224/500 [============>.................] - ETA: 1:09 - loss: 1.9400 - regression_loss: 1.5843 - classification_loss: 0.3556 225/500 [============>.................] - ETA: 1:09 - loss: 1.9400 - regression_loss: 1.5846 - classification_loss: 0.3553 226/500 [============>.................] - ETA: 1:09 - loss: 1.9452 - regression_loss: 1.5887 - classification_loss: 0.3565 227/500 [============>.................] - ETA: 1:09 - loss: 1.9448 - regression_loss: 1.5887 - classification_loss: 0.3561 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.9454 - regression_loss: 1.5890 - classification_loss: 0.3564 229/500 [============>.................] - ETA: 1:08 - loss: 1.9475 - regression_loss: 1.5898 - classification_loss: 0.3577 230/500 [============>.................] - ETA: 1:08 - loss: 1.9475 - regression_loss: 1.5901 - classification_loss: 0.3574 231/500 [============>.................] - ETA: 1:08 - loss: 1.9467 - regression_loss: 1.5895 - classification_loss: 0.3572 232/500 [============>.................] - ETA: 1:07 - loss: 1.9474 - regression_loss: 1.5901 - classification_loss: 0.3573 233/500 [============>.................] - ETA: 1:07 - loss: 1.9495 - regression_loss: 1.5915 - classification_loss: 0.3580 234/500 [=============>................] - ETA: 1:07 - loss: 1.9510 - regression_loss: 1.5929 - classification_loss: 0.3581 235/500 [=============>................] - ETA: 1:07 - loss: 1.9507 - regression_loss: 1.5930 - classification_loss: 0.3577 236/500 [=============>................] - ETA: 1:06 - loss: 1.9517 - regression_loss: 1.5944 - classification_loss: 0.3573 237/500 [=============>................] - ETA: 1:06 - loss: 1.9512 - regression_loss: 1.5939 - classification_loss: 0.3572 238/500 [=============>................] - ETA: 1:06 - loss: 1.9500 - regression_loss: 1.5932 - classification_loss: 0.3568 239/500 [=============>................] - ETA: 1:06 - loss: 1.9534 - regression_loss: 1.5955 - classification_loss: 0.3580 240/500 [=============>................] - ETA: 1:05 - loss: 1.9531 - regression_loss: 1.5950 - classification_loss: 0.3581 241/500 [=============>................] - ETA: 1:05 - loss: 1.9536 - regression_loss: 1.5953 - classification_loss: 0.3582 242/500 [=============>................] - ETA: 1:05 - loss: 1.9539 - regression_loss: 1.5956 - classification_loss: 0.3584 243/500 [=============>................] - ETA: 1:05 - loss: 1.9539 - regression_loss: 1.5956 - classification_loss: 0.3583 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.9543 - regression_loss: 1.5956 - classification_loss: 0.3587 245/500 [=============>................] - ETA: 1:04 - loss: 1.9548 - regression_loss: 1.5957 - classification_loss: 0.3591 246/500 [=============>................] - ETA: 1:04 - loss: 1.9562 - regression_loss: 1.5972 - classification_loss: 0.3590 247/500 [=============>................] - ETA: 1:04 - loss: 1.9577 - regression_loss: 1.5985 - classification_loss: 0.3592 248/500 [=============>................] - ETA: 1:03 - loss: 1.9591 - regression_loss: 1.5996 - classification_loss: 0.3595 249/500 [=============>................] - ETA: 1:03 - loss: 1.9620 - regression_loss: 1.6019 - classification_loss: 0.3601 250/500 [==============>...............] - ETA: 1:03 - loss: 1.9627 - regression_loss: 1.6026 - classification_loss: 0.3601 251/500 [==============>...............] - ETA: 1:03 - loss: 1.9611 - regression_loss: 1.6003 - classification_loss: 0.3607 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9600 - regression_loss: 1.5996 - classification_loss: 0.3604 253/500 [==============>...............] - ETA: 1:02 - loss: 1.9609 - regression_loss: 1.6001 - classification_loss: 0.3608 254/500 [==============>...............] - ETA: 1:02 - loss: 1.9615 - regression_loss: 1.6008 - classification_loss: 0.3608 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9649 - regression_loss: 1.6035 - classification_loss: 0.3615 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9646 - regression_loss: 1.6028 - classification_loss: 0.3618 257/500 [==============>...............] - ETA: 1:01 - loss: 1.9660 - regression_loss: 1.6041 - classification_loss: 0.3619 258/500 [==============>...............] - ETA: 1:01 - loss: 1.9655 - regression_loss: 1.6036 - classification_loss: 0.3620 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9673 - regression_loss: 1.6050 - classification_loss: 0.3623 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.9680 - regression_loss: 1.6057 - classification_loss: 0.3622 261/500 [==============>...............] - ETA: 1:00 - loss: 1.9635 - regression_loss: 1.6023 - classification_loss: 0.3612 262/500 [==============>...............] - ETA: 1:00 - loss: 1.9639 - regression_loss: 1.6029 - classification_loss: 0.3610 263/500 [==============>...............] - ETA: 59s - loss: 1.9606 - regression_loss: 1.6003 - classification_loss: 0.3604  264/500 [==============>...............] - ETA: 59s - loss: 1.9593 - regression_loss: 1.5992 - classification_loss: 0.3602 265/500 [==============>...............] - ETA: 59s - loss: 1.9579 - regression_loss: 1.5977 - classification_loss: 0.3602 266/500 [==============>...............] - ETA: 59s - loss: 1.9564 - regression_loss: 1.5962 - classification_loss: 0.3602 267/500 [===============>..............] - ETA: 58s - loss: 1.9572 - regression_loss: 1.5972 - classification_loss: 0.3601 268/500 [===============>..............] - ETA: 58s - loss: 1.9568 - regression_loss: 1.5967 - classification_loss: 0.3601 269/500 [===============>..............] - ETA: 58s - loss: 1.9577 - regression_loss: 1.5975 - classification_loss: 0.3602 270/500 [===============>..............] - ETA: 58s - loss: 1.9586 - regression_loss: 1.5981 - classification_loss: 0.3605 271/500 [===============>..............] - ETA: 57s - loss: 1.9580 - regression_loss: 1.5980 - classification_loss: 0.3600 272/500 [===============>..............] - ETA: 57s - loss: 1.9597 - regression_loss: 1.5991 - classification_loss: 0.3606 273/500 [===============>..............] - ETA: 57s - loss: 1.9595 - regression_loss: 1.5990 - classification_loss: 0.3605 274/500 [===============>..............] - ETA: 56s - loss: 1.9588 - regression_loss: 1.5985 - classification_loss: 0.3603 275/500 [===============>..............] - ETA: 56s - loss: 1.9610 - regression_loss: 1.6003 - classification_loss: 0.3607 276/500 [===============>..............] 
- ETA: 56s - loss: 1.9576 - regression_loss: 1.5970 - classification_loss: 0.3606 277/500 [===============>..............] - ETA: 56s - loss: 1.9569 - regression_loss: 1.5966 - classification_loss: 0.3604 278/500 [===============>..............] - ETA: 55s - loss: 1.9593 - regression_loss: 1.5984 - classification_loss: 0.3609 279/500 [===============>..............] - ETA: 55s - loss: 1.9592 - regression_loss: 1.5983 - classification_loss: 0.3609 280/500 [===============>..............] - ETA: 55s - loss: 1.9592 - regression_loss: 1.5980 - classification_loss: 0.3612 281/500 [===============>..............] - ETA: 55s - loss: 1.9597 - regression_loss: 1.5985 - classification_loss: 0.3612 282/500 [===============>..............] - ETA: 54s - loss: 1.9591 - regression_loss: 1.5979 - classification_loss: 0.3612 283/500 [===============>..............] - ETA: 54s - loss: 1.9599 - regression_loss: 1.5986 - classification_loss: 0.3613 284/500 [================>.............] - ETA: 54s - loss: 1.9605 - regression_loss: 1.5991 - classification_loss: 0.3614 285/500 [================>.............] - ETA: 54s - loss: 1.9625 - regression_loss: 1.6010 - classification_loss: 0.3616 286/500 [================>.............] - ETA: 53s - loss: 1.9600 - regression_loss: 1.5990 - classification_loss: 0.3610 287/500 [================>.............] - ETA: 53s - loss: 1.9593 - regression_loss: 1.5978 - classification_loss: 0.3615 288/500 [================>.............] - ETA: 53s - loss: 1.9569 - regression_loss: 1.5958 - classification_loss: 0.3612 289/500 [================>.............] - ETA: 53s - loss: 1.9542 - regression_loss: 1.5936 - classification_loss: 0.3606 290/500 [================>.............] - ETA: 52s - loss: 1.9534 - regression_loss: 1.5932 - classification_loss: 0.3602 291/500 [================>.............] - ETA: 52s - loss: 1.9546 - regression_loss: 1.5944 - classification_loss: 0.3602 292/500 [================>.............] 
[per-batch progress updates for steps 293-499 of epoch 42 elided; running loss held in the 1.94-1.99 range, regression_loss ~1.59-1.62, classification_loss ~0.36]
500/500 [==============================] - 126s 251ms/step - loss: 1.9866 - regression_loss: 1.6211 - classification_loss: 0.3655
1172 instances of class plum with average precision: 0.5368
mAP: 0.5368
Epoch 00042: saving model to ./training/snapshots/resnet50_pascal_42.h5
Epoch 43/150
[per-batch progress updates for steps 1-13 of epoch 43 elided; loss rose from 1.2963 to ~1.98 as the running average stabilized]
[per-batch progress updates for steps 14-126 of epoch 43 elided; running loss fluctuated in the 1.86-1.98 range, regression_loss ~1.53-1.62, classification_loss ~0.33-0.35]
- ETA: 1:33 - loss: 1.9401 - regression_loss: 1.5958 - classification_loss: 0.3443 127/500 [======>.......................] - ETA: 1:33 - loss: 1.9351 - regression_loss: 1.5920 - classification_loss: 0.3430 128/500 [======>.......................] - ETA: 1:33 - loss: 1.9334 - regression_loss: 1.5908 - classification_loss: 0.3426 129/500 [======>.......................] - ETA: 1:33 - loss: 1.9277 - regression_loss: 1.5862 - classification_loss: 0.3415 130/500 [======>.......................] - ETA: 1:32 - loss: 1.9236 - regression_loss: 1.5837 - classification_loss: 0.3400 131/500 [======>.......................] - ETA: 1:32 - loss: 1.9175 - regression_loss: 1.5785 - classification_loss: 0.3389 132/500 [======>.......................] - ETA: 1:32 - loss: 1.9173 - regression_loss: 1.5784 - classification_loss: 0.3389 133/500 [======>.......................] - ETA: 1:32 - loss: 1.9143 - regression_loss: 1.5762 - classification_loss: 0.3381 134/500 [=======>......................] - ETA: 1:31 - loss: 1.9138 - regression_loss: 1.5757 - classification_loss: 0.3381 135/500 [=======>......................] - ETA: 1:31 - loss: 1.9193 - regression_loss: 1.5752 - classification_loss: 0.3441 136/500 [=======>......................] - ETA: 1:31 - loss: 1.9215 - regression_loss: 1.5774 - classification_loss: 0.3441 137/500 [=======>......................] - ETA: 1:31 - loss: 1.9206 - regression_loss: 1.5769 - classification_loss: 0.3437 138/500 [=======>......................] - ETA: 1:30 - loss: 1.9212 - regression_loss: 1.5777 - classification_loss: 0.3435 139/500 [=======>......................] - ETA: 1:30 - loss: 1.9212 - regression_loss: 1.5778 - classification_loss: 0.3434 140/500 [=======>......................] - ETA: 1:30 - loss: 1.9214 - regression_loss: 1.5771 - classification_loss: 0.3442 141/500 [=======>......................] - ETA: 1:30 - loss: 1.9255 - regression_loss: 1.5801 - classification_loss: 0.3454 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.9225 - regression_loss: 1.5770 - classification_loss: 0.3455 143/500 [=======>......................] - ETA: 1:29 - loss: 1.9246 - regression_loss: 1.5796 - classification_loss: 0.3449 144/500 [=======>......................] - ETA: 1:29 - loss: 1.9235 - regression_loss: 1.5788 - classification_loss: 0.3446 145/500 [=======>......................] - ETA: 1:29 - loss: 1.9229 - regression_loss: 1.5787 - classification_loss: 0.3443 146/500 [=======>......................] - ETA: 1:28 - loss: 1.9245 - regression_loss: 1.5794 - classification_loss: 0.3452 147/500 [=======>......................] - ETA: 1:28 - loss: 1.9250 - regression_loss: 1.5799 - classification_loss: 0.3451 148/500 [=======>......................] - ETA: 1:28 - loss: 1.9221 - regression_loss: 1.5776 - classification_loss: 0.3445 149/500 [=======>......................] - ETA: 1:28 - loss: 1.9253 - regression_loss: 1.5803 - classification_loss: 0.3449 150/500 [========>.....................] - ETA: 1:27 - loss: 1.9245 - regression_loss: 1.5797 - classification_loss: 0.3448 151/500 [========>.....................] - ETA: 1:27 - loss: 1.9276 - regression_loss: 1.5819 - classification_loss: 0.3457 152/500 [========>.....................] - ETA: 1:27 - loss: 1.9276 - regression_loss: 1.5820 - classification_loss: 0.3456 153/500 [========>.....................] - ETA: 1:27 - loss: 1.9270 - regression_loss: 1.5816 - classification_loss: 0.3454 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9267 - regression_loss: 1.5810 - classification_loss: 0.3456 155/500 [========>.....................] - ETA: 1:26 - loss: 1.9291 - regression_loss: 1.5834 - classification_loss: 0.3458 156/500 [========>.....................] - ETA: 1:26 - loss: 1.9311 - regression_loss: 1.5847 - classification_loss: 0.3464 157/500 [========>.....................] - ETA: 1:26 - loss: 1.9304 - regression_loss: 1.5838 - classification_loss: 0.3465 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.9330 - regression_loss: 1.5859 - classification_loss: 0.3471 159/500 [========>.....................] - ETA: 1:25 - loss: 1.9329 - regression_loss: 1.5860 - classification_loss: 0.3469 160/500 [========>.....................] - ETA: 1:25 - loss: 1.9335 - regression_loss: 1.5867 - classification_loss: 0.3468 161/500 [========>.....................] - ETA: 1:25 - loss: 1.9305 - regression_loss: 1.5843 - classification_loss: 0.3463 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9305 - regression_loss: 1.5845 - classification_loss: 0.3460 163/500 [========>.....................] - ETA: 1:24 - loss: 1.9332 - regression_loss: 1.5867 - classification_loss: 0.3465 164/500 [========>.....................] - ETA: 1:24 - loss: 1.9337 - regression_loss: 1.5872 - classification_loss: 0.3465 165/500 [========>.....................] - ETA: 1:24 - loss: 1.9337 - regression_loss: 1.5878 - classification_loss: 0.3459 166/500 [========>.....................] - ETA: 1:23 - loss: 1.9368 - regression_loss: 1.5905 - classification_loss: 0.3462 167/500 [=========>....................] - ETA: 1:23 - loss: 1.9426 - regression_loss: 1.5942 - classification_loss: 0.3484 168/500 [=========>....................] - ETA: 1:23 - loss: 1.9456 - regression_loss: 1.5962 - classification_loss: 0.3494 169/500 [=========>....................] - ETA: 1:23 - loss: 1.9393 - regression_loss: 1.5910 - classification_loss: 0.3483 170/500 [=========>....................] - ETA: 1:23 - loss: 1.9436 - regression_loss: 1.5938 - classification_loss: 0.3498 171/500 [=========>....................] - ETA: 1:22 - loss: 1.9459 - regression_loss: 1.5959 - classification_loss: 0.3500 172/500 [=========>....................] - ETA: 1:22 - loss: 1.9420 - regression_loss: 1.5933 - classification_loss: 0.3487 173/500 [=========>....................] - ETA: 1:22 - loss: 1.9365 - regression_loss: 1.5890 - classification_loss: 0.3475 174/500 [=========>....................] 
- ETA: 1:22 - loss: 1.9417 - regression_loss: 1.5932 - classification_loss: 0.3485 175/500 [=========>....................] - ETA: 1:21 - loss: 1.9377 - regression_loss: 1.5904 - classification_loss: 0.3472 176/500 [=========>....................] - ETA: 1:21 - loss: 1.9442 - regression_loss: 1.5952 - classification_loss: 0.3490 177/500 [=========>....................] - ETA: 1:21 - loss: 1.9447 - regression_loss: 1.5956 - classification_loss: 0.3490 178/500 [=========>....................] - ETA: 1:21 - loss: 1.9464 - regression_loss: 1.5973 - classification_loss: 0.3491 179/500 [=========>....................] - ETA: 1:20 - loss: 1.9440 - regression_loss: 1.5956 - classification_loss: 0.3484 180/500 [=========>....................] - ETA: 1:20 - loss: 1.9432 - regression_loss: 1.5950 - classification_loss: 0.3482 181/500 [=========>....................] - ETA: 1:20 - loss: 1.9475 - regression_loss: 1.5984 - classification_loss: 0.3491 182/500 [=========>....................] - ETA: 1:20 - loss: 1.9505 - regression_loss: 1.6009 - classification_loss: 0.3496 183/500 [=========>....................] - ETA: 1:19 - loss: 1.9723 - regression_loss: 1.5991 - classification_loss: 0.3732 184/500 [==========>...................] - ETA: 1:19 - loss: 1.9770 - regression_loss: 1.6029 - classification_loss: 0.3742 185/500 [==========>...................] - ETA: 1:19 - loss: 1.9757 - regression_loss: 1.6021 - classification_loss: 0.3737 186/500 [==========>...................] - ETA: 1:19 - loss: 1.9759 - regression_loss: 1.6020 - classification_loss: 0.3738 187/500 [==========>...................] - ETA: 1:18 - loss: 1.9778 - regression_loss: 1.6039 - classification_loss: 0.3739 188/500 [==========>...................] - ETA: 1:18 - loss: 1.9801 - regression_loss: 1.6056 - classification_loss: 0.3745 189/500 [==========>...................] - ETA: 1:18 - loss: 1.9790 - regression_loss: 1.6048 - classification_loss: 0.3741 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.9822 - regression_loss: 1.6074 - classification_loss: 0.3747 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9829 - regression_loss: 1.6082 - classification_loss: 0.3747 192/500 [==========>...................] - ETA: 1:17 - loss: 1.9877 - regression_loss: 1.6120 - classification_loss: 0.3757 193/500 [==========>...................] - ETA: 1:17 - loss: 1.9862 - regression_loss: 1.6108 - classification_loss: 0.3754 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9848 - regression_loss: 1.6100 - classification_loss: 0.3748 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9845 - regression_loss: 1.6097 - classification_loss: 0.3747 196/500 [==========>...................] - ETA: 1:16 - loss: 1.9841 - regression_loss: 1.6097 - classification_loss: 0.3744 197/500 [==========>...................] - ETA: 1:16 - loss: 1.9819 - regression_loss: 1.6081 - classification_loss: 0.3738 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9804 - regression_loss: 1.6070 - classification_loss: 0.3734 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9798 - regression_loss: 1.6066 - classification_loss: 0.3731 200/500 [===========>..................] - ETA: 1:15 - loss: 1.9768 - regression_loss: 1.6042 - classification_loss: 0.3726 201/500 [===========>..................] - ETA: 1:15 - loss: 1.9832 - regression_loss: 1.6070 - classification_loss: 0.3762 202/500 [===========>..................] - ETA: 1:15 - loss: 1.9836 - regression_loss: 1.6077 - classification_loss: 0.3759 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9796 - regression_loss: 1.6044 - classification_loss: 0.3752 204/500 [===========>..................] - ETA: 1:14 - loss: 1.9801 - regression_loss: 1.6049 - classification_loss: 0.3752 205/500 [===========>..................] - ETA: 1:14 - loss: 1.9807 - regression_loss: 1.6053 - classification_loss: 0.3754 206/500 [===========>..................] 
- ETA: 1:14 - loss: 1.9827 - regression_loss: 1.6068 - classification_loss: 0.3759 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9836 - regression_loss: 1.6076 - classification_loss: 0.3760 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9819 - regression_loss: 1.6065 - classification_loss: 0.3754 209/500 [===========>..................] - ETA: 1:13 - loss: 1.9841 - regression_loss: 1.6081 - classification_loss: 0.3760 210/500 [===========>..................] - ETA: 1:13 - loss: 1.9840 - regression_loss: 1.6081 - classification_loss: 0.3759 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9851 - regression_loss: 1.6095 - classification_loss: 0.3756 212/500 [===========>..................] - ETA: 1:12 - loss: 1.9843 - regression_loss: 1.6091 - classification_loss: 0.3752 213/500 [===========>..................] - ETA: 1:12 - loss: 1.9834 - regression_loss: 1.6087 - classification_loss: 0.3747 214/500 [===========>..................] - ETA: 1:12 - loss: 1.9832 - regression_loss: 1.6083 - classification_loss: 0.3749 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9829 - regression_loss: 1.6078 - classification_loss: 0.3750 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9810 - regression_loss: 1.6065 - classification_loss: 0.3745 217/500 [============>.................] - ETA: 1:11 - loss: 1.9812 - regression_loss: 1.6068 - classification_loss: 0.3744 218/500 [============>.................] - ETA: 1:11 - loss: 1.9821 - regression_loss: 1.6079 - classification_loss: 0.3742 219/500 [============>.................] - ETA: 1:10 - loss: 1.9845 - regression_loss: 1.6093 - classification_loss: 0.3753 220/500 [============>.................] - ETA: 1:10 - loss: 1.9847 - regression_loss: 1.6091 - classification_loss: 0.3756 221/500 [============>.................] - ETA: 1:10 - loss: 1.9855 - regression_loss: 1.6099 - classification_loss: 0.3756 222/500 [============>.................] 
- ETA: 1:10 - loss: 1.9893 - regression_loss: 1.6120 - classification_loss: 0.3774 223/500 [============>.................] - ETA: 1:09 - loss: 1.9853 - regression_loss: 1.6087 - classification_loss: 0.3766 224/500 [============>.................] - ETA: 1:09 - loss: 1.9845 - regression_loss: 1.6085 - classification_loss: 0.3760 225/500 [============>.................] - ETA: 1:09 - loss: 1.9856 - regression_loss: 1.6095 - classification_loss: 0.3761 226/500 [============>.................] - ETA: 1:09 - loss: 1.9828 - regression_loss: 1.6074 - classification_loss: 0.3754 227/500 [============>.................] - ETA: 1:08 - loss: 1.9833 - regression_loss: 1.6081 - classification_loss: 0.3752 228/500 [============>.................] - ETA: 1:08 - loss: 1.9781 - regression_loss: 1.6039 - classification_loss: 0.3742 229/500 [============>.................] - ETA: 1:08 - loss: 1.9788 - regression_loss: 1.6046 - classification_loss: 0.3741 230/500 [============>.................] - ETA: 1:08 - loss: 1.9787 - regression_loss: 1.6050 - classification_loss: 0.3737 231/500 [============>.................] - ETA: 1:07 - loss: 1.9764 - regression_loss: 1.6030 - classification_loss: 0.3735 232/500 [============>.................] - ETA: 1:07 - loss: 1.9732 - regression_loss: 1.6007 - classification_loss: 0.3725 233/500 [============>.................] - ETA: 1:07 - loss: 1.9727 - regression_loss: 1.6008 - classification_loss: 0.3719 234/500 [=============>................] - ETA: 1:07 - loss: 1.9714 - regression_loss: 1.5997 - classification_loss: 0.3717 235/500 [=============>................] - ETA: 1:06 - loss: 1.9730 - regression_loss: 1.6008 - classification_loss: 0.3722 236/500 [=============>................] - ETA: 1:06 - loss: 1.9724 - regression_loss: 1.6007 - classification_loss: 0.3717 237/500 [=============>................] - ETA: 1:06 - loss: 1.9730 - regression_loss: 1.6012 - classification_loss: 0.3718 238/500 [=============>................] 
- ETA: 1:06 - loss: 1.9744 - regression_loss: 1.6022 - classification_loss: 0.3722 239/500 [=============>................] - ETA: 1:05 - loss: 1.9755 - regression_loss: 1.6031 - classification_loss: 0.3723 240/500 [=============>................] - ETA: 1:05 - loss: 1.9733 - regression_loss: 1.6014 - classification_loss: 0.3718 241/500 [=============>................] - ETA: 1:05 - loss: 1.9741 - regression_loss: 1.6022 - classification_loss: 0.3719 242/500 [=============>................] - ETA: 1:05 - loss: 1.9722 - regression_loss: 1.6007 - classification_loss: 0.3714 243/500 [=============>................] - ETA: 1:04 - loss: 1.9761 - regression_loss: 1.6044 - classification_loss: 0.3717 244/500 [=============>................] - ETA: 1:04 - loss: 1.9728 - regression_loss: 1.6019 - classification_loss: 0.3708 245/500 [=============>................] - ETA: 1:04 - loss: 1.9714 - regression_loss: 1.6007 - classification_loss: 0.3707 246/500 [=============>................] - ETA: 1:03 - loss: 1.9698 - regression_loss: 1.5982 - classification_loss: 0.3716 247/500 [=============>................] - ETA: 1:03 - loss: 1.9707 - regression_loss: 1.5978 - classification_loss: 0.3729 248/500 [=============>................] - ETA: 1:03 - loss: 1.9721 - regression_loss: 1.5992 - classification_loss: 0.3729 249/500 [=============>................] - ETA: 1:03 - loss: 1.9720 - regression_loss: 1.5994 - classification_loss: 0.3726 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9724 - regression_loss: 1.5999 - classification_loss: 0.3725 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9707 - regression_loss: 1.5985 - classification_loss: 0.3722 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9693 - regression_loss: 1.5973 - classification_loss: 0.3720 253/500 [==============>...............] - ETA: 1:02 - loss: 1.9706 - regression_loss: 1.5986 - classification_loss: 0.3720 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.9713 - regression_loss: 1.5993 - classification_loss: 0.3721 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9667 - regression_loss: 1.5956 - classification_loss: 0.3711 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9659 - regression_loss: 1.5946 - classification_loss: 0.3714 257/500 [==============>...............] - ETA: 1:01 - loss: 1.9670 - regression_loss: 1.5954 - classification_loss: 0.3716 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9677 - regression_loss: 1.5963 - classification_loss: 0.3713 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9691 - regression_loss: 1.5975 - classification_loss: 0.3716 260/500 [==============>...............] - ETA: 1:00 - loss: 1.9693 - regression_loss: 1.5978 - classification_loss: 0.3715 261/500 [==============>...............] - ETA: 1:00 - loss: 1.9723 - regression_loss: 1.6008 - classification_loss: 0.3715 262/500 [==============>...............] - ETA: 59s - loss: 1.9721 - regression_loss: 1.6006 - classification_loss: 0.3716  263/500 [==============>...............] - ETA: 59s - loss: 1.9722 - regression_loss: 1.6006 - classification_loss: 0.3716 264/500 [==============>...............] - ETA: 59s - loss: 1.9737 - regression_loss: 1.6025 - classification_loss: 0.3712 265/500 [==============>...............] - ETA: 59s - loss: 1.9745 - regression_loss: 1.6034 - classification_loss: 0.3712 266/500 [==============>...............] - ETA: 58s - loss: 1.9739 - regression_loss: 1.6030 - classification_loss: 0.3709 267/500 [===============>..............] - ETA: 58s - loss: 1.9743 - regression_loss: 1.6034 - classification_loss: 0.3709 268/500 [===============>..............] - ETA: 58s - loss: 1.9711 - regression_loss: 1.6011 - classification_loss: 0.3700 269/500 [===============>..............] - ETA: 58s - loss: 1.9720 - regression_loss: 1.6019 - classification_loss: 0.3701 270/500 [===============>..............] 
- ETA: 57s - loss: 1.9713 - regression_loss: 1.6016 - classification_loss: 0.3698 271/500 [===============>..............] - ETA: 57s - loss: 1.9703 - regression_loss: 1.6011 - classification_loss: 0.3692 272/500 [===============>..............] - ETA: 57s - loss: 1.9715 - regression_loss: 1.6027 - classification_loss: 0.3688 273/500 [===============>..............] - ETA: 57s - loss: 1.9708 - regression_loss: 1.6021 - classification_loss: 0.3687 274/500 [===============>..............] - ETA: 56s - loss: 1.9714 - regression_loss: 1.6028 - classification_loss: 0.3686 275/500 [===============>..............] - ETA: 56s - loss: 1.9713 - regression_loss: 1.6029 - classification_loss: 0.3684 276/500 [===============>..............] - ETA: 56s - loss: 1.9704 - regression_loss: 1.6022 - classification_loss: 0.3682 277/500 [===============>..............] - ETA: 56s - loss: 1.9715 - regression_loss: 1.6026 - classification_loss: 0.3689 278/500 [===============>..............] - ETA: 55s - loss: 1.9717 - regression_loss: 1.6029 - classification_loss: 0.3688 279/500 [===============>..............] - ETA: 55s - loss: 1.9725 - regression_loss: 1.6035 - classification_loss: 0.3690 280/500 [===============>..............] - ETA: 55s - loss: 1.9717 - regression_loss: 1.6030 - classification_loss: 0.3686 281/500 [===============>..............] - ETA: 55s - loss: 1.9731 - regression_loss: 1.6043 - classification_loss: 0.3688 282/500 [===============>..............] - ETA: 54s - loss: 1.9738 - regression_loss: 1.6050 - classification_loss: 0.3687 283/500 [===============>..............] - ETA: 54s - loss: 1.9758 - regression_loss: 1.6067 - classification_loss: 0.3691 284/500 [================>.............] - ETA: 54s - loss: 1.9756 - regression_loss: 1.6064 - classification_loss: 0.3692 285/500 [================>.............] - ETA: 54s - loss: 1.9751 - regression_loss: 1.6064 - classification_loss: 0.3687 286/500 [================>.............] 
- ETA: 53s - loss: 1.9767 - regression_loss: 1.6076 - classification_loss: 0.3691 287/500 [================>.............] - ETA: 53s - loss: 1.9786 - regression_loss: 1.6087 - classification_loss: 0.3699 288/500 [================>.............] - ETA: 53s - loss: 1.9787 - regression_loss: 1.6090 - classification_loss: 0.3697 289/500 [================>.............] - ETA: 53s - loss: 1.9791 - regression_loss: 1.6096 - classification_loss: 0.3695 290/500 [================>.............] - ETA: 52s - loss: 1.9796 - regression_loss: 1.6101 - classification_loss: 0.3695 291/500 [================>.............] - ETA: 52s - loss: 1.9765 - regression_loss: 1.6077 - classification_loss: 0.3688 292/500 [================>.............] - ETA: 52s - loss: 1.9752 - regression_loss: 1.6067 - classification_loss: 0.3684 293/500 [================>.............] - ETA: 52s - loss: 1.9756 - regression_loss: 1.6068 - classification_loss: 0.3688 294/500 [================>.............] - ETA: 51s - loss: 1.9762 - regression_loss: 1.6073 - classification_loss: 0.3689 295/500 [================>.............] - ETA: 51s - loss: 1.9763 - regression_loss: 1.6072 - classification_loss: 0.3690 296/500 [================>.............] - ETA: 51s - loss: 1.9750 - regression_loss: 1.6062 - classification_loss: 0.3688 297/500 [================>.............] - ETA: 51s - loss: 1.9738 - regression_loss: 1.6057 - classification_loss: 0.3681 298/500 [================>.............] - ETA: 50s - loss: 1.9717 - regression_loss: 1.6040 - classification_loss: 0.3677 299/500 [================>.............] - ETA: 50s - loss: 1.9722 - regression_loss: 1.6046 - classification_loss: 0.3675 300/500 [=================>............] - ETA: 50s - loss: 1.9727 - regression_loss: 1.6048 - classification_loss: 0.3679 301/500 [=================>............] - ETA: 49s - loss: 1.9719 - regression_loss: 1.6046 - classification_loss: 0.3674 302/500 [=================>............] 
- ETA: 49s - loss: 1.9727 - regression_loss: 1.6053 - classification_loss: 0.3674 303/500 [=================>............] - ETA: 49s - loss: 1.9716 - regression_loss: 1.6047 - classification_loss: 0.3670 304/500 [=================>............] - ETA: 49s - loss: 1.9714 - regression_loss: 1.6044 - classification_loss: 0.3670 305/500 [=================>............] - ETA: 48s - loss: 1.9692 - regression_loss: 1.6024 - classification_loss: 0.3668 306/500 [=================>............] - ETA: 48s - loss: 1.9699 - regression_loss: 1.6030 - classification_loss: 0.3669 307/500 [=================>............] - ETA: 48s - loss: 1.9709 - regression_loss: 1.6037 - classification_loss: 0.3672 308/500 [=================>............] - ETA: 48s - loss: 1.9711 - regression_loss: 1.6039 - classification_loss: 0.3673 309/500 [=================>............] - ETA: 47s - loss: 1.9713 - regression_loss: 1.6043 - classification_loss: 0.3669 310/500 [=================>............] - ETA: 47s - loss: 1.9713 - regression_loss: 1.6044 - classification_loss: 0.3669 311/500 [=================>............] - ETA: 47s - loss: 1.9691 - regression_loss: 1.6029 - classification_loss: 0.3662 312/500 [=================>............] - ETA: 47s - loss: 1.9699 - regression_loss: 1.6036 - classification_loss: 0.3663 313/500 [=================>............] - ETA: 46s - loss: 1.9682 - regression_loss: 1.6023 - classification_loss: 0.3659 314/500 [=================>............] - ETA: 46s - loss: 1.9671 - regression_loss: 1.6014 - classification_loss: 0.3657 315/500 [=================>............] - ETA: 46s - loss: 1.9662 - regression_loss: 1.6009 - classification_loss: 0.3653 316/500 [=================>............] - ETA: 46s - loss: 1.9668 - regression_loss: 1.6016 - classification_loss: 0.3653 317/500 [==================>...........] - ETA: 45s - loss: 1.9660 - regression_loss: 1.6010 - classification_loss: 0.3651 318/500 [==================>...........] 
- ETA: 45s - loss: 1.9652 - regression_loss: 1.5997 - classification_loss: 0.3654 319/500 [==================>...........] - ETA: 45s - loss: 1.9651 - regression_loss: 1.5997 - classification_loss: 0.3654 320/500 [==================>...........] - ETA: 45s - loss: 1.9650 - regression_loss: 1.5996 - classification_loss: 0.3654 321/500 [==================>...........] - ETA: 44s - loss: 1.9658 - regression_loss: 1.6004 - classification_loss: 0.3654 322/500 [==================>...........] - ETA: 44s - loss: 1.9679 - regression_loss: 1.6014 - classification_loss: 0.3665 323/500 [==================>...........] - ETA: 44s - loss: 1.9670 - regression_loss: 1.6008 - classification_loss: 0.3662 324/500 [==================>...........] - ETA: 44s - loss: 1.9661 - regression_loss: 1.6003 - classification_loss: 0.3658 325/500 [==================>...........] - ETA: 43s - loss: 1.9675 - regression_loss: 1.6015 - classification_loss: 0.3660 326/500 [==================>...........] - ETA: 43s - loss: 1.9678 - regression_loss: 1.6018 - classification_loss: 0.3660 327/500 [==================>...........] - ETA: 43s - loss: 1.9676 - regression_loss: 1.6019 - classification_loss: 0.3658 328/500 [==================>...........] - ETA: 43s - loss: 1.9687 - regression_loss: 1.6027 - classification_loss: 0.3660 329/500 [==================>...........] - ETA: 42s - loss: 1.9680 - regression_loss: 1.6020 - classification_loss: 0.3659 330/500 [==================>...........] - ETA: 42s - loss: 1.9704 - regression_loss: 1.6035 - classification_loss: 0.3668 331/500 [==================>...........] - ETA: 42s - loss: 1.9712 - regression_loss: 1.6045 - classification_loss: 0.3667 332/500 [==================>...........] - ETA: 42s - loss: 1.9676 - regression_loss: 1.6018 - classification_loss: 0.3658 333/500 [==================>...........] - ETA: 41s - loss: 1.9670 - regression_loss: 1.6014 - classification_loss: 0.3656 334/500 [===================>..........] 
- ETA: 41s - loss: 1.9667 - regression_loss: 1.6010 - classification_loss: 0.3656
500/500 [==============================] - 125s 251ms/step - loss: 1.9684 - regression_loss: 1.6054 - classification_loss: 0.3629
1172 instances of class plum with average precision: 0.5591
mAP: 0.5591
Epoch 00043: saving model to ./training/snapshots/resnet50_pascal_43.h5
Epoch 44/150
168/500 [=========>....................] - ETA: 1:23 - loss: 1.9336 - regression_loss: 1.5883 - classification_loss: 0.3453
169/500 [=========>....................]
- ETA: 1:22 - loss: 1.9360 - regression_loss: 1.5892 - classification_loss: 0.3468 170/500 [=========>....................] - ETA: 1:22 - loss: 1.9371 - regression_loss: 1.5900 - classification_loss: 0.3471 171/500 [=========>....................] - ETA: 1:22 - loss: 1.9317 - regression_loss: 1.5855 - classification_loss: 0.3462 172/500 [=========>....................] - ETA: 1:22 - loss: 1.9280 - regression_loss: 1.5817 - classification_loss: 0.3463 173/500 [=========>....................] - ETA: 1:21 - loss: 1.9276 - regression_loss: 1.5819 - classification_loss: 0.3457 174/500 [=========>....................] - ETA: 1:21 - loss: 1.9286 - regression_loss: 1.5828 - classification_loss: 0.3458 175/500 [=========>....................] - ETA: 1:21 - loss: 1.9249 - regression_loss: 1.5794 - classification_loss: 0.3455 176/500 [=========>....................] - ETA: 1:21 - loss: 1.9241 - regression_loss: 1.5787 - classification_loss: 0.3453 177/500 [=========>....................] - ETA: 1:20 - loss: 1.9240 - regression_loss: 1.5788 - classification_loss: 0.3452 178/500 [=========>....................] - ETA: 1:20 - loss: 1.9251 - regression_loss: 1.5798 - classification_loss: 0.3453 179/500 [=========>....................] - ETA: 1:20 - loss: 1.9231 - regression_loss: 1.5776 - classification_loss: 0.3455 180/500 [=========>....................] - ETA: 1:19 - loss: 1.9220 - regression_loss: 1.5769 - classification_loss: 0.3450 181/500 [=========>....................] - ETA: 1:19 - loss: 1.9235 - regression_loss: 1.5782 - classification_loss: 0.3454 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9245 - regression_loss: 1.5790 - classification_loss: 0.3455 183/500 [=========>....................] - ETA: 1:19 - loss: 1.9279 - regression_loss: 1.5821 - classification_loss: 0.3458 184/500 [==========>...................] - ETA: 1:18 - loss: 1.9267 - regression_loss: 1.5813 - classification_loss: 0.3454 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.9266 - regression_loss: 1.5813 - classification_loss: 0.3454 186/500 [==========>...................] - ETA: 1:18 - loss: 1.9305 - regression_loss: 1.5849 - classification_loss: 0.3456 187/500 [==========>...................] - ETA: 1:18 - loss: 1.9303 - regression_loss: 1.5849 - classification_loss: 0.3455 188/500 [==========>...................] - ETA: 1:17 - loss: 1.9320 - regression_loss: 1.5859 - classification_loss: 0.3462 189/500 [==========>...................] - ETA: 1:17 - loss: 1.9315 - regression_loss: 1.5856 - classification_loss: 0.3459 190/500 [==========>...................] - ETA: 1:17 - loss: 1.9299 - regression_loss: 1.5849 - classification_loss: 0.3450 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9309 - regression_loss: 1.5858 - classification_loss: 0.3451 192/500 [==========>...................] - ETA: 1:16 - loss: 1.9368 - regression_loss: 1.5899 - classification_loss: 0.3469 193/500 [==========>...................] - ETA: 1:16 - loss: 1.9338 - regression_loss: 1.5874 - classification_loss: 0.3465 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9339 - regression_loss: 1.5872 - classification_loss: 0.3466 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9360 - regression_loss: 1.5892 - classification_loss: 0.3468 196/500 [==========>...................] - ETA: 1:15 - loss: 1.9374 - regression_loss: 1.5906 - classification_loss: 0.3468 197/500 [==========>...................] - ETA: 1:15 - loss: 1.9369 - regression_loss: 1.5903 - classification_loss: 0.3465 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9367 - regression_loss: 1.5907 - classification_loss: 0.3459 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9345 - regression_loss: 1.5882 - classification_loss: 0.3463 200/500 [===========>..................] - ETA: 1:15 - loss: 1.9363 - regression_loss: 1.5895 - classification_loss: 0.3468 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.9380 - regression_loss: 1.5907 - classification_loss: 0.3473 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9366 - regression_loss: 1.5895 - classification_loss: 0.3471 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9377 - regression_loss: 1.5904 - classification_loss: 0.3473 204/500 [===========>..................] - ETA: 1:14 - loss: 1.9362 - regression_loss: 1.5888 - classification_loss: 0.3473 205/500 [===========>..................] - ETA: 1:13 - loss: 1.9360 - regression_loss: 1.5887 - classification_loss: 0.3472 206/500 [===========>..................] - ETA: 1:13 - loss: 1.9398 - regression_loss: 1.5913 - classification_loss: 0.3485 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9377 - regression_loss: 1.5897 - classification_loss: 0.3481 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9418 - regression_loss: 1.5937 - classification_loss: 0.3481 209/500 [===========>..................] - ETA: 1:12 - loss: 1.9436 - regression_loss: 1.5950 - classification_loss: 0.3485 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9394 - regression_loss: 1.5914 - classification_loss: 0.3480 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9421 - regression_loss: 1.5937 - classification_loss: 0.3483 212/500 [===========>..................] - ETA: 1:12 - loss: 1.9435 - regression_loss: 1.5946 - classification_loss: 0.3489 213/500 [===========>..................] - ETA: 1:11 - loss: 1.9415 - regression_loss: 1.5930 - classification_loss: 0.3485 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9450 - regression_loss: 1.5955 - classification_loss: 0.3496 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9407 - regression_loss: 1.5920 - classification_loss: 0.3487 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9409 - regression_loss: 1.5925 - classification_loss: 0.3484 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.9423 - regression_loss: 1.5938 - classification_loss: 0.3485 218/500 [============>.................] - ETA: 1:10 - loss: 1.9405 - regression_loss: 1.5917 - classification_loss: 0.3487 219/500 [============>.................] - ETA: 1:10 - loss: 1.9397 - regression_loss: 1.5909 - classification_loss: 0.3488 220/500 [============>.................] - ETA: 1:10 - loss: 1.9391 - regression_loss: 1.5907 - classification_loss: 0.3485 221/500 [============>.................] - ETA: 1:09 - loss: 1.9389 - regression_loss: 1.5907 - classification_loss: 0.3482 222/500 [============>.................] - ETA: 1:09 - loss: 1.9385 - regression_loss: 1.5907 - classification_loss: 0.3478 223/500 [============>.................] - ETA: 1:09 - loss: 1.9390 - regression_loss: 1.5909 - classification_loss: 0.3481 224/500 [============>.................] - ETA: 1:09 - loss: 1.9389 - regression_loss: 1.5909 - classification_loss: 0.3481 225/500 [============>.................] - ETA: 1:08 - loss: 1.9335 - regression_loss: 1.5864 - classification_loss: 0.3471 226/500 [============>.................] - ETA: 1:08 - loss: 1.9323 - regression_loss: 1.5856 - classification_loss: 0.3467 227/500 [============>.................] - ETA: 1:08 - loss: 1.9305 - regression_loss: 1.5845 - classification_loss: 0.3460 228/500 [============>.................] - ETA: 1:08 - loss: 1.9315 - regression_loss: 1.5856 - classification_loss: 0.3460 229/500 [============>.................] - ETA: 1:07 - loss: 1.9322 - regression_loss: 1.5860 - classification_loss: 0.3462 230/500 [============>.................] - ETA: 1:07 - loss: 1.9324 - regression_loss: 1.5863 - classification_loss: 0.3461 231/500 [============>.................] - ETA: 1:07 - loss: 1.9334 - regression_loss: 1.5871 - classification_loss: 0.3463 232/500 [============>.................] - ETA: 1:07 - loss: 1.9338 - regression_loss: 1.5874 - classification_loss: 0.3464 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.9342 - regression_loss: 1.5880 - classification_loss: 0.3462 234/500 [=============>................] - ETA: 1:06 - loss: 1.9345 - regression_loss: 1.5880 - classification_loss: 0.3465 235/500 [=============>................] - ETA: 1:06 - loss: 1.9324 - regression_loss: 1.5865 - classification_loss: 0.3460 236/500 [=============>................] - ETA: 1:06 - loss: 1.9347 - regression_loss: 1.5885 - classification_loss: 0.3463 237/500 [=============>................] - ETA: 1:05 - loss: 1.9356 - regression_loss: 1.5889 - classification_loss: 0.3466 238/500 [=============>................] - ETA: 1:05 - loss: 1.9330 - regression_loss: 1.5867 - classification_loss: 0.3463 239/500 [=============>................] - ETA: 1:05 - loss: 1.9360 - regression_loss: 1.5896 - classification_loss: 0.3465 240/500 [=============>................] - ETA: 1:05 - loss: 1.9352 - regression_loss: 1.5890 - classification_loss: 0.3462 241/500 [=============>................] - ETA: 1:04 - loss: 1.9311 - regression_loss: 1.5857 - classification_loss: 0.3454 242/500 [=============>................] - ETA: 1:04 - loss: 1.9323 - regression_loss: 1.5866 - classification_loss: 0.3457 243/500 [=============>................] - ETA: 1:04 - loss: 1.9326 - regression_loss: 1.5867 - classification_loss: 0.3458 244/500 [=============>................] - ETA: 1:04 - loss: 1.9342 - regression_loss: 1.5881 - classification_loss: 0.3461 245/500 [=============>................] - ETA: 1:03 - loss: 1.9302 - regression_loss: 1.5849 - classification_loss: 0.3453 246/500 [=============>................] - ETA: 1:03 - loss: 1.9271 - regression_loss: 1.5823 - classification_loss: 0.3447 247/500 [=============>................] - ETA: 1:03 - loss: 1.9270 - regression_loss: 1.5823 - classification_loss: 0.3447 248/500 [=============>................] - ETA: 1:03 - loss: 1.9262 - regression_loss: 1.5818 - classification_loss: 0.3443 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.9251 - regression_loss: 1.5813 - classification_loss: 0.3438 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9277 - regression_loss: 1.5829 - classification_loss: 0.3447 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9260 - regression_loss: 1.5815 - classification_loss: 0.3445 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9261 - regression_loss: 1.5818 - classification_loss: 0.3442 253/500 [==============>...............] - ETA: 1:01 - loss: 1.9273 - regression_loss: 1.5826 - classification_loss: 0.3447 254/500 [==============>...............] - ETA: 1:01 - loss: 1.9264 - regression_loss: 1.5820 - classification_loss: 0.3444 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9268 - regression_loss: 1.5823 - classification_loss: 0.3445 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9277 - regression_loss: 1.5831 - classification_loss: 0.3446 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9287 - regression_loss: 1.5839 - classification_loss: 0.3447 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9300 - regression_loss: 1.5848 - classification_loss: 0.3452 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9293 - regression_loss: 1.5843 - classification_loss: 0.3450 260/500 [==============>...............] - ETA: 1:00 - loss: 1.9313 - regression_loss: 1.5859 - classification_loss: 0.3454 261/500 [==============>...............] - ETA: 59s - loss: 1.9321 - regression_loss: 1.5867 - classification_loss: 0.3454  262/500 [==============>...............] - ETA: 59s - loss: 1.9304 - regression_loss: 1.5849 - classification_loss: 0.3455 263/500 [==============>...............] - ETA: 59s - loss: 1.9299 - regression_loss: 1.5846 - classification_loss: 0.3453 264/500 [==============>...............] - ETA: 59s - loss: 1.9309 - regression_loss: 1.5851 - classification_loss: 0.3458 265/500 [==============>...............] 
- ETA: 58s - loss: 1.9299 - regression_loss: 1.5849 - classification_loss: 0.3450 266/500 [==============>...............] - ETA: 58s - loss: 1.9295 - regression_loss: 1.5846 - classification_loss: 0.3449 267/500 [===============>..............] - ETA: 58s - loss: 1.9274 - regression_loss: 1.5828 - classification_loss: 0.3445 268/500 [===============>..............] - ETA: 58s - loss: 1.9265 - regression_loss: 1.5820 - classification_loss: 0.3445 269/500 [===============>..............] - ETA: 57s - loss: 1.9270 - regression_loss: 1.5817 - classification_loss: 0.3454 270/500 [===============>..............] - ETA: 57s - loss: 1.9286 - regression_loss: 1.5821 - classification_loss: 0.3465 271/500 [===============>..............] - ETA: 57s - loss: 1.9272 - regression_loss: 1.5811 - classification_loss: 0.3461 272/500 [===============>..............] - ETA: 57s - loss: 1.9285 - regression_loss: 1.5818 - classification_loss: 0.3467 273/500 [===============>..............] - ETA: 56s - loss: 1.9254 - regression_loss: 1.5794 - classification_loss: 0.3460 274/500 [===============>..............] - ETA: 56s - loss: 1.9320 - regression_loss: 1.5771 - classification_loss: 0.3550 275/500 [===============>..............] - ETA: 56s - loss: 1.9330 - regression_loss: 1.5779 - classification_loss: 0.3552 276/500 [===============>..............] - ETA: 56s - loss: 1.9358 - regression_loss: 1.5808 - classification_loss: 0.3550 277/500 [===============>..............] - ETA: 55s - loss: 1.9370 - regression_loss: 1.5820 - classification_loss: 0.3550 278/500 [===============>..............] - ETA: 55s - loss: 1.9392 - regression_loss: 1.5835 - classification_loss: 0.3557 279/500 [===============>..............] - ETA: 55s - loss: 1.9398 - regression_loss: 1.5843 - classification_loss: 0.3556 280/500 [===============>..............] - ETA: 55s - loss: 1.9391 - regression_loss: 1.5836 - classification_loss: 0.3556 281/500 [===============>..............] 
- ETA: 54s - loss: 1.9362 - regression_loss: 1.5810 - classification_loss: 0.3552 282/500 [===============>..............] - ETA: 54s - loss: 1.9369 - regression_loss: 1.5818 - classification_loss: 0.3552 283/500 [===============>..............] - ETA: 54s - loss: 1.9377 - regression_loss: 1.5825 - classification_loss: 0.3552 284/500 [================>.............] - ETA: 54s - loss: 1.9407 - regression_loss: 1.5844 - classification_loss: 0.3563 285/500 [================>.............] - ETA: 53s - loss: 1.9400 - regression_loss: 1.5839 - classification_loss: 0.3560 286/500 [================>.............] - ETA: 53s - loss: 1.9413 - regression_loss: 1.5851 - classification_loss: 0.3562 287/500 [================>.............] - ETA: 53s - loss: 1.9396 - regression_loss: 1.5839 - classification_loss: 0.3557 288/500 [================>.............] - ETA: 53s - loss: 1.9400 - regression_loss: 1.5844 - classification_loss: 0.3556 289/500 [================>.............] - ETA: 52s - loss: 1.9386 - regression_loss: 1.5835 - classification_loss: 0.3551 290/500 [================>.............] - ETA: 52s - loss: 1.9362 - regression_loss: 1.5814 - classification_loss: 0.3548 291/500 [================>.............] - ETA: 52s - loss: 1.9351 - regression_loss: 1.5806 - classification_loss: 0.3545 292/500 [================>.............] - ETA: 52s - loss: 1.9354 - regression_loss: 1.5811 - classification_loss: 0.3543 293/500 [================>.............] - ETA: 51s - loss: 1.9320 - regression_loss: 1.5782 - classification_loss: 0.3537 294/500 [================>.............] - ETA: 51s - loss: 1.9337 - regression_loss: 1.5800 - classification_loss: 0.3538 295/500 [================>.............] - ETA: 51s - loss: 1.9330 - regression_loss: 1.5795 - classification_loss: 0.3535 296/500 [================>.............] - ETA: 51s - loss: 1.9338 - regression_loss: 1.5804 - classification_loss: 0.3535 297/500 [================>.............] 
- ETA: 50s - loss: 1.9324 - regression_loss: 1.5791 - classification_loss: 0.3533 298/500 [================>.............] - ETA: 50s - loss: 1.9334 - regression_loss: 1.5797 - classification_loss: 0.3537 299/500 [================>.............] - ETA: 50s - loss: 1.9346 - regression_loss: 1.5809 - classification_loss: 0.3537 300/500 [=================>............] - ETA: 50s - loss: 1.9337 - regression_loss: 1.5803 - classification_loss: 0.3534 301/500 [=================>............] - ETA: 49s - loss: 1.9338 - regression_loss: 1.5805 - classification_loss: 0.3533 302/500 [=================>............] - ETA: 49s - loss: 1.9332 - regression_loss: 1.5804 - classification_loss: 0.3528 303/500 [=================>............] - ETA: 49s - loss: 1.9341 - regression_loss: 1.5810 - classification_loss: 0.3531 304/500 [=================>............] - ETA: 49s - loss: 1.9324 - regression_loss: 1.5791 - classification_loss: 0.3533 305/500 [=================>............] - ETA: 48s - loss: 1.9301 - regression_loss: 1.5774 - classification_loss: 0.3527 306/500 [=================>............] - ETA: 48s - loss: 1.9290 - regression_loss: 1.5767 - classification_loss: 0.3523 307/500 [=================>............] - ETA: 48s - loss: 1.9252 - regression_loss: 1.5738 - classification_loss: 0.3514 308/500 [=================>............] - ETA: 48s - loss: 1.9266 - regression_loss: 1.5756 - classification_loss: 0.3511 309/500 [=================>............] - ETA: 47s - loss: 1.9273 - regression_loss: 1.5763 - classification_loss: 0.3510 310/500 [=================>............] - ETA: 47s - loss: 1.9292 - regression_loss: 1.5778 - classification_loss: 0.3513 311/500 [=================>............] - ETA: 47s - loss: 1.9302 - regression_loss: 1.5787 - classification_loss: 0.3515 312/500 [=================>............] - ETA: 47s - loss: 1.9311 - regression_loss: 1.5795 - classification_loss: 0.3516 313/500 [=================>............] 
- ETA: 46s - loss: 1.9309 - regression_loss: 1.5792 - classification_loss: 0.3517 314/500 [=================>............] - ETA: 46s - loss: 1.9318 - regression_loss: 1.5798 - classification_loss: 0.3519 315/500 [=================>............] - ETA: 46s - loss: 1.9309 - regression_loss: 1.5795 - classification_loss: 0.3514 316/500 [=================>............] - ETA: 46s - loss: 1.9348 - regression_loss: 1.5825 - classification_loss: 0.3523 317/500 [==================>...........] - ETA: 45s - loss: 1.9351 - regression_loss: 1.5829 - classification_loss: 0.3522 318/500 [==================>...........] - ETA: 45s - loss: 1.9355 - regression_loss: 1.5833 - classification_loss: 0.3522 319/500 [==================>...........] - ETA: 45s - loss: 1.9363 - regression_loss: 1.5838 - classification_loss: 0.3524 320/500 [==================>...........] - ETA: 45s - loss: 1.9363 - regression_loss: 1.5839 - classification_loss: 0.3524 321/500 [==================>...........] - ETA: 44s - loss: 1.9381 - regression_loss: 1.5853 - classification_loss: 0.3528 322/500 [==================>...........] - ETA: 44s - loss: 1.9374 - regression_loss: 1.5847 - classification_loss: 0.3527 323/500 [==================>...........] - ETA: 44s - loss: 1.9378 - regression_loss: 1.5853 - classification_loss: 0.3525 324/500 [==================>...........] - ETA: 44s - loss: 1.9395 - regression_loss: 1.5863 - classification_loss: 0.3532 325/500 [==================>...........] - ETA: 43s - loss: 1.9394 - regression_loss: 1.5864 - classification_loss: 0.3530 326/500 [==================>...........] - ETA: 43s - loss: 1.9393 - regression_loss: 1.5864 - classification_loss: 0.3528 327/500 [==================>...........] - ETA: 43s - loss: 1.9404 - regression_loss: 1.5875 - classification_loss: 0.3529 328/500 [==================>...........] - ETA: 43s - loss: 1.9409 - regression_loss: 1.5881 - classification_loss: 0.3528 329/500 [==================>...........] 
- ETA: 42s - loss: 1.9425 - regression_loss: 1.5893 - classification_loss: 0.3532 330/500 [==================>...........] - ETA: 42s - loss: 1.9427 - regression_loss: 1.5898 - classification_loss: 0.3529 331/500 [==================>...........] - ETA: 42s - loss: 1.9424 - regression_loss: 1.5897 - classification_loss: 0.3527 332/500 [==================>...........] - ETA: 42s - loss: 1.9431 - regression_loss: 1.5902 - classification_loss: 0.3529 333/500 [==================>...........] - ETA: 41s - loss: 1.9445 - regression_loss: 1.5913 - classification_loss: 0.3532 334/500 [===================>..........] - ETA: 41s - loss: 1.9431 - regression_loss: 1.5902 - classification_loss: 0.3529 335/500 [===================>..........] - ETA: 41s - loss: 1.9425 - regression_loss: 1.5899 - classification_loss: 0.3526 336/500 [===================>..........] - ETA: 41s - loss: 1.9413 - regression_loss: 1.5889 - classification_loss: 0.3524 337/500 [===================>..........] - ETA: 40s - loss: 1.9393 - regression_loss: 1.5873 - classification_loss: 0.3520 338/500 [===================>..........] - ETA: 40s - loss: 1.9397 - regression_loss: 1.5877 - classification_loss: 0.3520 339/500 [===================>..........] - ETA: 40s - loss: 1.9381 - regression_loss: 1.5864 - classification_loss: 0.3518 340/500 [===================>..........] - ETA: 40s - loss: 1.9388 - regression_loss: 1.5871 - classification_loss: 0.3517 341/500 [===================>..........] - ETA: 39s - loss: 1.9388 - regression_loss: 1.5872 - classification_loss: 0.3516 342/500 [===================>..........] - ETA: 39s - loss: 1.9387 - regression_loss: 1.5869 - classification_loss: 0.3518 343/500 [===================>..........] - ETA: 39s - loss: 1.9391 - regression_loss: 1.5872 - classification_loss: 0.3519 344/500 [===================>..........] - ETA: 39s - loss: 1.9372 - regression_loss: 1.5859 - classification_loss: 0.3513 345/500 [===================>..........] 
- ETA: 38s - loss: 1.9358 - regression_loss: 1.5850 - classification_loss: 0.3507 346/500 [===================>..........] - ETA: 38s - loss: 1.9364 - regression_loss: 1.5856 - classification_loss: 0.3508 347/500 [===================>..........] - ETA: 38s - loss: 1.9373 - regression_loss: 1.5863 - classification_loss: 0.3510 348/500 [===================>..........] - ETA: 38s - loss: 1.9357 - regression_loss: 1.5852 - classification_loss: 0.3505 349/500 [===================>..........] - ETA: 37s - loss: 1.9365 - regression_loss: 1.5858 - classification_loss: 0.3507 350/500 [====================>.........] - ETA: 37s - loss: 1.9365 - regression_loss: 1.5859 - classification_loss: 0.3506 351/500 [====================>.........] - ETA: 37s - loss: 1.9369 - regression_loss: 1.5861 - classification_loss: 0.3507 352/500 [====================>.........] - ETA: 37s - loss: 1.9381 - regression_loss: 1.5873 - classification_loss: 0.3508 353/500 [====================>.........] - ETA: 36s - loss: 1.9376 - regression_loss: 1.5868 - classification_loss: 0.3508 354/500 [====================>.........] - ETA: 36s - loss: 1.9375 - regression_loss: 1.5867 - classification_loss: 0.3508 355/500 [====================>.........] - ETA: 36s - loss: 1.9382 - regression_loss: 1.5871 - classification_loss: 0.3511 356/500 [====================>.........] - ETA: 36s - loss: 1.9375 - regression_loss: 1.5867 - classification_loss: 0.3508 357/500 [====================>.........] - ETA: 35s - loss: 1.9377 - regression_loss: 1.5867 - classification_loss: 0.3510 358/500 [====================>.........] - ETA: 35s - loss: 1.9385 - regression_loss: 1.5871 - classification_loss: 0.3514 359/500 [====================>.........] - ETA: 35s - loss: 1.9393 - regression_loss: 1.5878 - classification_loss: 0.3515 360/500 [====================>.........] - ETA: 35s - loss: 1.9404 - regression_loss: 1.5889 - classification_loss: 0.3516 361/500 [====================>.........] 
- ETA: 34s - loss: 1.9414 - regression_loss: 1.5899 - classification_loss: 0.3515 362/500 [====================>.........] - ETA: 34s - loss: 1.9418 - regression_loss: 1.5904 - classification_loss: 0.3514 363/500 [====================>.........] - ETA: 34s - loss: 1.9415 - regression_loss: 1.5894 - classification_loss: 0.3520 364/500 [====================>.........] - ETA: 34s - loss: 1.9418 - regression_loss: 1.5899 - classification_loss: 0.3520 365/500 [====================>.........] - ETA: 33s - loss: 1.9391 - regression_loss: 1.5878 - classification_loss: 0.3513 366/500 [====================>.........] - ETA: 33s - loss: 1.9389 - regression_loss: 1.5876 - classification_loss: 0.3512 367/500 [=====================>........] - ETA: 33s - loss: 1.9390 - regression_loss: 1.5878 - classification_loss: 0.3512 368/500 [=====================>........] - ETA: 33s - loss: 1.9405 - regression_loss: 1.5890 - classification_loss: 0.3515 369/500 [=====================>........] - ETA: 32s - loss: 1.9401 - regression_loss: 1.5883 - classification_loss: 0.3518 370/500 [=====================>........] - ETA: 32s - loss: 1.9427 - regression_loss: 1.5900 - classification_loss: 0.3527 371/500 [=====================>........] - ETA: 32s - loss: 1.9437 - regression_loss: 1.5908 - classification_loss: 0.3529 372/500 [=====================>........] - ETA: 32s - loss: 1.9438 - regression_loss: 1.5910 - classification_loss: 0.3529 373/500 [=====================>........] - ETA: 31s - loss: 1.9432 - regression_loss: 1.5905 - classification_loss: 0.3527 374/500 [=====================>........] - ETA: 31s - loss: 1.9432 - regression_loss: 1.5907 - classification_loss: 0.3526 375/500 [=====================>........] - ETA: 31s - loss: 1.9418 - regression_loss: 1.5896 - classification_loss: 0.3522 376/500 [=====================>........] - ETA: 31s - loss: 1.9441 - regression_loss: 1.5916 - classification_loss: 0.3525 377/500 [=====================>........] 
[per-step progress updates for epoch 44, steps 378–499, elided]
500/500 [==============================] - 125s 250ms/step - loss: 1.9588 - regression_loss: 1.6037 - classification_loss: 0.3551
1172 instances of class plum with average precision: 0.5476
mAP: 0.5476
Epoch 00044: saving model to ./training/snapshots/resnet50_pascal_44.h5
Epoch 45/150
[per-step progress updates for epoch 45, steps 1–212, elided]
- ETA: 1:12 - loss: 1.9301 - regression_loss: 1.5797 - classification_loss: 0.3504 213/500 [===========>..................] - ETA: 1:12 - loss: 1.9316 - regression_loss: 1.5811 - classification_loss: 0.3504 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9340 - regression_loss: 1.5834 - classification_loss: 0.3505 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9352 - regression_loss: 1.5846 - classification_loss: 0.3506 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9371 - regression_loss: 1.5864 - classification_loss: 0.3507 217/500 [============>.................] - ETA: 1:11 - loss: 1.9392 - regression_loss: 1.5879 - classification_loss: 0.3514 218/500 [============>.................] - ETA: 1:10 - loss: 1.9403 - regression_loss: 1.5890 - classification_loss: 0.3513 219/500 [============>.................] - ETA: 1:10 - loss: 1.9375 - regression_loss: 1.5865 - classification_loss: 0.3511 220/500 [============>.................] - ETA: 1:10 - loss: 1.9352 - regression_loss: 1.5845 - classification_loss: 0.3507 221/500 [============>.................] - ETA: 1:10 - loss: 1.9331 - regression_loss: 1.5825 - classification_loss: 0.3506 222/500 [============>.................] - ETA: 1:09 - loss: 1.9300 - regression_loss: 1.5804 - classification_loss: 0.3496 223/500 [============>.................] - ETA: 1:09 - loss: 1.9307 - regression_loss: 1.5806 - classification_loss: 0.3501 224/500 [============>.................] - ETA: 1:09 - loss: 1.9331 - regression_loss: 1.5827 - classification_loss: 0.3504 225/500 [============>.................] - ETA: 1:09 - loss: 1.9345 - regression_loss: 1.5836 - classification_loss: 0.3510 226/500 [============>.................] - ETA: 1:08 - loss: 1.9328 - regression_loss: 1.5824 - classification_loss: 0.3504 227/500 [============>.................] - ETA: 1:08 - loss: 1.9331 - regression_loss: 1.5823 - classification_loss: 0.3507 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.9346 - regression_loss: 1.5837 - classification_loss: 0.3510 229/500 [============>.................] - ETA: 1:08 - loss: 1.9335 - regression_loss: 1.5828 - classification_loss: 0.3507 230/500 [============>.................] - ETA: 1:07 - loss: 1.9361 - regression_loss: 1.5851 - classification_loss: 0.3509 231/500 [============>.................] - ETA: 1:07 - loss: 1.9348 - regression_loss: 1.5844 - classification_loss: 0.3504 232/500 [============>.................] - ETA: 1:07 - loss: 1.9361 - regression_loss: 1.5856 - classification_loss: 0.3505 233/500 [============>.................] - ETA: 1:07 - loss: 1.9371 - regression_loss: 1.5867 - classification_loss: 0.3504 234/500 [=============>................] - ETA: 1:06 - loss: 1.9374 - regression_loss: 1.5870 - classification_loss: 0.3504 235/500 [=============>................] - ETA: 1:06 - loss: 1.9359 - regression_loss: 1.5856 - classification_loss: 0.3503 236/500 [=============>................] - ETA: 1:06 - loss: 1.9361 - regression_loss: 1.5862 - classification_loss: 0.3499 237/500 [=============>................] - ETA: 1:06 - loss: 1.9381 - regression_loss: 1.5877 - classification_loss: 0.3505 238/500 [=============>................] - ETA: 1:05 - loss: 1.9368 - regression_loss: 1.5866 - classification_loss: 0.3501 239/500 [=============>................] - ETA: 1:05 - loss: 1.9345 - regression_loss: 1.5851 - classification_loss: 0.3494 240/500 [=============>................] - ETA: 1:05 - loss: 1.9364 - regression_loss: 1.5865 - classification_loss: 0.3499 241/500 [=============>................] - ETA: 1:05 - loss: 1.9338 - regression_loss: 1.5842 - classification_loss: 0.3496 242/500 [=============>................] - ETA: 1:04 - loss: 1.9338 - regression_loss: 1.5844 - classification_loss: 0.3494 243/500 [=============>................] - ETA: 1:04 - loss: 1.9363 - regression_loss: 1.5864 - classification_loss: 0.3499 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.9376 - regression_loss: 1.5873 - classification_loss: 0.3504 245/500 [=============>................] - ETA: 1:04 - loss: 1.9382 - regression_loss: 1.5877 - classification_loss: 0.3505 246/500 [=============>................] - ETA: 1:03 - loss: 1.9423 - regression_loss: 1.5903 - classification_loss: 0.3520 247/500 [=============>................] - ETA: 1:03 - loss: 1.9431 - regression_loss: 1.5910 - classification_loss: 0.3521 248/500 [=============>................] - ETA: 1:03 - loss: 1.9429 - regression_loss: 1.5911 - classification_loss: 0.3518 249/500 [=============>................] - ETA: 1:03 - loss: 1.9443 - regression_loss: 1.5921 - classification_loss: 0.3522 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9456 - regression_loss: 1.5929 - classification_loss: 0.3528 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9460 - regression_loss: 1.5934 - classification_loss: 0.3526 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9519 - regression_loss: 1.5981 - classification_loss: 0.3538 253/500 [==============>...............] - ETA: 1:02 - loss: 1.9505 - regression_loss: 1.5972 - classification_loss: 0.3533 254/500 [==============>...............] - ETA: 1:01 - loss: 1.9480 - regression_loss: 1.5954 - classification_loss: 0.3526 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9472 - regression_loss: 1.5945 - classification_loss: 0.3526 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9463 - regression_loss: 1.5934 - classification_loss: 0.3529 257/500 [==============>...............] - ETA: 1:01 - loss: 1.9454 - regression_loss: 1.5928 - classification_loss: 0.3526 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9432 - regression_loss: 1.5913 - classification_loss: 0.3519 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9456 - regression_loss: 1.5929 - classification_loss: 0.3528 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.9464 - regression_loss: 1.5940 - classification_loss: 0.3524 261/500 [==============>...............] - ETA: 1:00 - loss: 1.9446 - regression_loss: 1.5925 - classification_loss: 0.3521 262/500 [==============>...............] - ETA: 59s - loss: 1.9447 - regression_loss: 1.5927 - classification_loss: 0.3520  263/500 [==============>...............] - ETA: 59s - loss: 1.9450 - regression_loss: 1.5925 - classification_loss: 0.3525 264/500 [==============>...............] - ETA: 59s - loss: 1.9461 - regression_loss: 1.5936 - classification_loss: 0.3525 265/500 [==============>...............] - ETA: 59s - loss: 1.9479 - regression_loss: 1.5952 - classification_loss: 0.3527 266/500 [==============>...............] - ETA: 58s - loss: 1.9480 - regression_loss: 1.5954 - classification_loss: 0.3526 267/500 [===============>..............] - ETA: 58s - loss: 1.9484 - regression_loss: 1.5957 - classification_loss: 0.3527 268/500 [===============>..............] - ETA: 58s - loss: 1.9465 - regression_loss: 1.5942 - classification_loss: 0.3523 269/500 [===============>..............] - ETA: 58s - loss: 1.9447 - regression_loss: 1.5928 - classification_loss: 0.3518 270/500 [===============>..............] - ETA: 57s - loss: 1.9453 - regression_loss: 1.5936 - classification_loss: 0.3518 271/500 [===============>..............] - ETA: 57s - loss: 1.9473 - regression_loss: 1.5929 - classification_loss: 0.3544 272/500 [===============>..............] - ETA: 57s - loss: 1.9482 - regression_loss: 1.5931 - classification_loss: 0.3551 273/500 [===============>..............] - ETA: 57s - loss: 1.9472 - regression_loss: 1.5923 - classification_loss: 0.3548 274/500 [===============>..............] - ETA: 56s - loss: 1.9496 - regression_loss: 1.5942 - classification_loss: 0.3554 275/500 [===============>..............] - ETA: 56s - loss: 1.9492 - regression_loss: 1.5938 - classification_loss: 0.3553 276/500 [===============>..............] 
- ETA: 56s - loss: 1.9503 - regression_loss: 1.5946 - classification_loss: 0.3557 277/500 [===============>..............] - ETA: 56s - loss: 1.9491 - regression_loss: 1.5937 - classification_loss: 0.3554 278/500 [===============>..............] - ETA: 55s - loss: 1.9491 - regression_loss: 1.5937 - classification_loss: 0.3554 279/500 [===============>..............] - ETA: 55s - loss: 1.9486 - regression_loss: 1.5934 - classification_loss: 0.3553 280/500 [===============>..............] - ETA: 55s - loss: 1.9469 - regression_loss: 1.5918 - classification_loss: 0.3550 281/500 [===============>..............] - ETA: 55s - loss: 1.9478 - regression_loss: 1.5925 - classification_loss: 0.3553 282/500 [===============>..............] - ETA: 54s - loss: 1.9485 - regression_loss: 1.5928 - classification_loss: 0.3557 283/500 [===============>..............] - ETA: 54s - loss: 1.9500 - regression_loss: 1.5941 - classification_loss: 0.3559 284/500 [================>.............] - ETA: 54s - loss: 1.9515 - regression_loss: 1.5954 - classification_loss: 0.3561 285/500 [================>.............] - ETA: 54s - loss: 1.9511 - regression_loss: 1.5952 - classification_loss: 0.3559 286/500 [================>.............] - ETA: 53s - loss: 1.9505 - regression_loss: 1.5947 - classification_loss: 0.3558 287/500 [================>.............] - ETA: 53s - loss: 1.9508 - regression_loss: 1.5951 - classification_loss: 0.3556 288/500 [================>.............] - ETA: 53s - loss: 1.9505 - regression_loss: 1.5940 - classification_loss: 0.3565 289/500 [================>.............] - ETA: 53s - loss: 1.9499 - regression_loss: 1.5938 - classification_loss: 0.3562 290/500 [================>.............] - ETA: 52s - loss: 1.9518 - regression_loss: 1.5953 - classification_loss: 0.3564 291/500 [================>.............] - ETA: 52s - loss: 1.9503 - regression_loss: 1.5937 - classification_loss: 0.3566 292/500 [================>.............] 
- ETA: 52s - loss: 1.9500 - regression_loss: 1.5936 - classification_loss: 0.3564 293/500 [================>.............] - ETA: 51s - loss: 1.9482 - regression_loss: 1.5921 - classification_loss: 0.3561 294/500 [================>.............] - ETA: 51s - loss: 1.9461 - regression_loss: 1.5901 - classification_loss: 0.3560 295/500 [================>.............] - ETA: 51s - loss: 1.9462 - regression_loss: 1.5901 - classification_loss: 0.3560 296/500 [================>.............] - ETA: 51s - loss: 1.9455 - regression_loss: 1.5894 - classification_loss: 0.3561 297/500 [================>.............] - ETA: 50s - loss: 1.9439 - regression_loss: 1.5883 - classification_loss: 0.3557 298/500 [================>.............] - ETA: 50s - loss: 1.9433 - regression_loss: 1.5880 - classification_loss: 0.3553 299/500 [================>.............] - ETA: 50s - loss: 1.9448 - regression_loss: 1.5890 - classification_loss: 0.3558 300/500 [=================>............] - ETA: 50s - loss: 1.9453 - regression_loss: 1.5894 - classification_loss: 0.3560 301/500 [=================>............] - ETA: 49s - loss: 1.9453 - regression_loss: 1.5895 - classification_loss: 0.3558 302/500 [=================>............] - ETA: 49s - loss: 1.9456 - regression_loss: 1.5899 - classification_loss: 0.3557 303/500 [=================>............] - ETA: 49s - loss: 1.9462 - regression_loss: 1.5906 - classification_loss: 0.3556 304/500 [=================>............] - ETA: 49s - loss: 1.9453 - regression_loss: 1.5902 - classification_loss: 0.3552 305/500 [=================>............] - ETA: 48s - loss: 1.9456 - regression_loss: 1.5905 - classification_loss: 0.3551 306/500 [=================>............] - ETA: 48s - loss: 1.9450 - regression_loss: 1.5901 - classification_loss: 0.3549 307/500 [=================>............] - ETA: 48s - loss: 1.9441 - regression_loss: 1.5897 - classification_loss: 0.3544 308/500 [=================>............] 
- ETA: 48s - loss: 1.9424 - regression_loss: 1.5882 - classification_loss: 0.3542 309/500 [=================>............] - ETA: 47s - loss: 1.9410 - regression_loss: 1.5874 - classification_loss: 0.3537 310/500 [=================>............] - ETA: 47s - loss: 1.9417 - regression_loss: 1.5880 - classification_loss: 0.3537 311/500 [=================>............] - ETA: 47s - loss: 1.9429 - regression_loss: 1.5893 - classification_loss: 0.3536 312/500 [=================>............] - ETA: 47s - loss: 1.9434 - regression_loss: 1.5896 - classification_loss: 0.3538 313/500 [=================>............] - ETA: 46s - loss: 1.9452 - regression_loss: 1.5908 - classification_loss: 0.3544 314/500 [=================>............] - ETA: 46s - loss: 1.9462 - regression_loss: 1.5919 - classification_loss: 0.3543 315/500 [=================>............] - ETA: 46s - loss: 1.9456 - regression_loss: 1.5916 - classification_loss: 0.3540 316/500 [=================>............] - ETA: 46s - loss: 1.9494 - regression_loss: 1.5947 - classification_loss: 0.3547 317/500 [==================>...........] - ETA: 45s - loss: 1.9505 - regression_loss: 1.5954 - classification_loss: 0.3551 318/500 [==================>...........] - ETA: 45s - loss: 1.9490 - regression_loss: 1.5944 - classification_loss: 0.3546 319/500 [==================>...........] - ETA: 45s - loss: 1.9489 - regression_loss: 1.5944 - classification_loss: 0.3545 320/500 [==================>...........] - ETA: 45s - loss: 1.9519 - regression_loss: 1.5970 - classification_loss: 0.3549 321/500 [==================>...........] - ETA: 44s - loss: 1.9542 - regression_loss: 1.5995 - classification_loss: 0.3547 322/500 [==================>...........] - ETA: 44s - loss: 1.9539 - regression_loss: 1.5993 - classification_loss: 0.3546 323/500 [==================>...........] - ETA: 44s - loss: 1.9545 - regression_loss: 1.5999 - classification_loss: 0.3546 324/500 [==================>...........] 
- ETA: 44s - loss: 1.9561 - regression_loss: 1.6012 - classification_loss: 0.3549 325/500 [==================>...........] - ETA: 43s - loss: 1.9552 - regression_loss: 1.6005 - classification_loss: 0.3548 326/500 [==================>...........] - ETA: 43s - loss: 1.9537 - regression_loss: 1.5994 - classification_loss: 0.3543 327/500 [==================>...........] - ETA: 43s - loss: 1.9539 - regression_loss: 1.5997 - classification_loss: 0.3543 328/500 [==================>...........] - ETA: 43s - loss: 1.9502 - regression_loss: 1.5966 - classification_loss: 0.3536 329/500 [==================>...........] - ETA: 42s - loss: 1.9514 - regression_loss: 1.5966 - classification_loss: 0.3548 330/500 [==================>...........] - ETA: 42s - loss: 1.9516 - regression_loss: 1.5969 - classification_loss: 0.3547 331/500 [==================>...........] - ETA: 42s - loss: 1.9519 - regression_loss: 1.5972 - classification_loss: 0.3547 332/500 [==================>...........] - ETA: 42s - loss: 1.9493 - regression_loss: 1.5950 - classification_loss: 0.3543 333/500 [==================>...........] - ETA: 41s - loss: 1.9507 - regression_loss: 1.5963 - classification_loss: 0.3544 334/500 [===================>..........] - ETA: 41s - loss: 1.9516 - regression_loss: 1.5973 - classification_loss: 0.3542 335/500 [===================>..........] - ETA: 41s - loss: 1.9530 - regression_loss: 1.5986 - classification_loss: 0.3544 336/500 [===================>..........] - ETA: 41s - loss: 1.9530 - regression_loss: 1.5987 - classification_loss: 0.3543 337/500 [===================>..........] - ETA: 40s - loss: 1.9531 - regression_loss: 1.5986 - classification_loss: 0.3544 338/500 [===================>..........] - ETA: 40s - loss: 1.9525 - regression_loss: 1.5983 - classification_loss: 0.3542 339/500 [===================>..........] - ETA: 40s - loss: 1.9518 - regression_loss: 1.5979 - classification_loss: 0.3539 340/500 [===================>..........] 
- ETA: 40s - loss: 1.9533 - regression_loss: 1.5991 - classification_loss: 0.3542 341/500 [===================>..........] - ETA: 39s - loss: 1.9538 - regression_loss: 1.5994 - classification_loss: 0.3544 342/500 [===================>..........] - ETA: 39s - loss: 1.9547 - regression_loss: 1.6004 - classification_loss: 0.3544 343/500 [===================>..........] - ETA: 39s - loss: 1.9514 - regression_loss: 1.5977 - classification_loss: 0.3537 344/500 [===================>..........] - ETA: 39s - loss: 1.9541 - regression_loss: 1.5998 - classification_loss: 0.3543 345/500 [===================>..........] - ETA: 38s - loss: 1.9543 - regression_loss: 1.5997 - classification_loss: 0.3545 346/500 [===================>..........] - ETA: 38s - loss: 1.9558 - regression_loss: 1.6009 - classification_loss: 0.3549 347/500 [===================>..........] - ETA: 38s - loss: 1.9583 - regression_loss: 1.6030 - classification_loss: 0.3553 348/500 [===================>..........] - ETA: 38s - loss: 1.9573 - regression_loss: 1.6017 - classification_loss: 0.3556 349/500 [===================>..........] - ETA: 37s - loss: 1.9576 - regression_loss: 1.6020 - classification_loss: 0.3556 350/500 [====================>.........] - ETA: 37s - loss: 1.9579 - regression_loss: 1.6022 - classification_loss: 0.3556 351/500 [====================>.........] - ETA: 37s - loss: 1.9599 - regression_loss: 1.6039 - classification_loss: 0.3561 352/500 [====================>.........] - ETA: 37s - loss: 1.9616 - regression_loss: 1.6051 - classification_loss: 0.3565 353/500 [====================>.........] - ETA: 36s - loss: 1.9620 - regression_loss: 1.6055 - classification_loss: 0.3565 354/500 [====================>.........] - ETA: 36s - loss: 1.9686 - regression_loss: 1.6089 - classification_loss: 0.3597 355/500 [====================>.........] - ETA: 36s - loss: 1.9699 - regression_loss: 1.6089 - classification_loss: 0.3611 356/500 [====================>.........] 
- ETA: 36s - loss: 1.9702 - regression_loss: 1.6090 - classification_loss: 0.3612 357/500 [====================>.........] - ETA: 35s - loss: 1.9697 - regression_loss: 1.6086 - classification_loss: 0.3611 358/500 [====================>.........] - ETA: 35s - loss: 1.9699 - regression_loss: 1.6083 - classification_loss: 0.3616 359/500 [====================>.........] - ETA: 35s - loss: 1.9694 - regression_loss: 1.6079 - classification_loss: 0.3615 360/500 [====================>.........] - ETA: 35s - loss: 1.9702 - regression_loss: 1.6084 - classification_loss: 0.3617 361/500 [====================>.........] - ETA: 34s - loss: 1.9706 - regression_loss: 1.6089 - classification_loss: 0.3617 362/500 [====================>.........] - ETA: 34s - loss: 1.9704 - regression_loss: 1.6088 - classification_loss: 0.3616 363/500 [====================>.........] - ETA: 34s - loss: 1.9699 - regression_loss: 1.6085 - classification_loss: 0.3614 364/500 [====================>.........] - ETA: 34s - loss: 1.9698 - regression_loss: 1.6084 - classification_loss: 0.3614 365/500 [====================>.........] - ETA: 33s - loss: 1.9697 - regression_loss: 1.6084 - classification_loss: 0.3613 366/500 [====================>.........] - ETA: 33s - loss: 1.9700 - regression_loss: 1.6087 - classification_loss: 0.3613 367/500 [=====================>........] - ETA: 33s - loss: 1.9695 - regression_loss: 1.6084 - classification_loss: 0.3611 368/500 [=====================>........] - ETA: 33s - loss: 1.9702 - regression_loss: 1.6091 - classification_loss: 0.3610 369/500 [=====================>........] - ETA: 32s - loss: 1.9725 - regression_loss: 1.6108 - classification_loss: 0.3617 370/500 [=====================>........] - ETA: 32s - loss: 1.9728 - regression_loss: 1.6108 - classification_loss: 0.3620 371/500 [=====================>........] - ETA: 32s - loss: 1.9741 - regression_loss: 1.6117 - classification_loss: 0.3624 372/500 [=====================>........] 
- ETA: 32s - loss: 1.9736 - regression_loss: 1.6111 - classification_loss: 0.3625 373/500 [=====================>........] - ETA: 31s - loss: 1.9744 - regression_loss: 1.6119 - classification_loss: 0.3625 374/500 [=====================>........] - ETA: 31s - loss: 1.9764 - regression_loss: 1.6134 - classification_loss: 0.3630 375/500 [=====================>........] - ETA: 31s - loss: 1.9764 - regression_loss: 1.6134 - classification_loss: 0.3630 376/500 [=====================>........] - ETA: 31s - loss: 1.9752 - regression_loss: 1.6122 - classification_loss: 0.3630 377/500 [=====================>........] - ETA: 30s - loss: 1.9738 - regression_loss: 1.6111 - classification_loss: 0.3627 378/500 [=====================>........] - ETA: 30s - loss: 1.9739 - regression_loss: 1.6114 - classification_loss: 0.3625 379/500 [=====================>........] - ETA: 30s - loss: 1.9739 - regression_loss: 1.6115 - classification_loss: 0.3624 380/500 [=====================>........] - ETA: 30s - loss: 1.9743 - regression_loss: 1.6119 - classification_loss: 0.3624 381/500 [=====================>........] - ETA: 29s - loss: 1.9749 - regression_loss: 1.6125 - classification_loss: 0.3625 382/500 [=====================>........] - ETA: 29s - loss: 1.9758 - regression_loss: 1.6130 - classification_loss: 0.3627 383/500 [=====================>........] - ETA: 29s - loss: 1.9763 - regression_loss: 1.6135 - classification_loss: 0.3628 384/500 [======================>.......] - ETA: 29s - loss: 1.9759 - regression_loss: 1.6131 - classification_loss: 0.3627 385/500 [======================>.......] - ETA: 28s - loss: 1.9748 - regression_loss: 1.6123 - classification_loss: 0.3625 386/500 [======================>.......] - ETA: 28s - loss: 1.9743 - regression_loss: 1.6119 - classification_loss: 0.3624 387/500 [======================>.......] - ETA: 28s - loss: 1.9720 - regression_loss: 1.6102 - classification_loss: 0.3619 388/500 [======================>.......] 
- ETA: 28s - loss: 1.9713 - regression_loss: 1.6095 - classification_loss: 0.3617 389/500 [======================>.......] - ETA: 27s - loss: 1.9718 - regression_loss: 1.6099 - classification_loss: 0.3618 390/500 [======================>.......] - ETA: 27s - loss: 1.9713 - regression_loss: 1.6096 - classification_loss: 0.3617 391/500 [======================>.......] - ETA: 27s - loss: 1.9724 - regression_loss: 1.6103 - classification_loss: 0.3621 392/500 [======================>.......] - ETA: 27s - loss: 1.9729 - regression_loss: 1.6109 - classification_loss: 0.3620 393/500 [======================>.......] - ETA: 26s - loss: 1.9723 - regression_loss: 1.6106 - classification_loss: 0.3617 394/500 [======================>.......] - ETA: 26s - loss: 1.9728 - regression_loss: 1.6113 - classification_loss: 0.3615 395/500 [======================>.......] - ETA: 26s - loss: 1.9734 - regression_loss: 1.6120 - classification_loss: 0.3614 396/500 [======================>.......] - ETA: 26s - loss: 1.9728 - regression_loss: 1.6115 - classification_loss: 0.3612 397/500 [======================>.......] - ETA: 25s - loss: 1.9755 - regression_loss: 1.6137 - classification_loss: 0.3618 398/500 [======================>.......] - ETA: 25s - loss: 1.9752 - regression_loss: 1.6136 - classification_loss: 0.3616 399/500 [======================>.......] - ETA: 25s - loss: 1.9750 - regression_loss: 1.6135 - classification_loss: 0.3614 400/500 [=======================>......] - ETA: 25s - loss: 1.9753 - regression_loss: 1.6140 - classification_loss: 0.3613 401/500 [=======================>......] - ETA: 24s - loss: 1.9764 - regression_loss: 1.6150 - classification_loss: 0.3614 402/500 [=======================>......] - ETA: 24s - loss: 1.9763 - regression_loss: 1.6150 - classification_loss: 0.3613 403/500 [=======================>......] - ETA: 24s - loss: 1.9753 - regression_loss: 1.6142 - classification_loss: 0.3611 404/500 [=======================>......] 
- ETA: 24s - loss: 1.9745 - regression_loss: 1.6136 - classification_loss: 0.3609 405/500 [=======================>......] - ETA: 23s - loss: 1.9753 - regression_loss: 1.6142 - classification_loss: 0.3611 406/500 [=======================>......] - ETA: 23s - loss: 1.9756 - regression_loss: 1.6147 - classification_loss: 0.3608 407/500 [=======================>......] - ETA: 23s - loss: 1.9734 - regression_loss: 1.6130 - classification_loss: 0.3604 408/500 [=======================>......] - ETA: 23s - loss: 1.9726 - regression_loss: 1.6124 - classification_loss: 0.3603 409/500 [=======================>......] - ETA: 22s - loss: 1.9712 - regression_loss: 1.6111 - classification_loss: 0.3601 410/500 [=======================>......] - ETA: 22s - loss: 1.9721 - regression_loss: 1.6114 - classification_loss: 0.3607 411/500 [=======================>......] - ETA: 22s - loss: 1.9716 - regression_loss: 1.6112 - classification_loss: 0.3604 412/500 [=======================>......] - ETA: 22s - loss: 1.9709 - regression_loss: 1.6106 - classification_loss: 0.3603 413/500 [=======================>......] - ETA: 21s - loss: 1.9714 - regression_loss: 1.6111 - classification_loss: 0.3604 414/500 [=======================>......] - ETA: 21s - loss: 1.9705 - regression_loss: 1.6106 - classification_loss: 0.3599 415/500 [=======================>......] - ETA: 21s - loss: 1.9689 - regression_loss: 1.6091 - classification_loss: 0.3597 416/500 [=======================>......] - ETA: 21s - loss: 1.9682 - regression_loss: 1.6086 - classification_loss: 0.3596 417/500 [========================>.....] - ETA: 20s - loss: 1.9659 - regression_loss: 1.6068 - classification_loss: 0.3591 418/500 [========================>.....] - ETA: 20s - loss: 1.9662 - regression_loss: 1.6071 - classification_loss: 0.3591 419/500 [========================>.....] - ETA: 20s - loss: 1.9659 - regression_loss: 1.6068 - classification_loss: 0.3592 420/500 [========================>.....] 
500/500 [==============================] - 126s 251ms/step - loss: 1.9526 - regression_loss: 1.5976 - classification_loss: 0.3550
1172 instances of class plum with average precision: 0.5515
mAP: 0.5515
Epoch 00045: saving model to ./training/snapshots/resnet50_pascal_45.h5
Epoch 46/150
254/500 [==============>...............]
- ETA: 1:05 - loss: 1.9568 - regression_loss: 1.6053 - classification_loss: 0.3515 239/500 [=============>................] - ETA: 1:05 - loss: 1.9588 - regression_loss: 1.6070 - classification_loss: 0.3518 240/500 [=============>................] - ETA: 1:04 - loss: 1.9578 - regression_loss: 1.6062 - classification_loss: 0.3517 241/500 [=============>................] - ETA: 1:04 - loss: 1.9593 - regression_loss: 1.6075 - classification_loss: 0.3518 242/500 [=============>................] - ETA: 1:04 - loss: 1.9637 - regression_loss: 1.6089 - classification_loss: 0.3548 243/500 [=============>................] - ETA: 1:04 - loss: 1.9632 - regression_loss: 1.6088 - classification_loss: 0.3544 244/500 [=============>................] - ETA: 1:03 - loss: 1.9646 - regression_loss: 1.6096 - classification_loss: 0.3550 245/500 [=============>................] - ETA: 1:03 - loss: 1.9660 - regression_loss: 1.6106 - classification_loss: 0.3554 246/500 [=============>................] - ETA: 1:03 - loss: 1.9664 - regression_loss: 1.6110 - classification_loss: 0.3553 247/500 [=============>................] - ETA: 1:03 - loss: 1.9678 - regression_loss: 1.6120 - classification_loss: 0.3558 248/500 [=============>................] - ETA: 1:02 - loss: 1.9671 - regression_loss: 1.6113 - classification_loss: 0.3558 249/500 [=============>................] - ETA: 1:02 - loss: 1.9621 - regression_loss: 1.6076 - classification_loss: 0.3545 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9647 - regression_loss: 1.6098 - classification_loss: 0.3549 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9609 - regression_loss: 1.6064 - classification_loss: 0.3545 252/500 [==============>...............] - ETA: 1:01 - loss: 1.9623 - regression_loss: 1.6074 - classification_loss: 0.3549 253/500 [==============>...............] - ETA: 1:01 - loss: 1.9610 - regression_loss: 1.6061 - classification_loss: 0.3549 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.9606 - regression_loss: 1.6060 - classification_loss: 0.3546 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9619 - regression_loss: 1.6070 - classification_loss: 0.3550 256/500 [==============>...............] - ETA: 1:00 - loss: 1.9652 - regression_loss: 1.6093 - classification_loss: 0.3559 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9658 - regression_loss: 1.6094 - classification_loss: 0.3564 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9670 - regression_loss: 1.6103 - classification_loss: 0.3567 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9695 - regression_loss: 1.6127 - classification_loss: 0.3568 260/500 [==============>...............] - ETA: 59s - loss: 1.9697 - regression_loss: 1.6129 - classification_loss: 0.3568  261/500 [==============>...............] - ETA: 59s - loss: 1.9674 - regression_loss: 1.6108 - classification_loss: 0.3566 262/500 [==============>...............] - ETA: 59s - loss: 1.9673 - regression_loss: 1.6107 - classification_loss: 0.3566 263/500 [==============>...............] - ETA: 59s - loss: 1.9695 - regression_loss: 1.6126 - classification_loss: 0.3569 264/500 [==============>...............] - ETA: 58s - loss: 1.9720 - regression_loss: 1.6148 - classification_loss: 0.3573 265/500 [==============>...............] - ETA: 58s - loss: 1.9716 - regression_loss: 1.6146 - classification_loss: 0.3570 266/500 [==============>...............] - ETA: 58s - loss: 1.9704 - regression_loss: 1.6140 - classification_loss: 0.3565 267/500 [===============>..............] - ETA: 58s - loss: 1.9687 - regression_loss: 1.6126 - classification_loss: 0.3561 268/500 [===============>..............] - ETA: 57s - loss: 1.9686 - regression_loss: 1.6126 - classification_loss: 0.3560 269/500 [===============>..............] - ETA: 57s - loss: 1.9712 - regression_loss: 1.6152 - classification_loss: 0.3560 270/500 [===============>..............] 
- ETA: 57s - loss: 1.9713 - regression_loss: 1.6154 - classification_loss: 0.3559 271/500 [===============>..............] - ETA: 57s - loss: 1.9706 - regression_loss: 1.6150 - classification_loss: 0.3555 272/500 [===============>..............] - ETA: 56s - loss: 1.9695 - regression_loss: 1.6143 - classification_loss: 0.3552 273/500 [===============>..............] - ETA: 56s - loss: 1.9687 - regression_loss: 1.6137 - classification_loss: 0.3549 274/500 [===============>..............] - ETA: 56s - loss: 1.9669 - regression_loss: 1.6122 - classification_loss: 0.3547 275/500 [===============>..............] - ETA: 56s - loss: 1.9635 - regression_loss: 1.6093 - classification_loss: 0.3542 276/500 [===============>..............] - ETA: 56s - loss: 1.9612 - regression_loss: 1.6075 - classification_loss: 0.3536 277/500 [===============>..............] - ETA: 55s - loss: 1.9604 - regression_loss: 1.6068 - classification_loss: 0.3535 278/500 [===============>..............] - ETA: 55s - loss: 1.9594 - regression_loss: 1.6059 - classification_loss: 0.3535 279/500 [===============>..............] - ETA: 55s - loss: 1.9592 - regression_loss: 1.6058 - classification_loss: 0.3534 280/500 [===============>..............] - ETA: 55s - loss: 1.9603 - regression_loss: 1.6070 - classification_loss: 0.3533 281/500 [===============>..............] - ETA: 54s - loss: 1.9614 - regression_loss: 1.6080 - classification_loss: 0.3534 282/500 [===============>..............] - ETA: 54s - loss: 1.9637 - regression_loss: 1.6097 - classification_loss: 0.3540 283/500 [===============>..............] - ETA: 54s - loss: 1.9644 - regression_loss: 1.6104 - classification_loss: 0.3540 284/500 [================>.............] - ETA: 54s - loss: 1.9606 - regression_loss: 1.6075 - classification_loss: 0.3532 285/500 [================>.............] - ETA: 53s - loss: 1.9602 - regression_loss: 1.6071 - classification_loss: 0.3531 286/500 [================>.............] 
- ETA: 53s - loss: 1.9607 - regression_loss: 1.6077 - classification_loss: 0.3530 287/500 [================>.............] - ETA: 53s - loss: 1.9604 - regression_loss: 1.6075 - classification_loss: 0.3529 288/500 [================>.............] - ETA: 53s - loss: 1.9596 - regression_loss: 1.6070 - classification_loss: 0.3526 289/500 [================>.............] - ETA: 52s - loss: 1.9568 - regression_loss: 1.6048 - classification_loss: 0.3520 290/500 [================>.............] - ETA: 52s - loss: 1.9544 - regression_loss: 1.6028 - classification_loss: 0.3515 291/500 [================>.............] - ETA: 52s - loss: 1.9538 - regression_loss: 1.6024 - classification_loss: 0.3514 292/500 [================>.............] - ETA: 52s - loss: 1.9537 - regression_loss: 1.6024 - classification_loss: 0.3513 293/500 [================>.............] - ETA: 51s - loss: 1.9543 - regression_loss: 1.6032 - classification_loss: 0.3511 294/500 [================>.............] - ETA: 51s - loss: 1.9541 - regression_loss: 1.6031 - classification_loss: 0.3510 295/500 [================>.............] - ETA: 51s - loss: 1.9519 - regression_loss: 1.6013 - classification_loss: 0.3506 296/500 [================>.............] - ETA: 51s - loss: 1.9509 - regression_loss: 1.6007 - classification_loss: 0.3502 297/500 [================>.............] - ETA: 50s - loss: 1.9509 - regression_loss: 1.6007 - classification_loss: 0.3502 298/500 [================>.............] - ETA: 50s - loss: 1.9524 - regression_loss: 1.6019 - classification_loss: 0.3504 299/500 [================>.............] - ETA: 50s - loss: 1.9518 - regression_loss: 1.6014 - classification_loss: 0.3504 300/500 [=================>............] - ETA: 50s - loss: 1.9508 - regression_loss: 1.6005 - classification_loss: 0.3503 301/500 [=================>............] - ETA: 49s - loss: 1.9499 - regression_loss: 1.5999 - classification_loss: 0.3500 302/500 [=================>............] 
- ETA: 49s - loss: 1.9499 - regression_loss: 1.6000 - classification_loss: 0.3500 303/500 [=================>............] - ETA: 49s - loss: 1.9497 - regression_loss: 1.5990 - classification_loss: 0.3507 304/500 [=================>............] - ETA: 49s - loss: 1.9505 - regression_loss: 1.5994 - classification_loss: 0.3511 305/500 [=================>............] - ETA: 48s - loss: 1.9511 - regression_loss: 1.5997 - classification_loss: 0.3514 306/500 [=================>............] - ETA: 48s - loss: 1.9501 - regression_loss: 1.5989 - classification_loss: 0.3512 307/500 [=================>............] - ETA: 48s - loss: 1.9461 - regression_loss: 1.5954 - classification_loss: 0.3506 308/500 [=================>............] - ETA: 48s - loss: 1.9510 - regression_loss: 1.5981 - classification_loss: 0.3528 309/500 [=================>............] - ETA: 47s - loss: 1.9509 - regression_loss: 1.5981 - classification_loss: 0.3527 310/500 [=================>............] - ETA: 47s - loss: 1.9477 - regression_loss: 1.5958 - classification_loss: 0.3519 311/500 [=================>............] - ETA: 47s - loss: 1.9483 - regression_loss: 1.5960 - classification_loss: 0.3523 312/500 [=================>............] - ETA: 47s - loss: 1.9496 - regression_loss: 1.5975 - classification_loss: 0.3521 313/500 [=================>............] - ETA: 46s - loss: 1.9524 - regression_loss: 1.5994 - classification_loss: 0.3530 314/500 [=================>............] - ETA: 46s - loss: 1.9525 - regression_loss: 1.5990 - classification_loss: 0.3535 315/500 [=================>............] - ETA: 46s - loss: 1.9518 - regression_loss: 1.5985 - classification_loss: 0.3533 316/500 [=================>............] - ETA: 46s - loss: 1.9517 - regression_loss: 1.5984 - classification_loss: 0.3533 317/500 [==================>...........] - ETA: 45s - loss: 1.9518 - regression_loss: 1.5985 - classification_loss: 0.3533 318/500 [==================>...........] 
- ETA: 45s - loss: 1.9530 - regression_loss: 1.5994 - classification_loss: 0.3536 319/500 [==================>...........] - ETA: 45s - loss: 1.9529 - regression_loss: 1.5992 - classification_loss: 0.3537 320/500 [==================>...........] - ETA: 45s - loss: 1.9541 - regression_loss: 1.6000 - classification_loss: 0.3541 321/500 [==================>...........] - ETA: 44s - loss: 1.9519 - regression_loss: 1.5979 - classification_loss: 0.3540 322/500 [==================>...........] - ETA: 44s - loss: 1.9522 - regression_loss: 1.5981 - classification_loss: 0.3541 323/500 [==================>...........] - ETA: 44s - loss: 1.9512 - regression_loss: 1.5974 - classification_loss: 0.3538 324/500 [==================>...........] - ETA: 44s - loss: 1.9523 - regression_loss: 1.5984 - classification_loss: 0.3539 325/500 [==================>...........] - ETA: 43s - loss: 1.9526 - regression_loss: 1.5988 - classification_loss: 0.3538 326/500 [==================>...........] - ETA: 43s - loss: 1.9536 - regression_loss: 1.5995 - classification_loss: 0.3541 327/500 [==================>...........] - ETA: 43s - loss: 1.9550 - regression_loss: 1.6002 - classification_loss: 0.3548 328/500 [==================>...........] - ETA: 43s - loss: 1.9545 - regression_loss: 1.5998 - classification_loss: 0.3546 329/500 [==================>...........] - ETA: 42s - loss: 1.9544 - regression_loss: 1.6000 - classification_loss: 0.3544 330/500 [==================>...........] - ETA: 42s - loss: 1.9534 - regression_loss: 1.5993 - classification_loss: 0.3541 331/500 [==================>...........] - ETA: 42s - loss: 1.9529 - regression_loss: 1.5987 - classification_loss: 0.3543 332/500 [==================>...........] - ETA: 42s - loss: 1.9537 - regression_loss: 1.5992 - classification_loss: 0.3545 333/500 [==================>...........] - ETA: 41s - loss: 1.9546 - regression_loss: 1.5999 - classification_loss: 0.3547 334/500 [===================>..........] 
- ETA: 41s - loss: 1.9541 - regression_loss: 1.5995 - classification_loss: 0.3546 335/500 [===================>..........] - ETA: 41s - loss: 1.9529 - regression_loss: 1.5986 - classification_loss: 0.3544 336/500 [===================>..........] - ETA: 41s - loss: 1.9540 - regression_loss: 1.5994 - classification_loss: 0.3546 337/500 [===================>..........] - ETA: 40s - loss: 1.9532 - regression_loss: 1.5991 - classification_loss: 0.3542 338/500 [===================>..........] - ETA: 40s - loss: 1.9537 - regression_loss: 1.5995 - classification_loss: 0.3542 339/500 [===================>..........] - ETA: 40s - loss: 1.9536 - regression_loss: 1.5996 - classification_loss: 0.3540 340/500 [===================>..........] - ETA: 40s - loss: 1.9539 - regression_loss: 1.5998 - classification_loss: 0.3541 341/500 [===================>..........] - ETA: 39s - loss: 1.9541 - regression_loss: 1.6001 - classification_loss: 0.3541 342/500 [===================>..........] - ETA: 39s - loss: 1.9514 - regression_loss: 1.5976 - classification_loss: 0.3538 343/500 [===================>..........] - ETA: 39s - loss: 1.9527 - regression_loss: 1.5986 - classification_loss: 0.3541 344/500 [===================>..........] - ETA: 39s - loss: 1.9507 - regression_loss: 1.5969 - classification_loss: 0.3537 345/500 [===================>..........] - ETA: 38s - loss: 1.9515 - regression_loss: 1.5979 - classification_loss: 0.3536 346/500 [===================>..........] - ETA: 38s - loss: 1.9497 - regression_loss: 1.5958 - classification_loss: 0.3539 347/500 [===================>..........] - ETA: 38s - loss: 1.9491 - regression_loss: 1.5955 - classification_loss: 0.3536 348/500 [===================>..........] - ETA: 38s - loss: 1.9484 - regression_loss: 1.5950 - classification_loss: 0.3534 349/500 [===================>..........] - ETA: 37s - loss: 1.9464 - regression_loss: 1.5934 - classification_loss: 0.3531 350/500 [====================>.........] 
- ETA: 37s - loss: 1.9462 - regression_loss: 1.5933 - classification_loss: 0.3528 351/500 [====================>.........] - ETA: 37s - loss: 1.9463 - regression_loss: 1.5934 - classification_loss: 0.3528 352/500 [====================>.........] - ETA: 37s - loss: 1.9481 - regression_loss: 1.5947 - classification_loss: 0.3534 353/500 [====================>.........] - ETA: 36s - loss: 1.9492 - regression_loss: 1.5956 - classification_loss: 0.3537 354/500 [====================>.........] - ETA: 36s - loss: 1.9497 - regression_loss: 1.5961 - classification_loss: 0.3536 355/500 [====================>.........] - ETA: 36s - loss: 1.9485 - regression_loss: 1.5950 - classification_loss: 0.3536 356/500 [====================>.........] - ETA: 36s - loss: 1.9484 - regression_loss: 1.5951 - classification_loss: 0.3533 357/500 [====================>.........] - ETA: 35s - loss: 1.9493 - regression_loss: 1.5959 - classification_loss: 0.3535 358/500 [====================>.........] - ETA: 35s - loss: 1.9491 - regression_loss: 1.5957 - classification_loss: 0.3534 359/500 [====================>.........] - ETA: 35s - loss: 1.9483 - regression_loss: 1.5950 - classification_loss: 0.3533 360/500 [====================>.........] - ETA: 35s - loss: 1.9488 - regression_loss: 1.5955 - classification_loss: 0.3533 361/500 [====================>.........] - ETA: 34s - loss: 1.9468 - regression_loss: 1.5941 - classification_loss: 0.3527 362/500 [====================>.........] - ETA: 34s - loss: 1.9487 - regression_loss: 1.5953 - classification_loss: 0.3534 363/500 [====================>.........] - ETA: 34s - loss: 1.9475 - regression_loss: 1.5944 - classification_loss: 0.3531 364/500 [====================>.........] - ETA: 34s - loss: 1.9476 - regression_loss: 1.5944 - classification_loss: 0.3532 365/500 [====================>.........] - ETA: 33s - loss: 1.9462 - regression_loss: 1.5933 - classification_loss: 0.3529 366/500 [====================>.........] 
- ETA: 33s - loss: 1.9482 - regression_loss: 1.5949 - classification_loss: 0.3533 367/500 [=====================>........] - ETA: 33s - loss: 1.9486 - regression_loss: 1.5954 - classification_loss: 0.3532 368/500 [=====================>........] - ETA: 33s - loss: 1.9502 - regression_loss: 1.5966 - classification_loss: 0.3536 369/500 [=====================>........] - ETA: 32s - loss: 1.9528 - regression_loss: 1.5985 - classification_loss: 0.3544 370/500 [=====================>........] - ETA: 32s - loss: 1.9563 - regression_loss: 1.5996 - classification_loss: 0.3567 371/500 [=====================>........] - ETA: 32s - loss: 1.9578 - regression_loss: 1.6005 - classification_loss: 0.3573 372/500 [=====================>........] - ETA: 32s - loss: 1.9578 - regression_loss: 1.6004 - classification_loss: 0.3573 373/500 [=====================>........] - ETA: 31s - loss: 1.9559 - regression_loss: 1.5990 - classification_loss: 0.3569 374/500 [=====================>........] - ETA: 31s - loss: 1.9548 - regression_loss: 1.5982 - classification_loss: 0.3566 375/500 [=====================>........] - ETA: 31s - loss: 1.9550 - regression_loss: 1.5979 - classification_loss: 0.3570 376/500 [=====================>........] - ETA: 31s - loss: 1.9541 - regression_loss: 1.5970 - classification_loss: 0.3572 377/500 [=====================>........] - ETA: 30s - loss: 1.9538 - regression_loss: 1.5966 - classification_loss: 0.3571 378/500 [=====================>........] - ETA: 30s - loss: 1.9532 - regression_loss: 1.5961 - classification_loss: 0.3571 379/500 [=====================>........] - ETA: 30s - loss: 1.9544 - regression_loss: 1.5973 - classification_loss: 0.3572 380/500 [=====================>........] - ETA: 30s - loss: 1.9550 - regression_loss: 1.5977 - classification_loss: 0.3573 381/500 [=====================>........] - ETA: 29s - loss: 1.9522 - regression_loss: 1.5953 - classification_loss: 0.3569 382/500 [=====================>........] 
- ETA: 29s - loss: 1.9515 - regression_loss: 1.5951 - classification_loss: 0.3564 383/500 [=====================>........] - ETA: 29s - loss: 1.9522 - regression_loss: 1.5953 - classification_loss: 0.3569 384/500 [======================>.......] - ETA: 28s - loss: 1.9529 - regression_loss: 1.5958 - classification_loss: 0.3572 385/500 [======================>.......] - ETA: 28s - loss: 1.9537 - regression_loss: 1.5962 - classification_loss: 0.3575 386/500 [======================>.......] - ETA: 28s - loss: 1.9535 - regression_loss: 1.5962 - classification_loss: 0.3573 387/500 [======================>.......] - ETA: 28s - loss: 1.9527 - regression_loss: 1.5954 - classification_loss: 0.3573 388/500 [======================>.......] - ETA: 27s - loss: 1.9539 - regression_loss: 1.5965 - classification_loss: 0.3575 389/500 [======================>.......] - ETA: 27s - loss: 1.9563 - regression_loss: 1.5987 - classification_loss: 0.3577 390/500 [======================>.......] - ETA: 27s - loss: 1.9567 - regression_loss: 1.5990 - classification_loss: 0.3577 391/500 [======================>.......] - ETA: 27s - loss: 1.9566 - regression_loss: 1.5991 - classification_loss: 0.3575 392/500 [======================>.......] - ETA: 26s - loss: 1.9562 - regression_loss: 1.5988 - classification_loss: 0.3574 393/500 [======================>.......] - ETA: 26s - loss: 1.9569 - regression_loss: 1.5996 - classification_loss: 0.3573 394/500 [======================>.......] - ETA: 26s - loss: 1.9562 - regression_loss: 1.5992 - classification_loss: 0.3570 395/500 [======================>.......] - ETA: 26s - loss: 1.9563 - regression_loss: 1.5992 - classification_loss: 0.3571 396/500 [======================>.......] - ETA: 25s - loss: 1.9564 - regression_loss: 1.5994 - classification_loss: 0.3570 397/500 [======================>.......] - ETA: 25s - loss: 1.9546 - regression_loss: 1.5980 - classification_loss: 0.3567 398/500 [======================>.......] 
- ETA: 25s - loss: 1.9529 - regression_loss: 1.5966 - classification_loss: 0.3563 399/500 [======================>.......] - ETA: 25s - loss: 1.9533 - regression_loss: 1.5970 - classification_loss: 0.3563 400/500 [=======================>......] - ETA: 24s - loss: 1.9543 - regression_loss: 1.5976 - classification_loss: 0.3567 401/500 [=======================>......] - ETA: 24s - loss: 1.9535 - regression_loss: 1.5970 - classification_loss: 0.3565 402/500 [=======================>......] - ETA: 24s - loss: 1.9535 - regression_loss: 1.5970 - classification_loss: 0.3564 403/500 [=======================>......] - ETA: 24s - loss: 1.9534 - regression_loss: 1.5972 - classification_loss: 0.3562 404/500 [=======================>......] - ETA: 23s - loss: 1.9551 - regression_loss: 1.5987 - classification_loss: 0.3564 405/500 [=======================>......] - ETA: 23s - loss: 1.9547 - regression_loss: 1.5984 - classification_loss: 0.3563 406/500 [=======================>......] - ETA: 23s - loss: 1.9551 - regression_loss: 1.5988 - classification_loss: 0.3563 407/500 [=======================>......] - ETA: 23s - loss: 1.9568 - regression_loss: 1.6008 - classification_loss: 0.3560 408/500 [=======================>......] - ETA: 22s - loss: 1.9579 - regression_loss: 1.6016 - classification_loss: 0.3564 409/500 [=======================>......] - ETA: 22s - loss: 1.9568 - regression_loss: 1.6007 - classification_loss: 0.3561 410/500 [=======================>......] - ETA: 22s - loss: 1.9573 - regression_loss: 1.6012 - classification_loss: 0.3560 411/500 [=======================>......] - ETA: 22s - loss: 1.9558 - regression_loss: 1.5998 - classification_loss: 0.3560 412/500 [=======================>......] - ETA: 21s - loss: 1.9563 - regression_loss: 1.6001 - classification_loss: 0.3561 413/500 [=======================>......] - ETA: 21s - loss: 1.9543 - regression_loss: 1.5985 - classification_loss: 0.3558 414/500 [=======================>......] 
- ETA: 21s - loss: 1.9537 - regression_loss: 1.5981 - classification_loss: 0.3556 415/500 [=======================>......] - ETA: 21s - loss: 1.9529 - regression_loss: 1.5976 - classification_loss: 0.3554 416/500 [=======================>......] - ETA: 20s - loss: 1.9525 - regression_loss: 1.5973 - classification_loss: 0.3552 417/500 [========================>.....] - ETA: 20s - loss: 1.9521 - regression_loss: 1.5969 - classification_loss: 0.3551 418/500 [========================>.....] - ETA: 20s - loss: 1.9514 - regression_loss: 1.5965 - classification_loss: 0.3549 419/500 [========================>.....] - ETA: 20s - loss: 1.9529 - regression_loss: 1.5979 - classification_loss: 0.3550 420/500 [========================>.....] - ETA: 19s - loss: 1.9541 - regression_loss: 1.5988 - classification_loss: 0.3553 421/500 [========================>.....] - ETA: 19s - loss: 1.9523 - regression_loss: 1.5973 - classification_loss: 0.3550 422/500 [========================>.....] - ETA: 19s - loss: 1.9519 - regression_loss: 1.5971 - classification_loss: 0.3548 423/500 [========================>.....] - ETA: 19s - loss: 1.9514 - regression_loss: 1.5965 - classification_loss: 0.3548 424/500 [========================>.....] - ETA: 19s - loss: 1.9509 - regression_loss: 1.5963 - classification_loss: 0.3546 425/500 [========================>.....] - ETA: 18s - loss: 1.9509 - regression_loss: 1.5964 - classification_loss: 0.3545 426/500 [========================>.....] - ETA: 18s - loss: 1.9510 - regression_loss: 1.5966 - classification_loss: 0.3544 427/500 [========================>.....] - ETA: 18s - loss: 1.9496 - regression_loss: 1.5955 - classification_loss: 0.3540 428/500 [========================>.....] - ETA: 18s - loss: 1.9481 - regression_loss: 1.5944 - classification_loss: 0.3537 429/500 [========================>.....] - ETA: 17s - loss: 1.9483 - regression_loss: 1.5948 - classification_loss: 0.3535 430/500 [========================>.....] 
- ETA: 17s - loss: 1.9487 - regression_loss: 1.5951 - classification_loss: 0.3536 431/500 [========================>.....] - ETA: 17s - loss: 1.9492 - regression_loss: 1.5956 - classification_loss: 0.3536 432/500 [========================>.....] - ETA: 17s - loss: 1.9481 - regression_loss: 1.5947 - classification_loss: 0.3534 433/500 [========================>.....] - ETA: 16s - loss: 1.9489 - regression_loss: 1.5955 - classification_loss: 0.3534 434/500 [=========================>....] - ETA: 16s - loss: 1.9486 - regression_loss: 1.5954 - classification_loss: 0.3532 435/500 [=========================>....] - ETA: 16s - loss: 1.9478 - regression_loss: 1.5947 - classification_loss: 0.3531 436/500 [=========================>....] - ETA: 16s - loss: 1.9468 - regression_loss: 1.5942 - classification_loss: 0.3526 437/500 [=========================>....] - ETA: 15s - loss: 1.9477 - regression_loss: 1.5946 - classification_loss: 0.3531 438/500 [=========================>....] - ETA: 15s - loss: 1.9478 - regression_loss: 1.5947 - classification_loss: 0.3530 439/500 [=========================>....] - ETA: 15s - loss: 1.9473 - regression_loss: 1.5946 - classification_loss: 0.3527 440/500 [=========================>....] - ETA: 15s - loss: 1.9460 - regression_loss: 1.5936 - classification_loss: 0.3524 441/500 [=========================>....] - ETA: 14s - loss: 1.9444 - regression_loss: 1.5923 - classification_loss: 0.3521 442/500 [=========================>....] - ETA: 14s - loss: 1.9439 - regression_loss: 1.5919 - classification_loss: 0.3520 443/500 [=========================>....] - ETA: 14s - loss: 1.9424 - regression_loss: 1.5907 - classification_loss: 0.3517 444/500 [=========================>....] - ETA: 14s - loss: 1.9431 - regression_loss: 1.5912 - classification_loss: 0.3519 445/500 [=========================>....] - ETA: 13s - loss: 1.9435 - regression_loss: 1.5916 - classification_loss: 0.3519 446/500 [=========================>....] 
- ETA: 13s - loss: 1.9435 - regression_loss: 1.5915 - classification_loss: 0.3520 447/500 [=========================>....] - ETA: 13s - loss: 1.9455 - regression_loss: 1.5933 - classification_loss: 0.3523 448/500 [=========================>....] - ETA: 13s - loss: 1.9457 - regression_loss: 1.5933 - classification_loss: 0.3524 449/500 [=========================>....] - ETA: 12s - loss: 1.9471 - regression_loss: 1.5946 - classification_loss: 0.3525 450/500 [==========================>...] - ETA: 12s - loss: 1.9471 - regression_loss: 1.5947 - classification_loss: 0.3524 451/500 [==========================>...] - ETA: 12s - loss: 1.9469 - regression_loss: 1.5946 - classification_loss: 0.3523 452/500 [==========================>...] - ETA: 12s - loss: 1.9485 - regression_loss: 1.5959 - classification_loss: 0.3525 453/500 [==========================>...] - ETA: 11s - loss: 1.9470 - regression_loss: 1.5948 - classification_loss: 0.3522 454/500 [==========================>...] - ETA: 11s - loss: 1.9471 - regression_loss: 1.5950 - classification_loss: 0.3522 455/500 [==========================>...] - ETA: 11s - loss: 1.9471 - regression_loss: 1.5949 - classification_loss: 0.3522 456/500 [==========================>...] - ETA: 11s - loss: 1.9474 - regression_loss: 1.5952 - classification_loss: 0.3522 457/500 [==========================>...] - ETA: 10s - loss: 1.9498 - regression_loss: 1.5968 - classification_loss: 0.3530 458/500 [==========================>...] - ETA: 10s - loss: 1.9494 - regression_loss: 1.5966 - classification_loss: 0.3528 459/500 [==========================>...] - ETA: 10s - loss: 1.9471 - regression_loss: 1.5947 - classification_loss: 0.3524 460/500 [==========================>...] - ETA: 10s - loss: 1.9478 - regression_loss: 1.5953 - classification_loss: 0.3525 461/500 [==========================>...] - ETA: 9s - loss: 1.9484 - regression_loss: 1.5958 - classification_loss: 0.3526  462/500 [==========================>...] 
- ETA: 9s - loss: 1.9484 - regression_loss: 1.5958 - classification_loss: 0.3526
[per-batch progress updates for steps 463-499 of epoch 46 omitted; loss held steady around 1.94-1.95, classification_loss around 0.351-0.353]
500/500 [==============================] - 125s 250ms/step - loss: 1.9492 - regression_loss: 1.5961 - classification_loss: 0.3530
1172 instances of class plum with average precision: 0.5631
mAP: 0.5631
Epoch 00046: saving model to ./training/snapshots/resnet50_pascal_46.h5
Epoch 47/150
1/500 [..............................] - ETA: 1:59 - loss: 2.7643 - regression_loss: 2.1852 - classification_loss: 0.5791
[per-batch progress updates for steps 2-296 of epoch 47 omitted; the running mean loss fell from ~2.76 at step 1 to ~1.93 by step 296 as more batches were averaged in]
297/500 [================>.............]
- ETA: 50s - loss: 1.9293 - regression_loss: 1.5809 - classification_loss: 0.3484 298/500 [================>.............] - ETA: 50s - loss: 1.9277 - regression_loss: 1.5797 - classification_loss: 0.3480 299/500 [================>.............] - ETA: 50s - loss: 1.9288 - regression_loss: 1.5803 - classification_loss: 0.3485 300/500 [=================>............] - ETA: 50s - loss: 1.9282 - regression_loss: 1.5797 - classification_loss: 0.3485 301/500 [=================>............] - ETA: 49s - loss: 1.9292 - regression_loss: 1.5807 - classification_loss: 0.3485 302/500 [=================>............] - ETA: 49s - loss: 1.9312 - regression_loss: 1.5821 - classification_loss: 0.3491 303/500 [=================>............] - ETA: 49s - loss: 1.9330 - regression_loss: 1.5832 - classification_loss: 0.3498 304/500 [=================>............] - ETA: 49s - loss: 1.9328 - regression_loss: 1.5832 - classification_loss: 0.3496 305/500 [=================>............] - ETA: 48s - loss: 1.9333 - regression_loss: 1.5835 - classification_loss: 0.3499 306/500 [=================>............] - ETA: 48s - loss: 1.9339 - regression_loss: 1.5838 - classification_loss: 0.3501 307/500 [=================>............] - ETA: 48s - loss: 1.9359 - regression_loss: 1.5854 - classification_loss: 0.3505 308/500 [=================>............] - ETA: 48s - loss: 1.9359 - regression_loss: 1.5855 - classification_loss: 0.3504 309/500 [=================>............] - ETA: 47s - loss: 1.9327 - regression_loss: 1.5831 - classification_loss: 0.3497 310/500 [=================>............] - ETA: 47s - loss: 1.9328 - regression_loss: 1.5832 - classification_loss: 0.3496 311/500 [=================>............] - ETA: 47s - loss: 1.9322 - regression_loss: 1.5825 - classification_loss: 0.3497 312/500 [=================>............] - ETA: 47s - loss: 1.9327 - regression_loss: 1.5823 - classification_loss: 0.3504 313/500 [=================>............] 
- ETA: 46s - loss: 1.9333 - regression_loss: 1.5829 - classification_loss: 0.3505 314/500 [=================>............] - ETA: 46s - loss: 1.9330 - regression_loss: 1.5827 - classification_loss: 0.3503 315/500 [=================>............] - ETA: 46s - loss: 1.9352 - regression_loss: 1.5844 - classification_loss: 0.3508 316/500 [=================>............] - ETA: 46s - loss: 1.9336 - regression_loss: 1.5826 - classification_loss: 0.3510 317/500 [==================>...........] - ETA: 45s - loss: 1.9314 - regression_loss: 1.5807 - classification_loss: 0.3507 318/500 [==================>...........] - ETA: 45s - loss: 1.9291 - regression_loss: 1.5786 - classification_loss: 0.3505 319/500 [==================>...........] - ETA: 45s - loss: 1.9297 - regression_loss: 1.5791 - classification_loss: 0.3506 320/500 [==================>...........] - ETA: 45s - loss: 1.9295 - regression_loss: 1.5790 - classification_loss: 0.3505 321/500 [==================>...........] - ETA: 44s - loss: 1.9302 - regression_loss: 1.5796 - classification_loss: 0.3505 322/500 [==================>...........] - ETA: 44s - loss: 1.9284 - regression_loss: 1.5785 - classification_loss: 0.3500 323/500 [==================>...........] - ETA: 44s - loss: 1.9294 - regression_loss: 1.5793 - classification_loss: 0.3501 324/500 [==================>...........] - ETA: 44s - loss: 1.9282 - regression_loss: 1.5784 - classification_loss: 0.3497 325/500 [==================>...........] - ETA: 43s - loss: 1.9244 - regression_loss: 1.5754 - classification_loss: 0.3490 326/500 [==================>...........] - ETA: 43s - loss: 1.9261 - regression_loss: 1.5765 - classification_loss: 0.3497 327/500 [==================>...........] - ETA: 43s - loss: 1.9249 - regression_loss: 1.5756 - classification_loss: 0.3493 328/500 [==================>...........] - ETA: 43s - loss: 1.9265 - regression_loss: 1.5769 - classification_loss: 0.3496 329/500 [==================>...........] 
- ETA: 42s - loss: 1.9270 - regression_loss: 1.5774 - classification_loss: 0.3496 330/500 [==================>...........] - ETA: 42s - loss: 1.9261 - regression_loss: 1.5769 - classification_loss: 0.3492 331/500 [==================>...........] - ETA: 42s - loss: 1.9245 - regression_loss: 1.5758 - classification_loss: 0.3488 332/500 [==================>...........] - ETA: 42s - loss: 1.9243 - regression_loss: 1.5758 - classification_loss: 0.3485 333/500 [==================>...........] - ETA: 41s - loss: 1.9203 - regression_loss: 1.5726 - classification_loss: 0.3476 334/500 [===================>..........] - ETA: 41s - loss: 1.9230 - regression_loss: 1.5747 - classification_loss: 0.3483 335/500 [===================>..........] - ETA: 41s - loss: 1.9233 - regression_loss: 1.5750 - classification_loss: 0.3482 336/500 [===================>..........] - ETA: 41s - loss: 1.9235 - regression_loss: 1.5752 - classification_loss: 0.3483 337/500 [===================>..........] - ETA: 40s - loss: 1.9235 - regression_loss: 1.5754 - classification_loss: 0.3481 338/500 [===================>..........] - ETA: 40s - loss: 1.9258 - regression_loss: 1.5774 - classification_loss: 0.3483 339/500 [===================>..........] - ETA: 40s - loss: 1.9242 - regression_loss: 1.5764 - classification_loss: 0.3478 340/500 [===================>..........] - ETA: 40s - loss: 1.9254 - regression_loss: 1.5774 - classification_loss: 0.3479 341/500 [===================>..........] - ETA: 39s - loss: 1.9253 - regression_loss: 1.5774 - classification_loss: 0.3479 342/500 [===================>..........] - ETA: 39s - loss: 1.9254 - regression_loss: 1.5775 - classification_loss: 0.3478 343/500 [===================>..........] - ETA: 39s - loss: 1.9256 - regression_loss: 1.5780 - classification_loss: 0.3477 344/500 [===================>..........] - ETA: 39s - loss: 1.9252 - regression_loss: 1.5777 - classification_loss: 0.3474 345/500 [===================>..........] 
- ETA: 38s - loss: 1.9259 - regression_loss: 1.5784 - classification_loss: 0.3476 346/500 [===================>..........] - ETA: 38s - loss: 1.9258 - regression_loss: 1.5781 - classification_loss: 0.3476 347/500 [===================>..........] - ETA: 38s - loss: 1.9267 - regression_loss: 1.5787 - classification_loss: 0.3480 348/500 [===================>..........] - ETA: 38s - loss: 1.9299 - regression_loss: 1.5813 - classification_loss: 0.3486 349/500 [===================>..........] - ETA: 37s - loss: 1.9299 - regression_loss: 1.5813 - classification_loss: 0.3486 350/500 [====================>.........] - ETA: 37s - loss: 1.9290 - regression_loss: 1.5808 - classification_loss: 0.3483 351/500 [====================>.........] - ETA: 37s - loss: 1.9295 - regression_loss: 1.5811 - classification_loss: 0.3484 352/500 [====================>.........] - ETA: 37s - loss: 1.9293 - regression_loss: 1.5811 - classification_loss: 0.3481 353/500 [====================>.........] - ETA: 36s - loss: 1.9302 - regression_loss: 1.5822 - classification_loss: 0.3480 354/500 [====================>.........] - ETA: 36s - loss: 1.9305 - regression_loss: 1.5826 - classification_loss: 0.3479 355/500 [====================>.........] - ETA: 36s - loss: 1.9309 - regression_loss: 1.5829 - classification_loss: 0.3480 356/500 [====================>.........] - ETA: 36s - loss: 1.9309 - regression_loss: 1.5831 - classification_loss: 0.3479 357/500 [====================>.........] - ETA: 35s - loss: 1.9321 - regression_loss: 1.5840 - classification_loss: 0.3481 358/500 [====================>.........] - ETA: 35s - loss: 1.9331 - regression_loss: 1.5847 - classification_loss: 0.3484 359/500 [====================>.........] - ETA: 35s - loss: 1.9350 - regression_loss: 1.5862 - classification_loss: 0.3488 360/500 [====================>.........] - ETA: 35s - loss: 1.9354 - regression_loss: 1.5866 - classification_loss: 0.3489 361/500 [====================>.........] 
- ETA: 34s - loss: 1.9358 - regression_loss: 1.5870 - classification_loss: 0.3488 362/500 [====================>.........] - ETA: 34s - loss: 1.9370 - regression_loss: 1.5881 - classification_loss: 0.3489 363/500 [====================>.........] - ETA: 34s - loss: 1.9362 - regression_loss: 1.5876 - classification_loss: 0.3486 364/500 [====================>.........] - ETA: 34s - loss: 1.9333 - regression_loss: 1.5853 - classification_loss: 0.3480 365/500 [====================>.........] - ETA: 33s - loss: 1.9356 - regression_loss: 1.5869 - classification_loss: 0.3487 366/500 [====================>.........] - ETA: 33s - loss: 1.9353 - regression_loss: 1.5869 - classification_loss: 0.3484 367/500 [=====================>........] - ETA: 33s - loss: 1.9358 - regression_loss: 1.5874 - classification_loss: 0.3485 368/500 [=====================>........] - ETA: 33s - loss: 1.9352 - regression_loss: 1.5868 - classification_loss: 0.3484 369/500 [=====================>........] - ETA: 32s - loss: 1.9348 - regression_loss: 1.5860 - classification_loss: 0.3488 370/500 [=====================>........] - ETA: 32s - loss: 1.9351 - regression_loss: 1.5864 - classification_loss: 0.3487 371/500 [=====================>........] - ETA: 32s - loss: 1.9344 - regression_loss: 1.5858 - classification_loss: 0.3485 372/500 [=====================>........] - ETA: 32s - loss: 1.9340 - regression_loss: 1.5857 - classification_loss: 0.3484 373/500 [=====================>........] - ETA: 31s - loss: 1.9322 - regression_loss: 1.5841 - classification_loss: 0.3481 374/500 [=====================>........] - ETA: 31s - loss: 1.9319 - regression_loss: 1.5839 - classification_loss: 0.3479 375/500 [=====================>........] - ETA: 31s - loss: 1.9321 - regression_loss: 1.5834 - classification_loss: 0.3487 376/500 [=====================>........] - ETA: 31s - loss: 1.9333 - regression_loss: 1.5845 - classification_loss: 0.3488 377/500 [=====================>........] 
- ETA: 30s - loss: 1.9328 - regression_loss: 1.5841 - classification_loss: 0.3488 378/500 [=====================>........] - ETA: 30s - loss: 1.9351 - regression_loss: 1.5863 - classification_loss: 0.3489 379/500 [=====================>........] - ETA: 30s - loss: 1.9346 - regression_loss: 1.5860 - classification_loss: 0.3487 380/500 [=====================>........] - ETA: 30s - loss: 1.9321 - regression_loss: 1.5840 - classification_loss: 0.3481 381/500 [=====================>........] - ETA: 29s - loss: 1.9330 - regression_loss: 1.5848 - classification_loss: 0.3482 382/500 [=====================>........] - ETA: 29s - loss: 1.9333 - regression_loss: 1.5851 - classification_loss: 0.3482 383/500 [=====================>........] - ETA: 29s - loss: 1.9331 - regression_loss: 1.5850 - classification_loss: 0.3481 384/500 [======================>.......] - ETA: 29s - loss: 1.9331 - regression_loss: 1.5851 - classification_loss: 0.3481 385/500 [======================>.......] - ETA: 28s - loss: 1.9335 - regression_loss: 1.5854 - classification_loss: 0.3480 386/500 [======================>.......] - ETA: 28s - loss: 1.9336 - regression_loss: 1.5855 - classification_loss: 0.3480 387/500 [======================>.......] - ETA: 28s - loss: 1.9345 - regression_loss: 1.5862 - classification_loss: 0.3484 388/500 [======================>.......] - ETA: 28s - loss: 1.9361 - regression_loss: 1.5875 - classification_loss: 0.3486 389/500 [======================>.......] - ETA: 27s - loss: 1.9382 - regression_loss: 1.5892 - classification_loss: 0.3490 390/500 [======================>.......] - ETA: 27s - loss: 1.9382 - regression_loss: 1.5891 - classification_loss: 0.3491 391/500 [======================>.......] - ETA: 27s - loss: 1.9395 - regression_loss: 1.5902 - classification_loss: 0.3493 392/500 [======================>.......] - ETA: 27s - loss: 1.9399 - regression_loss: 1.5908 - classification_loss: 0.3491 393/500 [======================>.......] 
- ETA: 26s - loss: 1.9370 - regression_loss: 1.5885 - classification_loss: 0.3485 394/500 [======================>.......] - ETA: 26s - loss: 1.9368 - regression_loss: 1.5884 - classification_loss: 0.3484 395/500 [======================>.......] - ETA: 26s - loss: 1.9371 - regression_loss: 1.5888 - classification_loss: 0.3483 396/500 [======================>.......] - ETA: 26s - loss: 1.9368 - regression_loss: 1.5885 - classification_loss: 0.3483 397/500 [======================>.......] - ETA: 25s - loss: 1.9384 - regression_loss: 1.5898 - classification_loss: 0.3486 398/500 [======================>.......] - ETA: 25s - loss: 1.9395 - regression_loss: 1.5903 - classification_loss: 0.3492 399/500 [======================>.......] - ETA: 25s - loss: 1.9400 - regression_loss: 1.5909 - classification_loss: 0.3491 400/500 [=======================>......] - ETA: 25s - loss: 1.9418 - regression_loss: 1.5911 - classification_loss: 0.3507 401/500 [=======================>......] - ETA: 24s - loss: 1.9431 - regression_loss: 1.5922 - classification_loss: 0.3509 402/500 [=======================>......] - ETA: 24s - loss: 1.9448 - regression_loss: 1.5937 - classification_loss: 0.3511 403/500 [=======================>......] - ETA: 24s - loss: 1.9446 - regression_loss: 1.5937 - classification_loss: 0.3509 404/500 [=======================>......] - ETA: 24s - loss: 1.9441 - regression_loss: 1.5934 - classification_loss: 0.3507 405/500 [=======================>......] - ETA: 23s - loss: 1.9422 - regression_loss: 1.5919 - classification_loss: 0.3503 406/500 [=======================>......] - ETA: 23s - loss: 1.9423 - regression_loss: 1.5922 - classification_loss: 0.3502 407/500 [=======================>......] - ETA: 23s - loss: 1.9420 - regression_loss: 1.5916 - classification_loss: 0.3504 408/500 [=======================>......] - ETA: 23s - loss: 1.9424 - regression_loss: 1.5919 - classification_loss: 0.3505 409/500 [=======================>......] 
- ETA: 22s - loss: 1.9428 - regression_loss: 1.5925 - classification_loss: 0.3504 410/500 [=======================>......] - ETA: 22s - loss: 1.9425 - regression_loss: 1.5922 - classification_loss: 0.3504 411/500 [=======================>......] - ETA: 22s - loss: 1.9420 - regression_loss: 1.5920 - classification_loss: 0.3500 412/500 [=======================>......] - ETA: 22s - loss: 1.9418 - regression_loss: 1.5919 - classification_loss: 0.3498 413/500 [=======================>......] - ETA: 21s - loss: 1.9427 - regression_loss: 1.5928 - classification_loss: 0.3499 414/500 [=======================>......] - ETA: 21s - loss: 1.9427 - regression_loss: 1.5928 - classification_loss: 0.3498 415/500 [=======================>......] - ETA: 21s - loss: 1.9420 - regression_loss: 1.5922 - classification_loss: 0.3498 416/500 [=======================>......] - ETA: 21s - loss: 1.9435 - regression_loss: 1.5932 - classification_loss: 0.3503 417/500 [========================>.....] - ETA: 20s - loss: 1.9437 - regression_loss: 1.5934 - classification_loss: 0.3504 418/500 [========================>.....] - ETA: 20s - loss: 1.9439 - regression_loss: 1.5936 - classification_loss: 0.3503 419/500 [========================>.....] - ETA: 20s - loss: 1.9446 - regression_loss: 1.5945 - classification_loss: 0.3501 420/500 [========================>.....] - ETA: 20s - loss: 1.9440 - regression_loss: 1.5940 - classification_loss: 0.3499 421/500 [========================>.....] - ETA: 19s - loss: 1.9447 - regression_loss: 1.5942 - classification_loss: 0.3505 422/500 [========================>.....] - ETA: 19s - loss: 1.9445 - regression_loss: 1.5942 - classification_loss: 0.3503 423/500 [========================>.....] - ETA: 19s - loss: 1.9412 - regression_loss: 1.5913 - classification_loss: 0.3499 424/500 [========================>.....] - ETA: 19s - loss: 1.9421 - regression_loss: 1.5920 - classification_loss: 0.3501 425/500 [========================>.....] 
- ETA: 18s - loss: 1.9417 - regression_loss: 1.5918 - classification_loss: 0.3499 426/500 [========================>.....] - ETA: 18s - loss: 1.9421 - regression_loss: 1.5920 - classification_loss: 0.3501 427/500 [========================>.....] - ETA: 18s - loss: 1.9411 - regression_loss: 1.5911 - classification_loss: 0.3500 428/500 [========================>.....] - ETA: 18s - loss: 1.9405 - regression_loss: 1.5907 - classification_loss: 0.3498 429/500 [========================>.....] - ETA: 17s - loss: 1.9401 - regression_loss: 1.5903 - classification_loss: 0.3497 430/500 [========================>.....] - ETA: 17s - loss: 1.9383 - regression_loss: 1.5888 - classification_loss: 0.3495 431/500 [========================>.....] - ETA: 17s - loss: 1.9424 - regression_loss: 1.5922 - classification_loss: 0.3501 432/500 [========================>.....] - ETA: 17s - loss: 1.9402 - regression_loss: 1.5905 - classification_loss: 0.3497 433/500 [========================>.....] - ETA: 16s - loss: 1.9390 - regression_loss: 1.5897 - classification_loss: 0.3494 434/500 [=========================>....] - ETA: 16s - loss: 1.9377 - regression_loss: 1.5887 - classification_loss: 0.3490 435/500 [=========================>....] - ETA: 16s - loss: 1.9375 - regression_loss: 1.5885 - classification_loss: 0.3490 436/500 [=========================>....] - ETA: 16s - loss: 1.9370 - regression_loss: 1.5881 - classification_loss: 0.3489 437/500 [=========================>....] - ETA: 15s - loss: 1.9367 - regression_loss: 1.5879 - classification_loss: 0.3488 438/500 [=========================>....] - ETA: 15s - loss: 1.9364 - regression_loss: 1.5877 - classification_loss: 0.3488 439/500 [=========================>....] - ETA: 15s - loss: 1.9371 - regression_loss: 1.5883 - classification_loss: 0.3488 440/500 [=========================>....] - ETA: 15s - loss: 1.9376 - regression_loss: 1.5887 - classification_loss: 0.3489 441/500 [=========================>....] 
- ETA: 14s - loss: 1.9372 - regression_loss: 1.5885 - classification_loss: 0.3487 442/500 [=========================>....] - ETA: 14s - loss: 1.9374 - regression_loss: 1.5886 - classification_loss: 0.3488 443/500 [=========================>....] - ETA: 14s - loss: 1.9363 - regression_loss: 1.5878 - classification_loss: 0.3485 444/500 [=========================>....] - ETA: 14s - loss: 1.9340 - regression_loss: 1.5860 - classification_loss: 0.3480 445/500 [=========================>....] - ETA: 13s - loss: 1.9339 - regression_loss: 1.5859 - classification_loss: 0.3480 446/500 [=========================>....] - ETA: 13s - loss: 1.9362 - regression_loss: 1.5874 - classification_loss: 0.3488 447/500 [=========================>....] - ETA: 13s - loss: 1.9365 - regression_loss: 1.5876 - classification_loss: 0.3489 448/500 [=========================>....] - ETA: 13s - loss: 1.9368 - regression_loss: 1.5879 - classification_loss: 0.3489 449/500 [=========================>....] - ETA: 12s - loss: 1.9370 - regression_loss: 1.5883 - classification_loss: 0.3487 450/500 [==========================>...] - ETA: 12s - loss: 1.9371 - regression_loss: 1.5884 - classification_loss: 0.3487 451/500 [==========================>...] - ETA: 12s - loss: 1.9381 - regression_loss: 1.5890 - classification_loss: 0.3491 452/500 [==========================>...] - ETA: 12s - loss: 1.9387 - regression_loss: 1.5897 - classification_loss: 0.3490 453/500 [==========================>...] - ETA: 11s - loss: 1.9390 - regression_loss: 1.5899 - classification_loss: 0.3491 454/500 [==========================>...] - ETA: 11s - loss: 1.9384 - regression_loss: 1.5895 - classification_loss: 0.3489 455/500 [==========================>...] - ETA: 11s - loss: 1.9390 - regression_loss: 1.5901 - classification_loss: 0.3489 456/500 [==========================>...] - ETA: 11s - loss: 1.9387 - regression_loss: 1.5900 - classification_loss: 0.3487 457/500 [==========================>...] 
- ETA: 10s - loss: 1.9390 - regression_loss: 1.5901 - classification_loss: 0.3488 458/500 [==========================>...] - ETA: 10s - loss: 1.9395 - regression_loss: 1.5903 - classification_loss: 0.3492 459/500 [==========================>...] - ETA: 10s - loss: 1.9412 - regression_loss: 1.5918 - classification_loss: 0.3493 460/500 [==========================>...] - ETA: 10s - loss: 1.9425 - regression_loss: 1.5928 - classification_loss: 0.3497 461/500 [==========================>...] - ETA: 9s - loss: 1.9423 - regression_loss: 1.5929 - classification_loss: 0.3494  462/500 [==========================>...] - ETA: 9s - loss: 1.9397 - regression_loss: 1.5909 - classification_loss: 0.3488 463/500 [==========================>...] - ETA: 9s - loss: 1.9397 - regression_loss: 1.5909 - classification_loss: 0.3488 464/500 [==========================>...] - ETA: 9s - loss: 1.9399 - regression_loss: 1.5909 - classification_loss: 0.3490 465/500 [==========================>...] - ETA: 8s - loss: 1.9402 - regression_loss: 1.5912 - classification_loss: 0.3491 466/500 [==========================>...] - ETA: 8s - loss: 1.9397 - regression_loss: 1.5909 - classification_loss: 0.3489 467/500 [===========================>..] - ETA: 8s - loss: 1.9404 - regression_loss: 1.5914 - classification_loss: 0.3490 468/500 [===========================>..] - ETA: 8s - loss: 1.9405 - regression_loss: 1.5914 - classification_loss: 0.3491 469/500 [===========================>..] - ETA: 7s - loss: 1.9403 - regression_loss: 1.5914 - classification_loss: 0.3489 470/500 [===========================>..] - ETA: 7s - loss: 1.9412 - regression_loss: 1.5920 - classification_loss: 0.3491 471/500 [===========================>..] - ETA: 7s - loss: 1.9413 - regression_loss: 1.5921 - classification_loss: 0.3491 472/500 [===========================>..] - ETA: 7s - loss: 1.9408 - regression_loss: 1.5917 - classification_loss: 0.3491 473/500 [===========================>..] 
- ETA: 6s - loss: 1.9390 - regression_loss: 1.5903 - classification_loss: 0.3487 474/500 [===========================>..] - ETA: 6s - loss: 1.9386 - regression_loss: 1.5901 - classification_loss: 0.3485 475/500 [===========================>..] - ETA: 6s - loss: 1.9368 - regression_loss: 1.5886 - classification_loss: 0.3482 476/500 [===========================>..] - ETA: 6s - loss: 1.9375 - regression_loss: 1.5892 - classification_loss: 0.3483 477/500 [===========================>..] - ETA: 5s - loss: 1.9366 - regression_loss: 1.5882 - classification_loss: 0.3484 478/500 [===========================>..] - ETA: 5s - loss: 1.9371 - regression_loss: 1.5887 - classification_loss: 0.3483 479/500 [===========================>..] - ETA: 5s - loss: 1.9362 - regression_loss: 1.5881 - classification_loss: 0.3481 480/500 [===========================>..] - ETA: 5s - loss: 1.9349 - regression_loss: 1.5871 - classification_loss: 0.3478 481/500 [===========================>..] - ETA: 4s - loss: 1.9329 - regression_loss: 1.5855 - classification_loss: 0.3474 482/500 [===========================>..] - ETA: 4s - loss: 1.9324 - regression_loss: 1.5850 - classification_loss: 0.3473 483/500 [===========================>..] - ETA: 4s - loss: 1.9312 - regression_loss: 1.5840 - classification_loss: 0.3472 484/500 [============================>.] - ETA: 4s - loss: 1.9317 - regression_loss: 1.5844 - classification_loss: 0.3473 485/500 [============================>.] - ETA: 3s - loss: 1.9316 - regression_loss: 1.5844 - classification_loss: 0.3472 486/500 [============================>.] - ETA: 3s - loss: 1.9310 - regression_loss: 1.5839 - classification_loss: 0.3471 487/500 [============================>.] - ETA: 3s - loss: 1.9312 - regression_loss: 1.5841 - classification_loss: 0.3471 488/500 [============================>.] - ETA: 3s - loss: 1.9306 - regression_loss: 1.5837 - classification_loss: 0.3469 489/500 [============================>.] 
- ETA: 2s - loss: 1.9307 - regression_loss: 1.5838 - classification_loss: 0.3469 490/500 [============================>.] - ETA: 2s - loss: 1.9306 - regression_loss: 1.5837 - classification_loss: 0.3469 491/500 [============================>.] - ETA: 2s - loss: 1.9309 - regression_loss: 1.5840 - classification_loss: 0.3469 492/500 [============================>.] - ETA: 2s - loss: 1.9311 - regression_loss: 1.5843 - classification_loss: 0.3468 493/500 [============================>.] - ETA: 1s - loss: 1.9289 - regression_loss: 1.5826 - classification_loss: 0.3463 494/500 [============================>.] - ETA: 1s - loss: 1.9272 - regression_loss: 1.5813 - classification_loss: 0.3459 495/500 [============================>.] - ETA: 1s - loss: 1.9273 - regression_loss: 1.5813 - classification_loss: 0.3460 496/500 [============================>.] - ETA: 1s - loss: 1.9269 - regression_loss: 1.5808 - classification_loss: 0.3461 497/500 [============================>.] - ETA: 0s - loss: 1.9286 - regression_loss: 1.5823 - classification_loss: 0.3463 498/500 [============================>.] - ETA: 0s - loss: 1.9274 - regression_loss: 1.5814 - classification_loss: 0.3460 499/500 [============================>.] - ETA: 0s - loss: 1.9260 - regression_loss: 1.5803 - classification_loss: 0.3457 500/500 [==============================] - 125s 251ms/step - loss: 1.9270 - regression_loss: 1.5812 - classification_loss: 0.3457 1172 instances of class plum with average precision: 0.5778 mAP: 0.5778 Epoch 00047: saving model to ./training/snapshots/resnet50_pascal_47.h5 Epoch 48/150 1/500 [..............................] - ETA: 1:58 - loss: 1.7636 - regression_loss: 1.4740 - classification_loss: 0.2896 2/500 [..............................] - ETA: 2:00 - loss: 1.7194 - regression_loss: 1.4561 - classification_loss: 0.2632 3/500 [..............................] - ETA: 2:01 - loss: 1.5055 - regression_loss: 1.2332 - classification_loss: 0.2723 4/500 [..............................] 
- ETA: 2:02 - loss: 1.6391 - regression_loss: 1.3502 - classification_loss: 0.2890
  5/500 [..............................] - ETA: 2:02 - loss: 1.6687 - regression_loss: 1.3623 - classification_loss: 0.3064
 50/500 [==>...........................] - ETA: 1:51 - loss: 1.9546 - regression_loss: 1.5986 - classification_loss: 0.3560
100/500 [=====>........................] - ETA: 1:38 - loss: 2.0103 - regression_loss: 1.6360 - classification_loss: 0.3742
150/500 [========>.....................] - ETA: 1:27 - loss: 1.9562 - regression_loss: 1.6013 - classification_loss: 0.3549
200/500 [===========>..................] - ETA: 1:14 - loss: 1.9379 - regression_loss: 1.5918 - classification_loss: 0.3461
250/500 [==============>...............] - ETA: 1:02 - loss: 1.9578 - regression_loss: 1.5939 - classification_loss: 0.3639
300/500 [=================>............] - ETA: 49s - loss: 1.9588 - regression_loss: 1.5983 - classification_loss: 0.3604
339/500 [===================>..........] - ETA: 40s - loss: 1.9377 - regression_loss: 1.5818 - classification_loss: 0.3559
340/500 [===================>..........]
- ETA: 39s - loss: 1.9372 - regression_loss: 1.5812 - classification_loss: 0.3560 341/500 [===================>..........] - ETA: 39s - loss: 1.9379 - regression_loss: 1.5820 - classification_loss: 0.3559 342/500 [===================>..........] - ETA: 39s - loss: 1.9380 - regression_loss: 1.5820 - classification_loss: 0.3560 343/500 [===================>..........] - ETA: 39s - loss: 1.9358 - regression_loss: 1.5802 - classification_loss: 0.3556 344/500 [===================>..........] - ETA: 38s - loss: 1.9367 - regression_loss: 1.5811 - classification_loss: 0.3556 345/500 [===================>..........] - ETA: 38s - loss: 1.9368 - regression_loss: 1.5812 - classification_loss: 0.3556 346/500 [===================>..........] - ETA: 38s - loss: 1.9369 - regression_loss: 1.5813 - classification_loss: 0.3556 347/500 [===================>..........] - ETA: 38s - loss: 1.9362 - regression_loss: 1.5809 - classification_loss: 0.3553 348/500 [===================>..........] - ETA: 37s - loss: 1.9380 - regression_loss: 1.5822 - classification_loss: 0.3558 349/500 [===================>..........] - ETA: 37s - loss: 1.9381 - regression_loss: 1.5822 - classification_loss: 0.3559 350/500 [====================>.........] - ETA: 37s - loss: 1.9380 - regression_loss: 1.5820 - classification_loss: 0.3560 351/500 [====================>.........] - ETA: 37s - loss: 1.9378 - regression_loss: 1.5818 - classification_loss: 0.3560 352/500 [====================>.........] - ETA: 36s - loss: 1.9360 - regression_loss: 1.5801 - classification_loss: 0.3559 353/500 [====================>.........] - ETA: 36s - loss: 1.9366 - regression_loss: 1.5807 - classification_loss: 0.3559 354/500 [====================>.........] - ETA: 36s - loss: 1.9359 - regression_loss: 1.5802 - classification_loss: 0.3558 355/500 [====================>.........] - ETA: 36s - loss: 1.9345 - regression_loss: 1.5791 - classification_loss: 0.3554 356/500 [====================>.........] 
- ETA: 35s - loss: 1.9338 - regression_loss: 1.5787 - classification_loss: 0.3552 357/500 [====================>.........] - ETA: 35s - loss: 1.9353 - regression_loss: 1.5802 - classification_loss: 0.3551 358/500 [====================>.........] - ETA: 35s - loss: 1.9361 - regression_loss: 1.5810 - classification_loss: 0.3550 359/500 [====================>.........] - ETA: 35s - loss: 1.9343 - regression_loss: 1.5795 - classification_loss: 0.3548 360/500 [====================>.........] - ETA: 34s - loss: 1.9344 - regression_loss: 1.5798 - classification_loss: 0.3547 361/500 [====================>.........] - ETA: 34s - loss: 1.9333 - regression_loss: 1.5791 - classification_loss: 0.3542 362/500 [====================>.........] - ETA: 34s - loss: 1.9326 - regression_loss: 1.5779 - classification_loss: 0.3547 363/500 [====================>.........] - ETA: 34s - loss: 1.9321 - regression_loss: 1.5776 - classification_loss: 0.3545 364/500 [====================>.........] - ETA: 33s - loss: 1.9323 - regression_loss: 1.5778 - classification_loss: 0.3545 365/500 [====================>.........] - ETA: 33s - loss: 1.9327 - regression_loss: 1.5783 - classification_loss: 0.3544 366/500 [====================>.........] - ETA: 33s - loss: 1.9315 - regression_loss: 1.5774 - classification_loss: 0.3541 367/500 [=====================>........] - ETA: 33s - loss: 1.9323 - regression_loss: 1.5783 - classification_loss: 0.3540 368/500 [=====================>........] - ETA: 32s - loss: 1.9329 - regression_loss: 1.5787 - classification_loss: 0.3542 369/500 [=====================>........] - ETA: 32s - loss: 1.9342 - regression_loss: 1.5796 - classification_loss: 0.3546 370/500 [=====================>........] - ETA: 32s - loss: 1.9344 - regression_loss: 1.5797 - classification_loss: 0.3546 371/500 [=====================>........] - ETA: 32s - loss: 1.9329 - regression_loss: 1.5787 - classification_loss: 0.3542 372/500 [=====================>........] 
- ETA: 31s - loss: 1.9306 - regression_loss: 1.5764 - classification_loss: 0.3542 373/500 [=====================>........] - ETA: 31s - loss: 1.9302 - regression_loss: 1.5760 - classification_loss: 0.3542 374/500 [=====================>........] - ETA: 31s - loss: 1.9321 - regression_loss: 1.5774 - classification_loss: 0.3547 375/500 [=====================>........] - ETA: 31s - loss: 1.9325 - regression_loss: 1.5778 - classification_loss: 0.3547 376/500 [=====================>........] - ETA: 30s - loss: 1.9329 - regression_loss: 1.5782 - classification_loss: 0.3547 377/500 [=====================>........] - ETA: 30s - loss: 1.9344 - regression_loss: 1.5794 - classification_loss: 0.3551 378/500 [=====================>........] - ETA: 30s - loss: 1.9350 - regression_loss: 1.5798 - classification_loss: 0.3551 379/500 [=====================>........] - ETA: 30s - loss: 1.9345 - regression_loss: 1.5796 - classification_loss: 0.3550 380/500 [=====================>........] - ETA: 29s - loss: 1.9359 - regression_loss: 1.5809 - classification_loss: 0.3550 381/500 [=====================>........] - ETA: 29s - loss: 1.9362 - regression_loss: 1.5811 - classification_loss: 0.3551 382/500 [=====================>........] - ETA: 29s - loss: 1.9367 - regression_loss: 1.5818 - classification_loss: 0.3549 383/500 [=====================>........] - ETA: 29s - loss: 1.9371 - regression_loss: 1.5820 - classification_loss: 0.3551 384/500 [======================>.......] - ETA: 28s - loss: 1.9364 - regression_loss: 1.5815 - classification_loss: 0.3550 385/500 [======================>.......] - ETA: 28s - loss: 1.9343 - regression_loss: 1.5797 - classification_loss: 0.3545 386/500 [======================>.......] - ETA: 28s - loss: 1.9350 - regression_loss: 1.5801 - classification_loss: 0.3549 387/500 [======================>.......] - ETA: 28s - loss: 1.9345 - regression_loss: 1.5796 - classification_loss: 0.3548 388/500 [======================>.......] 
- ETA: 27s - loss: 1.9334 - regression_loss: 1.5790 - classification_loss: 0.3545 389/500 [======================>.......] - ETA: 27s - loss: 1.9345 - regression_loss: 1.5802 - classification_loss: 0.3543 390/500 [======================>.......] - ETA: 27s - loss: 1.9350 - regression_loss: 1.5805 - classification_loss: 0.3545 391/500 [======================>.......] - ETA: 27s - loss: 1.9359 - regression_loss: 1.5812 - classification_loss: 0.3547 392/500 [======================>.......] - ETA: 26s - loss: 1.9372 - regression_loss: 1.5823 - classification_loss: 0.3550 393/500 [======================>.......] - ETA: 26s - loss: 1.9375 - regression_loss: 1.5827 - classification_loss: 0.3548 394/500 [======================>.......] - ETA: 26s - loss: 1.9387 - regression_loss: 1.5838 - classification_loss: 0.3550 395/500 [======================>.......] - ETA: 26s - loss: 1.9384 - regression_loss: 1.5837 - classification_loss: 0.3548 396/500 [======================>.......] - ETA: 25s - loss: 1.9382 - regression_loss: 1.5835 - classification_loss: 0.3546 397/500 [======================>.......] - ETA: 25s - loss: 1.9368 - regression_loss: 1.5826 - classification_loss: 0.3542 398/500 [======================>.......] - ETA: 25s - loss: 1.9369 - regression_loss: 1.5828 - classification_loss: 0.3541 399/500 [======================>.......] - ETA: 25s - loss: 1.9365 - regression_loss: 1.5825 - classification_loss: 0.3540 400/500 [=======================>......] - ETA: 24s - loss: 1.9389 - regression_loss: 1.5842 - classification_loss: 0.3547 401/500 [=======================>......] - ETA: 24s - loss: 1.9402 - regression_loss: 1.5853 - classification_loss: 0.3549 402/500 [=======================>......] - ETA: 24s - loss: 1.9411 - regression_loss: 1.5860 - classification_loss: 0.3551 403/500 [=======================>......] - ETA: 24s - loss: 1.9423 - regression_loss: 1.5870 - classification_loss: 0.3553 404/500 [=======================>......] 
- ETA: 23s - loss: 1.9436 - regression_loss: 1.5882 - classification_loss: 0.3554 405/500 [=======================>......] - ETA: 23s - loss: 1.9434 - regression_loss: 1.5882 - classification_loss: 0.3552 406/500 [=======================>......] - ETA: 23s - loss: 1.9432 - regression_loss: 1.5880 - classification_loss: 0.3552 407/500 [=======================>......] - ETA: 23s - loss: 1.9438 - regression_loss: 1.5886 - classification_loss: 0.3552 408/500 [=======================>......] - ETA: 22s - loss: 1.9415 - regression_loss: 1.5868 - classification_loss: 0.3547 409/500 [=======================>......] - ETA: 22s - loss: 1.9408 - regression_loss: 1.5862 - classification_loss: 0.3547 410/500 [=======================>......] - ETA: 22s - loss: 1.9414 - regression_loss: 1.5866 - classification_loss: 0.3548 411/500 [=======================>......] - ETA: 22s - loss: 1.9409 - regression_loss: 1.5862 - classification_loss: 0.3546 412/500 [=======================>......] - ETA: 21s - loss: 1.9411 - regression_loss: 1.5866 - classification_loss: 0.3545 413/500 [=======================>......] - ETA: 21s - loss: 1.9408 - regression_loss: 1.5863 - classification_loss: 0.3545 414/500 [=======================>......] - ETA: 21s - loss: 1.9402 - regression_loss: 1.5859 - classification_loss: 0.3543 415/500 [=======================>......] - ETA: 21s - loss: 1.9391 - regression_loss: 1.5851 - classification_loss: 0.3540 416/500 [=======================>......] - ETA: 20s - loss: 1.9386 - regression_loss: 1.5847 - classification_loss: 0.3540 417/500 [========================>.....] - ETA: 20s - loss: 1.9398 - regression_loss: 1.5859 - classification_loss: 0.3539 418/500 [========================>.....] - ETA: 20s - loss: 1.9396 - regression_loss: 1.5860 - classification_loss: 0.3536 419/500 [========================>.....] - ETA: 20s - loss: 1.9400 - regression_loss: 1.5863 - classification_loss: 0.3536 420/500 [========================>.....] 
- ETA: 19s - loss: 1.9395 - regression_loss: 1.5860 - classification_loss: 0.3535 421/500 [========================>.....] - ETA: 19s - loss: 1.9391 - regression_loss: 1.5858 - classification_loss: 0.3533 422/500 [========================>.....] - ETA: 19s - loss: 1.9393 - regression_loss: 1.5859 - classification_loss: 0.3534 423/500 [========================>.....] - ETA: 19s - loss: 1.9406 - regression_loss: 1.5869 - classification_loss: 0.3537 424/500 [========================>.....] - ETA: 18s - loss: 1.9408 - regression_loss: 1.5873 - classification_loss: 0.3535 425/500 [========================>.....] - ETA: 18s - loss: 1.9391 - regression_loss: 1.5861 - classification_loss: 0.3531 426/500 [========================>.....] - ETA: 18s - loss: 1.9395 - regression_loss: 1.5864 - classification_loss: 0.3531 427/500 [========================>.....] - ETA: 18s - loss: 1.9402 - regression_loss: 1.5872 - classification_loss: 0.3530 428/500 [========================>.....] - ETA: 17s - loss: 1.9405 - regression_loss: 1.5872 - classification_loss: 0.3533 429/500 [========================>.....] - ETA: 17s - loss: 1.9399 - regression_loss: 1.5870 - classification_loss: 0.3529 430/500 [========================>.....] - ETA: 17s - loss: 1.9371 - regression_loss: 1.5847 - classification_loss: 0.3524 431/500 [========================>.....] - ETA: 17s - loss: 1.9373 - regression_loss: 1.5849 - classification_loss: 0.3524 432/500 [========================>.....] - ETA: 16s - loss: 1.9378 - regression_loss: 1.5855 - classification_loss: 0.3523 433/500 [========================>.....] - ETA: 16s - loss: 1.9384 - regression_loss: 1.5860 - classification_loss: 0.3524 434/500 [=========================>....] - ETA: 16s - loss: 1.9384 - regression_loss: 1.5861 - classification_loss: 0.3523 435/500 [=========================>....] - ETA: 16s - loss: 1.9389 - regression_loss: 1.5867 - classification_loss: 0.3522 436/500 [=========================>....] 
- ETA: 15s - loss: 1.9392 - regression_loss: 1.5868 - classification_loss: 0.3524 437/500 [=========================>....] - ETA: 15s - loss: 1.9393 - regression_loss: 1.5869 - classification_loss: 0.3524 438/500 [=========================>....] - ETA: 15s - loss: 1.9394 - regression_loss: 1.5869 - classification_loss: 0.3526 439/500 [=========================>....] - ETA: 15s - loss: 1.9386 - regression_loss: 1.5863 - classification_loss: 0.3523 440/500 [=========================>....] - ETA: 14s - loss: 1.9377 - regression_loss: 1.5857 - classification_loss: 0.3519 441/500 [=========================>....] - ETA: 14s - loss: 1.9384 - regression_loss: 1.5862 - classification_loss: 0.3522 442/500 [=========================>....] - ETA: 14s - loss: 1.9374 - regression_loss: 1.5852 - classification_loss: 0.3521 443/500 [=========================>....] - ETA: 14s - loss: 1.9372 - regression_loss: 1.5850 - classification_loss: 0.3522 444/500 [=========================>....] - ETA: 13s - loss: 1.9372 - regression_loss: 1.5852 - classification_loss: 0.3520 445/500 [=========================>....] - ETA: 13s - loss: 1.9340 - regression_loss: 1.5826 - classification_loss: 0.3514 446/500 [=========================>....] - ETA: 13s - loss: 1.9348 - regression_loss: 1.5833 - classification_loss: 0.3515 447/500 [=========================>....] - ETA: 13s - loss: 1.9353 - regression_loss: 1.5838 - classification_loss: 0.3515 448/500 [=========================>....] - ETA: 12s - loss: 1.9347 - regression_loss: 1.5832 - classification_loss: 0.3515 449/500 [=========================>....] - ETA: 12s - loss: 1.9351 - regression_loss: 1.5834 - classification_loss: 0.3517 450/500 [==========================>...] - ETA: 12s - loss: 1.9362 - regression_loss: 1.5845 - classification_loss: 0.3517 451/500 [==========================>...] - ETA: 12s - loss: 1.9354 - regression_loss: 1.5839 - classification_loss: 0.3515 452/500 [==========================>...] 
- ETA: 11s - loss: 1.9352 - regression_loss: 1.5839 - classification_loss: 0.3514 453/500 [==========================>...] - ETA: 11s - loss: 1.9350 - regression_loss: 1.5838 - classification_loss: 0.3512 454/500 [==========================>...] - ETA: 11s - loss: 1.9341 - regression_loss: 1.5830 - classification_loss: 0.3511 455/500 [==========================>...] - ETA: 11s - loss: 1.9336 - regression_loss: 1.5826 - classification_loss: 0.3510 456/500 [==========================>...] - ETA: 10s - loss: 1.9346 - regression_loss: 1.5832 - classification_loss: 0.3514 457/500 [==========================>...] - ETA: 10s - loss: 1.9347 - regression_loss: 1.5833 - classification_loss: 0.3515 458/500 [==========================>...] - ETA: 10s - loss: 1.9340 - regression_loss: 1.5828 - classification_loss: 0.3512 459/500 [==========================>...] - ETA: 10s - loss: 1.9329 - regression_loss: 1.5820 - classification_loss: 0.3509 460/500 [==========================>...] - ETA: 9s - loss: 1.9324 - regression_loss: 1.5817 - classification_loss: 0.3507  461/500 [==========================>...] - ETA: 9s - loss: 1.9329 - regression_loss: 1.5821 - classification_loss: 0.3508 462/500 [==========================>...] - ETA: 9s - loss: 1.9336 - regression_loss: 1.5828 - classification_loss: 0.3508 463/500 [==========================>...] - ETA: 9s - loss: 1.9316 - regression_loss: 1.5812 - classification_loss: 0.3505 464/500 [==========================>...] - ETA: 8s - loss: 1.9318 - regression_loss: 1.5815 - classification_loss: 0.3504 465/500 [==========================>...] - ETA: 8s - loss: 1.9321 - regression_loss: 1.5816 - classification_loss: 0.3504 466/500 [==========================>...] - ETA: 8s - loss: 1.9302 - regression_loss: 1.5801 - classification_loss: 0.3502 467/500 [===========================>..] - ETA: 8s - loss: 1.9305 - regression_loss: 1.5802 - classification_loss: 0.3503 468/500 [===========================>..] 
- ETA: 7s - loss: 1.9321 - regression_loss: 1.5815 - classification_loss: 0.3505 469/500 [===========================>..] - ETA: 7s - loss: 1.9316 - regression_loss: 1.5812 - classification_loss: 0.3505 470/500 [===========================>..] - ETA: 7s - loss: 1.9336 - regression_loss: 1.5828 - classification_loss: 0.3509 471/500 [===========================>..] - ETA: 7s - loss: 1.9338 - regression_loss: 1.5829 - classification_loss: 0.3509 472/500 [===========================>..] - ETA: 6s - loss: 1.9337 - regression_loss: 1.5829 - classification_loss: 0.3507 473/500 [===========================>..] - ETA: 6s - loss: 1.9328 - regression_loss: 1.5822 - classification_loss: 0.3506 474/500 [===========================>..] - ETA: 6s - loss: 1.9306 - regression_loss: 1.5804 - classification_loss: 0.3501 475/500 [===========================>..] - ETA: 6s - loss: 1.9311 - regression_loss: 1.5807 - classification_loss: 0.3504 476/500 [===========================>..] - ETA: 5s - loss: 1.9312 - regression_loss: 1.5807 - classification_loss: 0.3504 477/500 [===========================>..] - ETA: 5s - loss: 1.9319 - regression_loss: 1.5813 - classification_loss: 0.3506 478/500 [===========================>..] - ETA: 5s - loss: 1.9321 - regression_loss: 1.5816 - classification_loss: 0.3505 479/500 [===========================>..] - ETA: 5s - loss: 1.9331 - regression_loss: 1.5823 - classification_loss: 0.3508 480/500 [===========================>..] - ETA: 4s - loss: 1.9328 - regression_loss: 1.5822 - classification_loss: 0.3506 481/500 [===========================>..] - ETA: 4s - loss: 1.9327 - regression_loss: 1.5821 - classification_loss: 0.3505 482/500 [===========================>..] - ETA: 4s - loss: 1.9344 - regression_loss: 1.5833 - classification_loss: 0.3511 483/500 [===========================>..] - ETA: 4s - loss: 1.9330 - regression_loss: 1.5821 - classification_loss: 0.3509 484/500 [============================>.] 
- ETA: 3s - loss: 1.9324 - regression_loss: 1.5817 - classification_loss: 0.3507 485/500 [============================>.] - ETA: 3s - loss: 1.9329 - regression_loss: 1.5822 - classification_loss: 0.3507 486/500 [============================>.] - ETA: 3s - loss: 1.9309 - regression_loss: 1.5802 - classification_loss: 0.3507 487/500 [============================>.] - ETA: 3s - loss: 1.9300 - regression_loss: 1.5795 - classification_loss: 0.3505 488/500 [============================>.] - ETA: 2s - loss: 1.9301 - regression_loss: 1.5797 - classification_loss: 0.3504 489/500 [============================>.] - ETA: 2s - loss: 1.9293 - regression_loss: 1.5792 - classification_loss: 0.3501 490/500 [============================>.] - ETA: 2s - loss: 1.9297 - regression_loss: 1.5796 - classification_loss: 0.3501 491/500 [============================>.] - ETA: 2s - loss: 1.9298 - regression_loss: 1.5798 - classification_loss: 0.3500 492/500 [============================>.] - ETA: 1s - loss: 1.9299 - regression_loss: 1.5791 - classification_loss: 0.3509 493/500 [============================>.] - ETA: 1s - loss: 1.9300 - regression_loss: 1.5792 - classification_loss: 0.3508 494/500 [============================>.] - ETA: 1s - loss: 1.9298 - regression_loss: 1.5790 - classification_loss: 0.3508 495/500 [============================>.] - ETA: 1s - loss: 1.9275 - regression_loss: 1.5772 - classification_loss: 0.3503 496/500 [============================>.] - ETA: 0s - loss: 1.9275 - regression_loss: 1.5772 - classification_loss: 0.3502 497/500 [============================>.] - ETA: 0s - loss: 1.9284 - regression_loss: 1.5778 - classification_loss: 0.3506 498/500 [============================>.] - ETA: 0s - loss: 1.9285 - regression_loss: 1.5780 - classification_loss: 0.3506 499/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 1.9300 - regression_loss: 1.5793 - classification_loss: 0.3507
1172 instances of class plum with average precision: 0.5964
mAP: 0.5964
Epoch 00048: saving model to ./training/snapshots/resnet50_pascal_48.h5
Epoch 49/150
- ETA: 1:37 - loss: 1.8977 - regression_loss: 1.5618 - classification_loss: 0.3360 111/500 [=====>........................] - ETA: 1:37 - loss: 1.8954 - regression_loss: 1.5604 - classification_loss: 0.3350 112/500 [=====>........................] - ETA: 1:37 - loss: 1.8968 - regression_loss: 1.5615 - classification_loss: 0.3352 113/500 [=====>........................] - ETA: 1:36 - loss: 1.8947 - regression_loss: 1.5594 - classification_loss: 0.3353 114/500 [=====>........................] - ETA: 1:36 - loss: 1.8950 - regression_loss: 1.5601 - classification_loss: 0.3349 115/500 [=====>........................] - ETA: 1:36 - loss: 1.8862 - regression_loss: 1.5528 - classification_loss: 0.3334 116/500 [=====>........................] - ETA: 1:35 - loss: 1.8856 - regression_loss: 1.5528 - classification_loss: 0.3328 117/500 [======>.......................] - ETA: 1:35 - loss: 1.8884 - regression_loss: 1.5556 - classification_loss: 0.3328 118/500 [======>.......................] - ETA: 1:35 - loss: 1.8882 - regression_loss: 1.5555 - classification_loss: 0.3326 119/500 [======>.......................] - ETA: 1:35 - loss: 1.8875 - regression_loss: 1.5552 - classification_loss: 0.3323 120/500 [======>.......................] - ETA: 1:34 - loss: 1.8884 - regression_loss: 1.5560 - classification_loss: 0.3325 121/500 [======>.......................] - ETA: 1:34 - loss: 1.8844 - regression_loss: 1.5532 - classification_loss: 0.3312 122/500 [======>.......................] - ETA: 1:34 - loss: 1.8806 - regression_loss: 1.5504 - classification_loss: 0.3302 123/500 [======>.......................] - ETA: 1:34 - loss: 1.8785 - regression_loss: 1.5491 - classification_loss: 0.3295 124/500 [======>.......................] - ETA: 1:33 - loss: 1.8726 - regression_loss: 1.5444 - classification_loss: 0.3282 125/500 [======>.......................] - ETA: 1:33 - loss: 1.8773 - regression_loss: 1.5491 - classification_loss: 0.3282 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.8806 - regression_loss: 1.5518 - classification_loss: 0.3288 127/500 [======>.......................] - ETA: 1:33 - loss: 1.8796 - regression_loss: 1.5510 - classification_loss: 0.3286 128/500 [======>.......................] - ETA: 1:32 - loss: 1.8828 - regression_loss: 1.5537 - classification_loss: 0.3291 129/500 [======>.......................] - ETA: 1:32 - loss: 1.8797 - regression_loss: 1.5514 - classification_loss: 0.3282 130/500 [======>.......................] - ETA: 1:32 - loss: 1.8801 - regression_loss: 1.5520 - classification_loss: 0.3281 131/500 [======>.......................] - ETA: 1:32 - loss: 1.8735 - regression_loss: 1.5470 - classification_loss: 0.3265 132/500 [======>.......................] - ETA: 1:31 - loss: 1.8727 - regression_loss: 1.5465 - classification_loss: 0.3261 133/500 [======>.......................] - ETA: 1:31 - loss: 1.8758 - regression_loss: 1.5491 - classification_loss: 0.3268 134/500 [=======>......................] - ETA: 1:31 - loss: 1.8788 - regression_loss: 1.5512 - classification_loss: 0.3276 135/500 [=======>......................] - ETA: 1:31 - loss: 1.8786 - regression_loss: 1.5513 - classification_loss: 0.3273 136/500 [=======>......................] - ETA: 1:30 - loss: 1.8793 - regression_loss: 1.5522 - classification_loss: 0.3271 137/500 [=======>......................] - ETA: 1:30 - loss: 1.8795 - regression_loss: 1.5526 - classification_loss: 0.3269 138/500 [=======>......................] - ETA: 1:30 - loss: 1.8779 - regression_loss: 1.5520 - classification_loss: 0.3259 139/500 [=======>......................] - ETA: 1:30 - loss: 1.8762 - regression_loss: 1.5509 - classification_loss: 0.3254 140/500 [=======>......................] - ETA: 1:30 - loss: 1.8729 - regression_loss: 1.5484 - classification_loss: 0.3245 141/500 [=======>......................] - ETA: 1:29 - loss: 1.8738 - regression_loss: 1.5491 - classification_loss: 0.3247 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.8751 - regression_loss: 1.5497 - classification_loss: 0.3254 143/500 [=======>......................] - ETA: 1:29 - loss: 1.8739 - regression_loss: 1.5487 - classification_loss: 0.3252 144/500 [=======>......................] - ETA: 1:29 - loss: 1.8763 - regression_loss: 1.5507 - classification_loss: 0.3256 145/500 [=======>......................] - ETA: 1:28 - loss: 1.8702 - regression_loss: 1.5461 - classification_loss: 0.3241 146/500 [=======>......................] - ETA: 1:28 - loss: 1.8701 - regression_loss: 1.5461 - classification_loss: 0.3240 147/500 [=======>......................] - ETA: 1:28 - loss: 1.8732 - regression_loss: 1.5479 - classification_loss: 0.3253 148/500 [=======>......................] - ETA: 1:28 - loss: 1.8683 - regression_loss: 1.5440 - classification_loss: 0.3243 149/500 [=======>......................] - ETA: 1:27 - loss: 1.8677 - regression_loss: 1.5437 - classification_loss: 0.3240 150/500 [========>.....................] - ETA: 1:27 - loss: 1.8698 - regression_loss: 1.5447 - classification_loss: 0.3251 151/500 [========>.....................] - ETA: 1:27 - loss: 1.8706 - regression_loss: 1.5455 - classification_loss: 0.3251 152/500 [========>.....................] - ETA: 1:27 - loss: 1.8735 - regression_loss: 1.5477 - classification_loss: 0.3258 153/500 [========>.....................] - ETA: 1:26 - loss: 1.8793 - regression_loss: 1.5520 - classification_loss: 0.3273 154/500 [========>.....................] - ETA: 1:26 - loss: 1.8842 - regression_loss: 1.5555 - classification_loss: 0.3287 155/500 [========>.....................] - ETA: 1:26 - loss: 1.8851 - regression_loss: 1.5565 - classification_loss: 0.3286 156/500 [========>.....................] - ETA: 1:26 - loss: 1.8859 - regression_loss: 1.5573 - classification_loss: 0.3286 157/500 [========>.....................] - ETA: 1:25 - loss: 1.8875 - regression_loss: 1.5585 - classification_loss: 0.3289 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.8834 - regression_loss: 1.5552 - classification_loss: 0.3282 159/500 [========>.....................] - ETA: 1:25 - loss: 1.8787 - regression_loss: 1.5514 - classification_loss: 0.3274 160/500 [========>.....................] - ETA: 1:25 - loss: 1.8778 - regression_loss: 1.5508 - classification_loss: 0.3270 161/500 [========>.....................] - ETA: 1:24 - loss: 1.8781 - regression_loss: 1.5511 - classification_loss: 0.3270 162/500 [========>.....................] - ETA: 1:24 - loss: 1.8781 - regression_loss: 1.5512 - classification_loss: 0.3269 163/500 [========>.....................] - ETA: 1:24 - loss: 1.8809 - regression_loss: 1.5535 - classification_loss: 0.3274 164/500 [========>.....................] - ETA: 1:24 - loss: 1.8810 - regression_loss: 1.5535 - classification_loss: 0.3275 165/500 [========>.....................] - ETA: 1:23 - loss: 1.8737 - regression_loss: 1.5473 - classification_loss: 0.3264 166/500 [========>.....................] - ETA: 1:23 - loss: 1.8714 - regression_loss: 1.5455 - classification_loss: 0.3260 167/500 [=========>....................] - ETA: 1:23 - loss: 1.8724 - regression_loss: 1.5463 - classification_loss: 0.3261 168/500 [=========>....................] - ETA: 1:23 - loss: 1.8727 - regression_loss: 1.5469 - classification_loss: 0.3259 169/500 [=========>....................] - ETA: 1:22 - loss: 1.8777 - regression_loss: 1.5506 - classification_loss: 0.3271 170/500 [=========>....................] - ETA: 1:22 - loss: 1.8775 - regression_loss: 1.5500 - classification_loss: 0.3275 171/500 [=========>....................] - ETA: 1:22 - loss: 1.8951 - regression_loss: 1.5492 - classification_loss: 0.3459 172/500 [=========>....................] - ETA: 1:22 - loss: 1.8946 - regression_loss: 1.5491 - classification_loss: 0.3455 173/500 [=========>....................] - ETA: 1:21 - loss: 1.8934 - regression_loss: 1.5484 - classification_loss: 0.3450 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.8947 - regression_loss: 1.5490 - classification_loss: 0.3457 175/500 [=========>....................] - ETA: 1:21 - loss: 1.8958 - regression_loss: 1.5503 - classification_loss: 0.3456 176/500 [=========>....................] - ETA: 1:21 - loss: 1.8956 - regression_loss: 1.5504 - classification_loss: 0.3452 177/500 [=========>....................] - ETA: 1:20 - loss: 1.8955 - regression_loss: 1.5505 - classification_loss: 0.3450 178/500 [=========>....................] - ETA: 1:20 - loss: 1.8947 - regression_loss: 1.5501 - classification_loss: 0.3446 179/500 [=========>....................] - ETA: 1:20 - loss: 1.8978 - regression_loss: 1.5528 - classification_loss: 0.3450 180/500 [=========>....................] - ETA: 1:20 - loss: 1.8999 - regression_loss: 1.5547 - classification_loss: 0.3453 181/500 [=========>....................] - ETA: 1:19 - loss: 1.8999 - regression_loss: 1.5540 - classification_loss: 0.3459 182/500 [=========>....................] - ETA: 1:19 - loss: 1.8991 - regression_loss: 1.5537 - classification_loss: 0.3454 183/500 [=========>....................] - ETA: 1:19 - loss: 1.8995 - regression_loss: 1.5543 - classification_loss: 0.3452 184/500 [==========>...................] - ETA: 1:19 - loss: 1.8969 - regression_loss: 1.5523 - classification_loss: 0.3445 185/500 [==========>...................] - ETA: 1:18 - loss: 1.8966 - regression_loss: 1.5521 - classification_loss: 0.3445 186/500 [==========>...................] - ETA: 1:18 - loss: 1.8981 - regression_loss: 1.5535 - classification_loss: 0.3446 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8996 - regression_loss: 1.5549 - classification_loss: 0.3447 188/500 [==========>...................] - ETA: 1:18 - loss: 1.8992 - regression_loss: 1.5546 - classification_loss: 0.3446 189/500 [==========>...................] - ETA: 1:18 - loss: 1.9016 - regression_loss: 1.5567 - classification_loss: 0.3449 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.9008 - regression_loss: 1.5563 - classification_loss: 0.3446 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9025 - regression_loss: 1.5574 - classification_loss: 0.3451 192/500 [==========>...................] - ETA: 1:17 - loss: 1.9037 - regression_loss: 1.5581 - classification_loss: 0.3456 193/500 [==========>...................] - ETA: 1:17 - loss: 1.9044 - regression_loss: 1.5588 - classification_loss: 0.3455 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9061 - regression_loss: 1.5603 - classification_loss: 0.3458 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9064 - regression_loss: 1.5605 - classification_loss: 0.3458 196/500 [==========>...................] - ETA: 1:16 - loss: 1.9098 - regression_loss: 1.5633 - classification_loss: 0.3464 197/500 [==========>...................] - ETA: 1:16 - loss: 1.9110 - regression_loss: 1.5648 - classification_loss: 0.3462 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9135 - regression_loss: 1.5669 - classification_loss: 0.3466 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9163 - regression_loss: 1.5688 - classification_loss: 0.3475 200/500 [===========>..................] - ETA: 1:15 - loss: 1.9152 - regression_loss: 1.5672 - classification_loss: 0.3479 201/500 [===========>..................] - ETA: 1:15 - loss: 1.9139 - regression_loss: 1.5663 - classification_loss: 0.3476 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9134 - regression_loss: 1.5655 - classification_loss: 0.3479 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9134 - regression_loss: 1.5657 - classification_loss: 0.3477 204/500 [===========>..................] - ETA: 1:14 - loss: 1.9109 - regression_loss: 1.5638 - classification_loss: 0.3471 205/500 [===========>..................] - ETA: 1:14 - loss: 1.9091 - regression_loss: 1.5625 - classification_loss: 0.3466 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.9081 - regression_loss: 1.5617 - classification_loss: 0.3464 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9059 - regression_loss: 1.5602 - classification_loss: 0.3457 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9059 - regression_loss: 1.5602 - classification_loss: 0.3457 209/500 [===========>..................] - ETA: 1:13 - loss: 1.9090 - regression_loss: 1.5621 - classification_loss: 0.3469 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9099 - regression_loss: 1.5631 - classification_loss: 0.3467 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9039 - regression_loss: 1.5581 - classification_loss: 0.3458 212/500 [===========>..................] - ETA: 1:12 - loss: 1.9044 - regression_loss: 1.5587 - classification_loss: 0.3457 213/500 [===========>..................] - ETA: 1:12 - loss: 1.9051 - regression_loss: 1.5594 - classification_loss: 0.3457 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9048 - regression_loss: 1.5590 - classification_loss: 0.3458 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9061 - regression_loss: 1.5599 - classification_loss: 0.3462 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9065 - regression_loss: 1.5595 - classification_loss: 0.3471 217/500 [============>.................] - ETA: 1:11 - loss: 1.9074 - regression_loss: 1.5603 - classification_loss: 0.3470 218/500 [============>.................] - ETA: 1:10 - loss: 1.9040 - regression_loss: 1.5575 - classification_loss: 0.3465 219/500 [============>.................] - ETA: 1:10 - loss: 1.9059 - regression_loss: 1.5588 - classification_loss: 0.3471 220/500 [============>.................] - ETA: 1:10 - loss: 1.9135 - regression_loss: 1.5651 - classification_loss: 0.3484 221/500 [============>.................] - ETA: 1:10 - loss: 1.9127 - regression_loss: 1.5638 - classification_loss: 0.3489 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.9137 - regression_loss: 1.5646 - classification_loss: 0.3491 223/500 [============>.................] - ETA: 1:09 - loss: 1.9167 - regression_loss: 1.5669 - classification_loss: 0.3498 224/500 [============>.................] - ETA: 1:09 - loss: 1.9133 - regression_loss: 1.5642 - classification_loss: 0.3491 225/500 [============>.................] - ETA: 1:09 - loss: 1.9139 - regression_loss: 1.5650 - classification_loss: 0.3489 226/500 [============>.................] - ETA: 1:08 - loss: 1.9095 - regression_loss: 1.5615 - classification_loss: 0.3479 227/500 [============>.................] - ETA: 1:08 - loss: 1.9113 - regression_loss: 1.5631 - classification_loss: 0.3481 228/500 [============>.................] - ETA: 1:08 - loss: 1.9124 - regression_loss: 1.5636 - classification_loss: 0.3488 229/500 [============>.................] - ETA: 1:08 - loss: 1.9115 - regression_loss: 1.5628 - classification_loss: 0.3487 230/500 [============>.................] - ETA: 1:07 - loss: 1.9105 - regression_loss: 1.5622 - classification_loss: 0.3483 231/500 [============>.................] - ETA: 1:07 - loss: 1.9112 - regression_loss: 1.5631 - classification_loss: 0.3481 232/500 [============>.................] - ETA: 1:07 - loss: 1.9116 - regression_loss: 1.5636 - classification_loss: 0.3480 233/500 [============>.................] - ETA: 1:06 - loss: 1.9112 - regression_loss: 1.5636 - classification_loss: 0.3476 234/500 [=============>................] - ETA: 1:06 - loss: 1.9093 - regression_loss: 1.5618 - classification_loss: 0.3476 235/500 [=============>................] - ETA: 1:06 - loss: 1.9064 - regression_loss: 1.5590 - classification_loss: 0.3474 236/500 [=============>................] - ETA: 1:06 - loss: 1.9092 - regression_loss: 1.5609 - classification_loss: 0.3484 237/500 [=============>................] - ETA: 1:06 - loss: 1.9094 - regression_loss: 1.5610 - classification_loss: 0.3483 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.9105 - regression_loss: 1.5623 - classification_loss: 0.3482 239/500 [=============>................] - ETA: 1:05 - loss: 1.9097 - regression_loss: 1.5618 - classification_loss: 0.3479 240/500 [=============>................] - ETA: 1:05 - loss: 1.9123 - regression_loss: 1.5642 - classification_loss: 0.3481 241/500 [=============>................] - ETA: 1:04 - loss: 1.9083 - regression_loss: 1.5611 - classification_loss: 0.3472 242/500 [=============>................] - ETA: 1:04 - loss: 1.9073 - regression_loss: 1.5605 - classification_loss: 0.3468 243/500 [=============>................] - ETA: 1:04 - loss: 1.9107 - regression_loss: 1.5635 - classification_loss: 0.3472 244/500 [=============>................] - ETA: 1:04 - loss: 1.9081 - regression_loss: 1.5615 - classification_loss: 0.3467 245/500 [=============>................] - ETA: 1:03 - loss: 1.9077 - regression_loss: 1.5613 - classification_loss: 0.3463 246/500 [=============>................] - ETA: 1:03 - loss: 1.9077 - regression_loss: 1.5615 - classification_loss: 0.3462 247/500 [=============>................] - ETA: 1:03 - loss: 1.9091 - regression_loss: 1.5631 - classification_loss: 0.3460 248/500 [=============>................] - ETA: 1:03 - loss: 1.9101 - regression_loss: 1.5638 - classification_loss: 0.3463 249/500 [=============>................] - ETA: 1:02 - loss: 1.9097 - regression_loss: 1.5634 - classification_loss: 0.3464 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9107 - regression_loss: 1.5643 - classification_loss: 0.3465 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9067 - regression_loss: 1.5606 - classification_loss: 0.3460 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9059 - regression_loss: 1.5600 - classification_loss: 0.3459 253/500 [==============>...............] - ETA: 1:01 - loss: 1.9071 - regression_loss: 1.5610 - classification_loss: 0.3461 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.9066 - regression_loss: 1.5606 - classification_loss: 0.3460 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9081 - regression_loss: 1.5609 - classification_loss: 0.3472 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9094 - regression_loss: 1.5621 - classification_loss: 0.3473 257/500 [==============>...............] - ETA: 1:00 - loss: 1.9096 - regression_loss: 1.5625 - classification_loss: 0.3471 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9073 - regression_loss: 1.5609 - classification_loss: 0.3464 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9069 - regression_loss: 1.5609 - classification_loss: 0.3460 260/500 [==============>...............] - ETA: 1:00 - loss: 1.9075 - regression_loss: 1.5616 - classification_loss: 0.3459 261/500 [==============>...............] - ETA: 59s - loss: 1.9061 - regression_loss: 1.5607 - classification_loss: 0.3454  262/500 [==============>...............] - ETA: 59s - loss: 1.9060 - regression_loss: 1.5608 - classification_loss: 0.3452 263/500 [==============>...............] - ETA: 59s - loss: 1.9068 - regression_loss: 1.5612 - classification_loss: 0.3456 264/500 [==============>...............] - ETA: 59s - loss: 1.9085 - regression_loss: 1.5625 - classification_loss: 0.3460 265/500 [==============>...............] - ETA: 58s - loss: 1.9078 - regression_loss: 1.5620 - classification_loss: 0.3458 266/500 [==============>...............] - ETA: 58s - loss: 1.9092 - regression_loss: 1.5635 - classification_loss: 0.3457 267/500 [===============>..............] - ETA: 58s - loss: 1.9121 - regression_loss: 1.5655 - classification_loss: 0.3466 268/500 [===============>..............] - ETA: 58s - loss: 1.9126 - regression_loss: 1.5657 - classification_loss: 0.3469 269/500 [===============>..............] - ETA: 57s - loss: 1.9187 - regression_loss: 1.5705 - classification_loss: 0.3482 270/500 [===============>..............] 
- ETA: 57s - loss: 1.9191 - regression_loss: 1.5706 - classification_loss: 0.3484 271/500 [===============>..............] - ETA: 57s - loss: 1.9185 - regression_loss: 1.5702 - classification_loss: 0.3483 272/500 [===============>..............] - ETA: 57s - loss: 1.9196 - regression_loss: 1.5710 - classification_loss: 0.3486 273/500 [===============>..............] - ETA: 56s - loss: 1.9198 - regression_loss: 1.5714 - classification_loss: 0.3485 274/500 [===============>..............] - ETA: 56s - loss: 1.9181 - regression_loss: 1.5700 - classification_loss: 0.3481 275/500 [===============>..............] - ETA: 56s - loss: 1.9180 - regression_loss: 1.5699 - classification_loss: 0.3481 276/500 [===============>..............] - ETA: 56s - loss: 1.9187 - regression_loss: 1.5706 - classification_loss: 0.3480 277/500 [===============>..............] - ETA: 55s - loss: 1.9194 - regression_loss: 1.5713 - classification_loss: 0.3481 278/500 [===============>..............] - ETA: 55s - loss: 1.9194 - regression_loss: 1.5713 - classification_loss: 0.3480 279/500 [===============>..............] - ETA: 55s - loss: 1.9151 - regression_loss: 1.5679 - classification_loss: 0.3472 280/500 [===============>..............] - ETA: 55s - loss: 1.9141 - regression_loss: 1.5673 - classification_loss: 0.3469 281/500 [===============>..............] - ETA: 54s - loss: 1.9151 - regression_loss: 1.5679 - classification_loss: 0.3472 282/500 [===============>..............] - ETA: 54s - loss: 1.9196 - regression_loss: 1.5713 - classification_loss: 0.3483 283/500 [===============>..............] - ETA: 54s - loss: 1.9214 - regression_loss: 1.5724 - classification_loss: 0.3490 284/500 [================>.............] - ETA: 54s - loss: 1.9215 - regression_loss: 1.5724 - classification_loss: 0.3491 285/500 [================>.............] - ETA: 53s - loss: 1.9214 - regression_loss: 1.5723 - classification_loss: 0.3491 286/500 [================>.............] 
- ETA: 53s - loss: 1.9216 - regression_loss: 1.5728 - classification_loss: 0.3489 287/500 [================>.............] - ETA: 53s - loss: 1.9220 - regression_loss: 1.5733 - classification_loss: 0.3487 288/500 [================>.............] - ETA: 53s - loss: 1.9228 - regression_loss: 1.5740 - classification_loss: 0.3488 289/500 [================>.............] - ETA: 52s - loss: 1.9245 - regression_loss: 1.5761 - classification_loss: 0.3484 290/500 [================>.............] - ETA: 52s - loss: 1.9250 - regression_loss: 1.5765 - classification_loss: 0.3485 291/500 [================>.............] - ETA: 52s - loss: 1.9268 - regression_loss: 1.5782 - classification_loss: 0.3486 292/500 [================>.............] - ETA: 52s - loss: 1.9280 - regression_loss: 1.5790 - classification_loss: 0.3490 293/500 [================>.............] - ETA: 51s - loss: 1.9284 - regression_loss: 1.5793 - classification_loss: 0.3491 294/500 [================>.............] - ETA: 51s - loss: 1.9289 - regression_loss: 1.5799 - classification_loss: 0.3490 295/500 [================>.............] - ETA: 51s - loss: 1.9278 - regression_loss: 1.5792 - classification_loss: 0.3486 296/500 [================>.............] - ETA: 51s - loss: 1.9284 - regression_loss: 1.5799 - classification_loss: 0.3486 297/500 [================>.............] - ETA: 50s - loss: 1.9303 - regression_loss: 1.5815 - classification_loss: 0.3488 298/500 [================>.............] - ETA: 50s - loss: 1.9285 - regression_loss: 1.5801 - classification_loss: 0.3484 299/500 [================>.............] - ETA: 50s - loss: 1.9269 - regression_loss: 1.5789 - classification_loss: 0.3480 300/500 [=================>............] - ETA: 50s - loss: 1.9251 - regression_loss: 1.5776 - classification_loss: 0.3475 301/500 [=================>............] - ETA: 49s - loss: 1.9235 - regression_loss: 1.5764 - classification_loss: 0.3471 302/500 [=================>............] 
- ETA: 49s - loss: 1.9244 - regression_loss: 1.5774 - classification_loss: 0.3470 303/500 [=================>............] - ETA: 49s - loss: 1.9226 - regression_loss: 1.5752 - classification_loss: 0.3473 304/500 [=================>............] - ETA: 49s - loss: 1.9203 - regression_loss: 1.5731 - classification_loss: 0.3471 305/500 [=================>............] - ETA: 48s - loss: 1.9208 - regression_loss: 1.5733 - classification_loss: 0.3475 306/500 [=================>............] - ETA: 48s - loss: 1.9222 - regression_loss: 1.5744 - classification_loss: 0.3478 307/500 [=================>............] - ETA: 48s - loss: 1.9229 - regression_loss: 1.5751 - classification_loss: 0.3479 308/500 [=================>............] - ETA: 48s - loss: 1.9189 - regression_loss: 1.5719 - classification_loss: 0.3470 309/500 [=================>............] - ETA: 47s - loss: 1.9188 - regression_loss: 1.5719 - classification_loss: 0.3469 310/500 [=================>............] - ETA: 47s - loss: 1.9175 - regression_loss: 1.5710 - classification_loss: 0.3465 311/500 [=================>............] - ETA: 47s - loss: 1.9177 - regression_loss: 1.5712 - classification_loss: 0.3464 312/500 [=================>............] - ETA: 47s - loss: 1.9188 - regression_loss: 1.5722 - classification_loss: 0.3466 313/500 [=================>............] - ETA: 46s - loss: 1.9162 - regression_loss: 1.5700 - classification_loss: 0.3462 314/500 [=================>............] - ETA: 46s - loss: 1.9166 - regression_loss: 1.5702 - classification_loss: 0.3463 315/500 [=================>............] - ETA: 46s - loss: 1.9154 - regression_loss: 1.5694 - classification_loss: 0.3460 316/500 [=================>............] - ETA: 46s - loss: 1.9144 - regression_loss: 1.5685 - classification_loss: 0.3459 317/500 [==================>...........] - ETA: 45s - loss: 1.9174 - regression_loss: 1.5711 - classification_loss: 0.3463 318/500 [==================>...........] 
[... per-step progress output for steps 319-499 of epoch 49 elided; the running loss held near 1.91 throughout ...]
500/500 [==============================] - 125s 250ms/step - loss: 1.9111 - regression_loss: 1.5679 - classification_loss: 0.3433
1172 instances of class plum with average precision: 0.5868
mAP: 0.5868
Epoch 00049: saving model to ./training/snapshots/resnet50_pascal_49.h5
Epoch 50/150
[... per-step progress output for steps 1-153 of epoch 50 elided; the loss started near 2.4 and settled around 1.94 ...]
- ETA: 1:27 - loss: 1.9371 - regression_loss: 1.5996 - classification_loss: 0.3375 154/500 [========>.....................] - ETA: 1:26 - loss: 1.9352 - regression_loss: 1.5982 - classification_loss: 0.3370 155/500 [========>.....................] - ETA: 1:26 - loss: 1.9361 - regression_loss: 1.5990 - classification_loss: 0.3371 156/500 [========>.....................] - ETA: 1:26 - loss: 1.9423 - regression_loss: 1.6025 - classification_loss: 0.3398 157/500 [========>.....................] - ETA: 1:26 - loss: 1.9430 - regression_loss: 1.6032 - classification_loss: 0.3397 158/500 [========>.....................] - ETA: 1:25 - loss: 1.9447 - regression_loss: 1.6044 - classification_loss: 0.3402 159/500 [========>.....................] - ETA: 1:25 - loss: 1.9454 - regression_loss: 1.6050 - classification_loss: 0.3404 160/500 [========>.....................] - ETA: 1:25 - loss: 1.9435 - regression_loss: 1.6035 - classification_loss: 0.3400 161/500 [========>.....................] - ETA: 1:25 - loss: 1.9467 - regression_loss: 1.6052 - classification_loss: 0.3415 162/500 [========>.....................] - ETA: 1:24 - loss: 1.9479 - regression_loss: 1.6064 - classification_loss: 0.3415 163/500 [========>.....................] - ETA: 1:24 - loss: 1.9478 - regression_loss: 1.6068 - classification_loss: 0.3411 164/500 [========>.....................] - ETA: 1:24 - loss: 1.9494 - regression_loss: 1.6081 - classification_loss: 0.3413 165/500 [========>.....................] - ETA: 1:24 - loss: 1.9478 - regression_loss: 1.6070 - classification_loss: 0.3409 166/500 [========>.....................] - ETA: 1:23 - loss: 1.9512 - regression_loss: 1.6098 - classification_loss: 0.3414 167/500 [=========>....................] - ETA: 1:23 - loss: 1.9510 - regression_loss: 1.6100 - classification_loss: 0.3410 168/500 [=========>....................] - ETA: 1:23 - loss: 1.9520 - regression_loss: 1.6105 - classification_loss: 0.3415 169/500 [=========>....................] 
- ETA: 1:23 - loss: 1.9554 - regression_loss: 1.6135 - classification_loss: 0.3419 170/500 [=========>....................] - ETA: 1:22 - loss: 1.9570 - regression_loss: 1.6148 - classification_loss: 0.3422 171/500 [=========>....................] - ETA: 1:22 - loss: 1.9527 - regression_loss: 1.6110 - classification_loss: 0.3417 172/500 [=========>....................] - ETA: 1:22 - loss: 1.9537 - regression_loss: 1.6123 - classification_loss: 0.3414 173/500 [=========>....................] - ETA: 1:22 - loss: 1.9558 - regression_loss: 1.6130 - classification_loss: 0.3428 174/500 [=========>....................] - ETA: 1:21 - loss: 1.9531 - regression_loss: 1.6105 - classification_loss: 0.3426 175/500 [=========>....................] - ETA: 1:21 - loss: 1.9523 - regression_loss: 1.6101 - classification_loss: 0.3423 176/500 [=========>....................] - ETA: 1:21 - loss: 1.9517 - regression_loss: 1.6097 - classification_loss: 0.3420 177/500 [=========>....................] - ETA: 1:21 - loss: 1.9496 - regression_loss: 1.6075 - classification_loss: 0.3421 178/500 [=========>....................] - ETA: 1:20 - loss: 1.9483 - regression_loss: 1.6064 - classification_loss: 0.3419 179/500 [=========>....................] - ETA: 1:20 - loss: 1.9487 - regression_loss: 1.6067 - classification_loss: 0.3420 180/500 [=========>....................] - ETA: 1:20 - loss: 1.9523 - regression_loss: 1.6089 - classification_loss: 0.3435 181/500 [=========>....................] - ETA: 1:20 - loss: 1.9500 - regression_loss: 1.6070 - classification_loss: 0.3431 182/500 [=========>....................] - ETA: 1:19 - loss: 1.9451 - regression_loss: 1.6028 - classification_loss: 0.3423 183/500 [=========>....................] - ETA: 1:19 - loss: 1.9445 - regression_loss: 1.6024 - classification_loss: 0.3421 184/500 [==========>...................] - ETA: 1:19 - loss: 1.9420 - regression_loss: 1.6004 - classification_loss: 0.3417 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.9431 - regression_loss: 1.6014 - classification_loss: 0.3417 186/500 [==========>...................] - ETA: 1:18 - loss: 1.9393 - regression_loss: 1.5985 - classification_loss: 0.3408 187/500 [==========>...................] - ETA: 1:18 - loss: 1.9409 - regression_loss: 1.5998 - classification_loss: 0.3411 188/500 [==========>...................] - ETA: 1:18 - loss: 1.9411 - regression_loss: 1.6003 - classification_loss: 0.3407 189/500 [==========>...................] - ETA: 1:18 - loss: 1.9410 - regression_loss: 1.6004 - classification_loss: 0.3405 190/500 [==========>...................] - ETA: 1:17 - loss: 1.9394 - regression_loss: 1.5988 - classification_loss: 0.3406 191/500 [==========>...................] - ETA: 1:17 - loss: 1.9405 - regression_loss: 1.5998 - classification_loss: 0.3406 192/500 [==========>...................] - ETA: 1:17 - loss: 1.9430 - regression_loss: 1.6013 - classification_loss: 0.3417 193/500 [==========>...................] - ETA: 1:17 - loss: 1.9424 - regression_loss: 1.6010 - classification_loss: 0.3414 194/500 [==========>...................] - ETA: 1:16 - loss: 1.9425 - regression_loss: 1.6006 - classification_loss: 0.3419 195/500 [==========>...................] - ETA: 1:16 - loss: 1.9439 - regression_loss: 1.6021 - classification_loss: 0.3418 196/500 [==========>...................] - ETA: 1:16 - loss: 1.9446 - regression_loss: 1.6028 - classification_loss: 0.3418 197/500 [==========>...................] - ETA: 1:16 - loss: 1.9447 - regression_loss: 1.6026 - classification_loss: 0.3421 198/500 [==========>...................] - ETA: 1:15 - loss: 1.9457 - regression_loss: 1.6037 - classification_loss: 0.3420 199/500 [==========>...................] - ETA: 1:15 - loss: 1.9480 - regression_loss: 1.6053 - classification_loss: 0.3427 200/500 [===========>..................] - ETA: 1:15 - loss: 1.9534 - regression_loss: 1.6094 - classification_loss: 0.3440 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.9545 - regression_loss: 1.6105 - classification_loss: 0.3440 202/500 [===========>..................] - ETA: 1:14 - loss: 1.9529 - regression_loss: 1.6093 - classification_loss: 0.3437 203/500 [===========>..................] - ETA: 1:14 - loss: 1.9540 - regression_loss: 1.6104 - classification_loss: 0.3436 204/500 [===========>..................] - ETA: 1:14 - loss: 1.9538 - regression_loss: 1.6098 - classification_loss: 0.3440 205/500 [===========>..................] - ETA: 1:14 - loss: 1.9535 - regression_loss: 1.6095 - classification_loss: 0.3440 206/500 [===========>..................] - ETA: 1:13 - loss: 1.9533 - regression_loss: 1.6094 - classification_loss: 0.3439 207/500 [===========>..................] - ETA: 1:13 - loss: 1.9530 - regression_loss: 1.6077 - classification_loss: 0.3453 208/500 [===========>..................] - ETA: 1:13 - loss: 1.9512 - regression_loss: 1.6065 - classification_loss: 0.3447 209/500 [===========>..................] - ETA: 1:13 - loss: 1.9479 - regression_loss: 1.6036 - classification_loss: 0.3443 210/500 [===========>..................] - ETA: 1:12 - loss: 1.9499 - regression_loss: 1.6049 - classification_loss: 0.3451 211/500 [===========>..................] - ETA: 1:12 - loss: 1.9514 - regression_loss: 1.6062 - classification_loss: 0.3453 212/500 [===========>..................] - ETA: 1:12 - loss: 1.9492 - regression_loss: 1.6043 - classification_loss: 0.3449 213/500 [===========>..................] - ETA: 1:12 - loss: 1.9490 - regression_loss: 1.6044 - classification_loss: 0.3447 214/500 [===========>..................] - ETA: 1:11 - loss: 1.9491 - regression_loss: 1.6049 - classification_loss: 0.3443 215/500 [===========>..................] - ETA: 1:11 - loss: 1.9554 - regression_loss: 1.6091 - classification_loss: 0.3463 216/500 [===========>..................] - ETA: 1:11 - loss: 1.9540 - regression_loss: 1.6071 - classification_loss: 0.3469 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.9600 - regression_loss: 1.6115 - classification_loss: 0.3485 218/500 [============>.................] - ETA: 1:10 - loss: 1.9610 - regression_loss: 1.6124 - classification_loss: 0.3486 219/500 [============>.................] - ETA: 1:10 - loss: 1.9620 - regression_loss: 1.6130 - classification_loss: 0.3491 220/500 [============>.................] - ETA: 1:10 - loss: 1.9569 - regression_loss: 1.6091 - classification_loss: 0.3478 221/500 [============>.................] - ETA: 1:10 - loss: 1.9582 - regression_loss: 1.6100 - classification_loss: 0.3482 222/500 [============>.................] - ETA: 1:09 - loss: 1.9598 - regression_loss: 1.6114 - classification_loss: 0.3484 223/500 [============>.................] - ETA: 1:09 - loss: 1.9583 - regression_loss: 1.6103 - classification_loss: 0.3481 224/500 [============>.................] - ETA: 1:09 - loss: 1.9592 - regression_loss: 1.6108 - classification_loss: 0.3484 225/500 [============>.................] - ETA: 1:09 - loss: 1.9588 - regression_loss: 1.6106 - classification_loss: 0.3482 226/500 [============>.................] - ETA: 1:08 - loss: 1.9585 - regression_loss: 1.6103 - classification_loss: 0.3482 227/500 [============>.................] - ETA: 1:08 - loss: 1.9578 - regression_loss: 1.6099 - classification_loss: 0.3479 228/500 [============>.................] - ETA: 1:08 - loss: 1.9585 - regression_loss: 1.6105 - classification_loss: 0.3480 229/500 [============>.................] - ETA: 1:08 - loss: 1.9588 - regression_loss: 1.6107 - classification_loss: 0.3481 230/500 [============>.................] - ETA: 1:07 - loss: 1.9603 - regression_loss: 1.6119 - classification_loss: 0.3485 231/500 [============>.................] - ETA: 1:07 - loss: 1.9601 - regression_loss: 1.6119 - classification_loss: 0.3482 232/500 [============>.................] - ETA: 1:07 - loss: 1.9610 - regression_loss: 1.6126 - classification_loss: 0.3484 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.9625 - regression_loss: 1.6145 - classification_loss: 0.3480 234/500 [=============>................] - ETA: 1:06 - loss: 1.9626 - regression_loss: 1.6148 - classification_loss: 0.3478 235/500 [=============>................] - ETA: 1:06 - loss: 1.9643 - regression_loss: 1.6161 - classification_loss: 0.3483 236/500 [=============>................] - ETA: 1:06 - loss: 1.9619 - regression_loss: 1.6140 - classification_loss: 0.3479 237/500 [=============>................] - ETA: 1:06 - loss: 1.9620 - regression_loss: 1.6142 - classification_loss: 0.3478 238/500 [=============>................] - ETA: 1:05 - loss: 1.9622 - regression_loss: 1.6146 - classification_loss: 0.3476 239/500 [=============>................] - ETA: 1:05 - loss: 1.9607 - regression_loss: 1.6134 - classification_loss: 0.3473 240/500 [=============>................] - ETA: 1:05 - loss: 1.9614 - regression_loss: 1.6140 - classification_loss: 0.3473 241/500 [=============>................] - ETA: 1:05 - loss: 1.9645 - regression_loss: 1.6165 - classification_loss: 0.3481 242/500 [=============>................] - ETA: 1:04 - loss: 1.9628 - regression_loss: 1.6141 - classification_loss: 0.3486 243/500 [=============>................] - ETA: 1:04 - loss: 1.9615 - regression_loss: 1.6121 - classification_loss: 0.3494 244/500 [=============>................] - ETA: 1:04 - loss: 1.9606 - regression_loss: 1.6116 - classification_loss: 0.3491 245/500 [=============>................] - ETA: 1:04 - loss: 1.9594 - regression_loss: 1.6105 - classification_loss: 0.3489 246/500 [=============>................] - ETA: 1:03 - loss: 1.9589 - regression_loss: 1.6103 - classification_loss: 0.3486 247/500 [=============>................] - ETA: 1:03 - loss: 1.9598 - regression_loss: 1.6116 - classification_loss: 0.3483 248/500 [=============>................] - ETA: 1:03 - loss: 1.9618 - regression_loss: 1.6131 - classification_loss: 0.3488 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.9588 - regression_loss: 1.6106 - classification_loss: 0.3482 250/500 [==============>...............] - ETA: 1:02 - loss: 1.9570 - regression_loss: 1.6094 - classification_loss: 0.3476 251/500 [==============>...............] - ETA: 1:02 - loss: 1.9565 - regression_loss: 1.6091 - classification_loss: 0.3474 252/500 [==============>...............] - ETA: 1:02 - loss: 1.9545 - regression_loss: 1.6070 - classification_loss: 0.3475 253/500 [==============>...............] - ETA: 1:02 - loss: 1.9560 - regression_loss: 1.6082 - classification_loss: 0.3478 254/500 [==============>...............] - ETA: 1:01 - loss: 1.9559 - regression_loss: 1.6080 - classification_loss: 0.3479 255/500 [==============>...............] - ETA: 1:01 - loss: 1.9558 - regression_loss: 1.6076 - classification_loss: 0.3482 256/500 [==============>...............] - ETA: 1:01 - loss: 1.9559 - regression_loss: 1.6075 - classification_loss: 0.3484 257/500 [==============>...............] - ETA: 1:01 - loss: 1.9568 - regression_loss: 1.6085 - classification_loss: 0.3483 258/500 [==============>...............] - ETA: 1:00 - loss: 1.9576 - regression_loss: 1.6094 - classification_loss: 0.3482 259/500 [==============>...............] - ETA: 1:00 - loss: 1.9583 - regression_loss: 1.6101 - classification_loss: 0.3483 260/500 [==============>...............] - ETA: 1:00 - loss: 1.9577 - regression_loss: 1.6096 - classification_loss: 0.3481 261/500 [==============>...............] - ETA: 1:00 - loss: 1.9591 - regression_loss: 1.6112 - classification_loss: 0.3480 262/500 [==============>...............] - ETA: 59s - loss: 1.9579 - regression_loss: 1.6103 - classification_loss: 0.3476  263/500 [==============>...............] - ETA: 59s - loss: 1.9557 - regression_loss: 1.6086 - classification_loss: 0.3471 264/500 [==============>...............] - ETA: 59s - loss: 1.9539 - regression_loss: 1.6071 - classification_loss: 0.3468 265/500 [==============>...............] 
- ETA: 59s - loss: 1.9542 - regression_loss: 1.6074 - classification_loss: 0.3468 266/500 [==============>...............] - ETA: 58s - loss: 1.9557 - regression_loss: 1.6090 - classification_loss: 0.3467 267/500 [===============>..............] - ETA: 58s - loss: 1.9568 - regression_loss: 1.6096 - classification_loss: 0.3472 268/500 [===============>..............] - ETA: 58s - loss: 1.9558 - regression_loss: 1.6092 - classification_loss: 0.3466 269/500 [===============>..............] - ETA: 58s - loss: 1.9564 - regression_loss: 1.6095 - classification_loss: 0.3468 270/500 [===============>..............] - ETA: 57s - loss: 1.9564 - regression_loss: 1.6094 - classification_loss: 0.3470 271/500 [===============>..............] - ETA: 57s - loss: 1.9534 - regression_loss: 1.6068 - classification_loss: 0.3465 272/500 [===============>..............] - ETA: 57s - loss: 1.9526 - regression_loss: 1.6064 - classification_loss: 0.3462 273/500 [===============>..............] - ETA: 57s - loss: 1.9526 - regression_loss: 1.6061 - classification_loss: 0.3465 274/500 [===============>..............] - ETA: 56s - loss: 1.9538 - regression_loss: 1.6072 - classification_loss: 0.3467 275/500 [===============>..............] - ETA: 56s - loss: 1.9565 - regression_loss: 1.6096 - classification_loss: 0.3469 276/500 [===============>..............] - ETA: 56s - loss: 1.9561 - regression_loss: 1.6093 - classification_loss: 0.3467 277/500 [===============>..............] - ETA: 56s - loss: 1.9572 - regression_loss: 1.6101 - classification_loss: 0.3471 278/500 [===============>..............] - ETA: 55s - loss: 1.9535 - regression_loss: 1.6069 - classification_loss: 0.3466 279/500 [===============>..............] - ETA: 55s - loss: 1.9529 - regression_loss: 1.6063 - classification_loss: 0.3466 280/500 [===============>..............] - ETA: 55s - loss: 1.9504 - regression_loss: 1.6043 - classification_loss: 0.3461 281/500 [===============>..............] 
- ETA: 55s - loss: 1.9475 - regression_loss: 1.6020 - classification_loss: 0.3454 282/500 [===============>..............] - ETA: 54s - loss: 1.9463 - regression_loss: 1.6011 - classification_loss: 0.3451 283/500 [===============>..............] - ETA: 54s - loss: 1.9454 - regression_loss: 1.6005 - classification_loss: 0.3449 284/500 [================>.............] - ETA: 54s - loss: 1.9456 - regression_loss: 1.6006 - classification_loss: 0.3450 285/500 [================>.............] - ETA: 54s - loss: 1.9444 - regression_loss: 1.5997 - classification_loss: 0.3446 286/500 [================>.............] - ETA: 53s - loss: 1.9452 - regression_loss: 1.6003 - classification_loss: 0.3450 287/500 [================>.............] - ETA: 53s - loss: 1.9463 - regression_loss: 1.6013 - classification_loss: 0.3450 288/500 [================>.............] - ETA: 53s - loss: 1.9434 - regression_loss: 1.5986 - classification_loss: 0.3447 289/500 [================>.............] - ETA: 53s - loss: 1.9423 - regression_loss: 1.5978 - classification_loss: 0.3446 290/500 [================>.............] - ETA: 52s - loss: 1.9416 - regression_loss: 1.5963 - classification_loss: 0.3452 291/500 [================>.............] - ETA: 52s - loss: 1.9394 - regression_loss: 1.5945 - classification_loss: 0.3449 292/500 [================>.............] - ETA: 52s - loss: 1.9391 - regression_loss: 1.5944 - classification_loss: 0.3447 293/500 [================>.............] - ETA: 52s - loss: 1.9372 - regression_loss: 1.5929 - classification_loss: 0.3443 294/500 [================>.............] - ETA: 51s - loss: 1.9359 - regression_loss: 1.5917 - classification_loss: 0.3442 295/500 [================>.............] - ETA: 51s - loss: 1.9349 - regression_loss: 1.5910 - classification_loss: 0.3439 296/500 [================>.............] - ETA: 51s - loss: 1.9363 - regression_loss: 1.5924 - classification_loss: 0.3440 297/500 [================>.............] 
- ETA: 51s - loss: 1.9378 - regression_loss: 1.5934 - classification_loss: 0.3444 298/500 [================>.............] - ETA: 50s - loss: 1.9365 - regression_loss: 1.5926 - classification_loss: 0.3440 299/500 [================>.............] - ETA: 50s - loss: 1.9370 - regression_loss: 1.5932 - classification_loss: 0.3438 300/500 [=================>............] - ETA: 50s - loss: 1.9359 - regression_loss: 1.5919 - classification_loss: 0.3440 301/500 [=================>............] - ETA: 50s - loss: 1.9341 - regression_loss: 1.5905 - classification_loss: 0.3436 302/500 [=================>............] - ETA: 49s - loss: 1.9339 - regression_loss: 1.5903 - classification_loss: 0.3436 303/500 [=================>............] - ETA: 49s - loss: 1.9338 - regression_loss: 1.5899 - classification_loss: 0.3439 304/500 [=================>............] - ETA: 49s - loss: 1.9305 - regression_loss: 1.5873 - classification_loss: 0.3432 305/500 [=================>............] - ETA: 49s - loss: 1.9308 - regression_loss: 1.5877 - classification_loss: 0.3431 306/500 [=================>............] - ETA: 48s - loss: 1.9322 - regression_loss: 1.5887 - classification_loss: 0.3434 307/500 [=================>............] - ETA: 48s - loss: 1.9311 - regression_loss: 1.5881 - classification_loss: 0.3431 308/500 [=================>............] - ETA: 48s - loss: 1.9359 - regression_loss: 1.5920 - classification_loss: 0.3439 309/500 [=================>............] - ETA: 48s - loss: 1.9357 - regression_loss: 1.5917 - classification_loss: 0.3440 310/500 [=================>............] - ETA: 47s - loss: 1.9363 - regression_loss: 1.5921 - classification_loss: 0.3442 311/500 [=================>............] - ETA: 47s - loss: 1.9355 - regression_loss: 1.5915 - classification_loss: 0.3440 312/500 [=================>............] - ETA: 47s - loss: 1.9350 - regression_loss: 1.5909 - classification_loss: 0.3441 313/500 [=================>............] 
- ETA: 46s - loss: 1.9322 - regression_loss: 1.5883 - classification_loss: 0.3439 314/500 [=================>............] - ETA: 46s - loss: 1.9334 - regression_loss: 1.5892 - classification_loss: 0.3442 315/500 [=================>............] - ETA: 46s - loss: 1.9295 - regression_loss: 1.5857 - classification_loss: 0.3438 316/500 [=================>............] - ETA: 46s - loss: 1.9283 - regression_loss: 1.5845 - classification_loss: 0.3438 317/500 [==================>...........] - ETA: 45s - loss: 1.9282 - regression_loss: 1.5843 - classification_loss: 0.3439 318/500 [==================>...........] - ETA: 45s - loss: 1.9289 - regression_loss: 1.5846 - classification_loss: 0.3442 319/500 [==================>...........] - ETA: 45s - loss: 1.9272 - regression_loss: 1.5833 - classification_loss: 0.3439 320/500 [==================>...........] - ETA: 45s - loss: 1.9291 - regression_loss: 1.5846 - classification_loss: 0.3445 321/500 [==================>...........] - ETA: 44s - loss: 1.9298 - regression_loss: 1.5852 - classification_loss: 0.3446 322/500 [==================>...........] - ETA: 44s - loss: 1.9294 - regression_loss: 1.5848 - classification_loss: 0.3446 323/500 [==================>...........] - ETA: 44s - loss: 1.9306 - regression_loss: 1.5858 - classification_loss: 0.3448 324/500 [==================>...........] - ETA: 44s - loss: 1.9277 - regression_loss: 1.5836 - classification_loss: 0.3441 325/500 [==================>...........] - ETA: 43s - loss: 1.9264 - regression_loss: 1.5828 - classification_loss: 0.3436 326/500 [==================>...........] - ETA: 43s - loss: 1.9261 - regression_loss: 1.5824 - classification_loss: 0.3437 327/500 [==================>...........] - ETA: 43s - loss: 1.9270 - regression_loss: 1.5832 - classification_loss: 0.3438 328/500 [==================>...........] - ETA: 43s - loss: 1.9273 - regression_loss: 1.5835 - classification_loss: 0.3438 329/500 [==================>...........] 
- ETA: 42s - loss: 1.9271 - regression_loss: 1.5835 - classification_loss: 0.3437 330/500 [==================>...........] - ETA: 42s - loss: 1.9267 - regression_loss: 1.5830 - classification_loss: 0.3437 331/500 [==================>...........] - ETA: 42s - loss: 1.9258 - regression_loss: 1.5823 - classification_loss: 0.3435 332/500 [==================>...........] - ETA: 42s - loss: 1.9261 - regression_loss: 1.5826 - classification_loss: 0.3435 333/500 [==================>...........] - ETA: 41s - loss: 1.9263 - regression_loss: 1.5823 - classification_loss: 0.3440 334/500 [===================>..........] - ETA: 41s - loss: 1.9275 - regression_loss: 1.5831 - classification_loss: 0.3444 335/500 [===================>..........] - ETA: 41s - loss: 1.9271 - regression_loss: 1.5826 - classification_loss: 0.3445 336/500 [===================>..........] - ETA: 41s - loss: 1.9280 - regression_loss: 1.5835 - classification_loss: 0.3445 337/500 [===================>..........] - ETA: 40s - loss: 1.9277 - regression_loss: 1.5833 - classification_loss: 0.3444 338/500 [===================>..........] - ETA: 40s - loss: 1.9282 - regression_loss: 1.5839 - classification_loss: 0.3443 339/500 [===================>..........] - ETA: 40s - loss: 1.9279 - regression_loss: 1.5838 - classification_loss: 0.3442 340/500 [===================>..........] - ETA: 40s - loss: 1.9289 - regression_loss: 1.5846 - classification_loss: 0.3444 341/500 [===================>..........] - ETA: 39s - loss: 1.9283 - regression_loss: 1.5843 - classification_loss: 0.3440 342/500 [===================>..........] - ETA: 39s - loss: 1.9267 - regression_loss: 1.5832 - classification_loss: 0.3435 343/500 [===================>..........] - ETA: 39s - loss: 1.9253 - regression_loss: 1.5819 - classification_loss: 0.3434 344/500 [===================>..........] - ETA: 39s - loss: 1.9229 - regression_loss: 1.5800 - classification_loss: 0.3429 345/500 [===================>..........] 
- ETA: 38s - loss: 1.9210 - regression_loss: 1.5783 - classification_loss: 0.3427 346/500 [===================>..........] - ETA: 38s - loss: 1.9223 - regression_loss: 1.5792 - classification_loss: 0.3431 347/500 [===================>..........] - ETA: 38s - loss: 1.9216 - regression_loss: 1.5788 - classification_loss: 0.3428 348/500 [===================>..........] - ETA: 38s - loss: 1.9223 - regression_loss: 1.5795 - classification_loss: 0.3428 349/500 [===================>..........] - ETA: 37s - loss: 1.9228 - regression_loss: 1.5800 - classification_loss: 0.3429 350/500 [====================>.........] - ETA: 37s - loss: 1.9242 - regression_loss: 1.5803 - classification_loss: 0.3439 351/500 [====================>.........] - ETA: 37s - loss: 1.9256 - regression_loss: 1.5814 - classification_loss: 0.3443 352/500 [====================>.........] - ETA: 37s - loss: 1.9248 - regression_loss: 1.5807 - classification_loss: 0.3441 353/500 [====================>.........] - ETA: 36s - loss: 1.9243 - regression_loss: 1.5804 - classification_loss: 0.3439 354/500 [====================>.........] - ETA: 36s - loss: 1.9250 - regression_loss: 1.5809 - classification_loss: 0.3441 355/500 [====================>.........] - ETA: 36s - loss: 1.9253 - regression_loss: 1.5811 - classification_loss: 0.3441 356/500 [====================>.........] - ETA: 36s - loss: 1.9250 - regression_loss: 1.5810 - classification_loss: 0.3440 357/500 [====================>.........] - ETA: 35s - loss: 1.9242 - regression_loss: 1.5805 - classification_loss: 0.3438 358/500 [====================>.........] - ETA: 35s - loss: 1.9251 - regression_loss: 1.5813 - classification_loss: 0.3438 359/500 [====================>.........] - ETA: 35s - loss: 1.9278 - regression_loss: 1.5832 - classification_loss: 0.3447 360/500 [====================>.........] - ETA: 35s - loss: 1.9291 - regression_loss: 1.5845 - classification_loss: 0.3446 361/500 [====================>.........] 
[... per-batch progress updates for steps 362-499 of epoch 50 elided ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.9267 - regression_loss: 1.5809 - classification_loss: 0.3457
1172 instances of class plum with average precision: 0.5416
mAP: 0.5416
Epoch 00050: saving model to ./training/snapshots/resnet50_pascal_50.h5
Epoch 51/150
[... per-batch progress updates for steps 1-196 of epoch 51 elided ...]
- ETA: 1:15 - loss: 1.8539 - regression_loss: 1.5167 - classification_loss: 0.3372 197/500 [==========>...................] - ETA: 1:15 - loss: 1.8557 - regression_loss: 1.5180 - classification_loss: 0.3378 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8574 - regression_loss: 1.5175 - classification_loss: 0.3398 199/500 [==========>...................] - ETA: 1:15 - loss: 1.8567 - regression_loss: 1.5172 - classification_loss: 0.3396 200/500 [===========>..................] - ETA: 1:14 - loss: 1.8587 - regression_loss: 1.5189 - classification_loss: 0.3398 201/500 [===========>..................] - ETA: 1:14 - loss: 1.8605 - regression_loss: 1.5205 - classification_loss: 0.3399 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8624 - regression_loss: 1.5221 - classification_loss: 0.3403 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8639 - regression_loss: 1.5235 - classification_loss: 0.3403 204/500 [===========>..................] - ETA: 1:13 - loss: 1.8630 - regression_loss: 1.5231 - classification_loss: 0.3399 205/500 [===========>..................] - ETA: 1:13 - loss: 1.8650 - regression_loss: 1.5251 - classification_loss: 0.3400 206/500 [===========>..................] - ETA: 1:13 - loss: 1.8626 - regression_loss: 1.5235 - classification_loss: 0.3391 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8637 - regression_loss: 1.5247 - classification_loss: 0.3390 208/500 [===========>..................] - ETA: 1:12 - loss: 1.8642 - regression_loss: 1.5252 - classification_loss: 0.3389 209/500 [===========>..................] - ETA: 1:12 - loss: 1.8639 - regression_loss: 1.5253 - classification_loss: 0.3386 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8678 - regression_loss: 1.5287 - classification_loss: 0.3391 211/500 [===========>..................] - ETA: 1:12 - loss: 1.8689 - regression_loss: 1.5296 - classification_loss: 0.3393 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.8688 - regression_loss: 1.5297 - classification_loss: 0.3391 213/500 [===========>..................] - ETA: 1:11 - loss: 1.8697 - regression_loss: 1.5306 - classification_loss: 0.3390 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8709 - regression_loss: 1.5317 - classification_loss: 0.3392 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8710 - regression_loss: 1.5320 - classification_loss: 0.3390 216/500 [===========>..................] - ETA: 1:10 - loss: 1.8713 - regression_loss: 1.5327 - classification_loss: 0.3386 217/500 [============>.................] - ETA: 1:10 - loss: 1.8671 - regression_loss: 1.5294 - classification_loss: 0.3377 218/500 [============>.................] - ETA: 1:10 - loss: 1.8678 - regression_loss: 1.5301 - classification_loss: 0.3377 219/500 [============>.................] - ETA: 1:10 - loss: 1.8676 - regression_loss: 1.5300 - classification_loss: 0.3376 220/500 [============>.................] - ETA: 1:09 - loss: 1.8687 - regression_loss: 1.5311 - classification_loss: 0.3376 221/500 [============>.................] - ETA: 1:09 - loss: 1.8633 - regression_loss: 1.5263 - classification_loss: 0.3371 222/500 [============>.................] - ETA: 1:09 - loss: 1.8610 - regression_loss: 1.5246 - classification_loss: 0.3364 223/500 [============>.................] - ETA: 1:09 - loss: 1.8646 - regression_loss: 1.5272 - classification_loss: 0.3374 224/500 [============>.................] - ETA: 1:08 - loss: 1.8654 - regression_loss: 1.5278 - classification_loss: 0.3375 225/500 [============>.................] - ETA: 1:08 - loss: 1.8632 - regression_loss: 1.5262 - classification_loss: 0.3369 226/500 [============>.................] - ETA: 1:08 - loss: 1.8611 - regression_loss: 1.5247 - classification_loss: 0.3364 227/500 [============>.................] - ETA: 1:08 - loss: 1.8615 - regression_loss: 1.5248 - classification_loss: 0.3367 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.8591 - regression_loss: 1.5228 - classification_loss: 0.3363 229/500 [============>.................] - ETA: 1:07 - loss: 1.8614 - regression_loss: 1.5245 - classification_loss: 0.3369 230/500 [============>.................] - ETA: 1:07 - loss: 1.8611 - regression_loss: 1.5243 - classification_loss: 0.3368 231/500 [============>.................] - ETA: 1:07 - loss: 1.8556 - regression_loss: 1.5197 - classification_loss: 0.3359 232/500 [============>.................] - ETA: 1:06 - loss: 1.8552 - regression_loss: 1.5195 - classification_loss: 0.3357 233/500 [============>.................] - ETA: 1:06 - loss: 1.8585 - regression_loss: 1.5220 - classification_loss: 0.3366 234/500 [=============>................] - ETA: 1:06 - loss: 1.8587 - regression_loss: 1.5222 - classification_loss: 0.3364 235/500 [=============>................] - ETA: 1:06 - loss: 1.8590 - regression_loss: 1.5226 - classification_loss: 0.3365 236/500 [=============>................] - ETA: 1:05 - loss: 1.8567 - regression_loss: 1.5197 - classification_loss: 0.3370 237/500 [=============>................] - ETA: 1:05 - loss: 1.8548 - regression_loss: 1.5173 - classification_loss: 0.3375 238/500 [=============>................] - ETA: 1:05 - loss: 1.8569 - regression_loss: 1.5196 - classification_loss: 0.3373 239/500 [=============>................] - ETA: 1:05 - loss: 1.8596 - regression_loss: 1.5217 - classification_loss: 0.3380 240/500 [=============>................] - ETA: 1:04 - loss: 1.8602 - regression_loss: 1.5218 - classification_loss: 0.3383 241/500 [=============>................] - ETA: 1:04 - loss: 1.8584 - regression_loss: 1.5205 - classification_loss: 0.3379 242/500 [=============>................] - ETA: 1:04 - loss: 1.8599 - regression_loss: 1.5219 - classification_loss: 0.3380 243/500 [=============>................] - ETA: 1:04 - loss: 1.8598 - regression_loss: 1.5211 - classification_loss: 0.3387 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.8618 - regression_loss: 1.5224 - classification_loss: 0.3394 245/500 [=============>................] - ETA: 1:03 - loss: 1.8622 - regression_loss: 1.5229 - classification_loss: 0.3393 246/500 [=============>................] - ETA: 1:03 - loss: 1.8667 - regression_loss: 1.5264 - classification_loss: 0.3404 247/500 [=============>................] - ETA: 1:03 - loss: 1.8669 - regression_loss: 1.5265 - classification_loss: 0.3404 248/500 [=============>................] - ETA: 1:02 - loss: 1.8643 - regression_loss: 1.5245 - classification_loss: 0.3398 249/500 [=============>................] - ETA: 1:02 - loss: 1.8645 - regression_loss: 1.5245 - classification_loss: 0.3400 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8662 - regression_loss: 1.5261 - classification_loss: 0.3401 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8677 - regression_loss: 1.5276 - classification_loss: 0.3401 252/500 [==============>...............] - ETA: 1:01 - loss: 1.8689 - regression_loss: 1.5289 - classification_loss: 0.3401 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8690 - regression_loss: 1.5291 - classification_loss: 0.3399 254/500 [==============>...............] - ETA: 1:01 - loss: 1.8683 - regression_loss: 1.5287 - classification_loss: 0.3395 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8676 - regression_loss: 1.5282 - classification_loss: 0.3394 256/500 [==============>...............] - ETA: 1:00 - loss: 1.8692 - regression_loss: 1.5298 - classification_loss: 0.3394 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8687 - regression_loss: 1.5296 - classification_loss: 0.3391 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8711 - regression_loss: 1.5320 - classification_loss: 0.3390 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8696 - regression_loss: 1.5311 - classification_loss: 0.3385 260/500 [==============>...............] 
- ETA: 59s - loss: 1.8698 - regression_loss: 1.5313 - classification_loss: 0.3385  261/500 [==============>...............] - ETA: 59s - loss: 1.8699 - regression_loss: 1.5312 - classification_loss: 0.3387 262/500 [==============>...............] - ETA: 59s - loss: 1.8717 - regression_loss: 1.5328 - classification_loss: 0.3389 263/500 [==============>...............] - ETA: 59s - loss: 1.8712 - regression_loss: 1.5325 - classification_loss: 0.3387 264/500 [==============>...............] - ETA: 58s - loss: 1.8727 - regression_loss: 1.5336 - classification_loss: 0.3391 265/500 [==============>...............] - ETA: 58s - loss: 1.8734 - regression_loss: 1.5343 - classification_loss: 0.3392 266/500 [==============>...............] - ETA: 58s - loss: 1.8735 - regression_loss: 1.5346 - classification_loss: 0.3390 267/500 [===============>..............] - ETA: 58s - loss: 1.8731 - regression_loss: 1.5344 - classification_loss: 0.3387 268/500 [===============>..............] - ETA: 57s - loss: 1.8712 - regression_loss: 1.5328 - classification_loss: 0.3384 269/500 [===============>..............] - ETA: 57s - loss: 1.8720 - regression_loss: 1.5338 - classification_loss: 0.3382 270/500 [===============>..............] - ETA: 57s - loss: 1.8736 - regression_loss: 1.5353 - classification_loss: 0.3383 271/500 [===============>..............] - ETA: 57s - loss: 1.8730 - regression_loss: 1.5349 - classification_loss: 0.3380 272/500 [===============>..............] - ETA: 56s - loss: 1.8744 - regression_loss: 1.5361 - classification_loss: 0.3383 273/500 [===============>..............] - ETA: 56s - loss: 1.8750 - regression_loss: 1.5369 - classification_loss: 0.3381 274/500 [===============>..............] - ETA: 56s - loss: 1.8775 - regression_loss: 1.5389 - classification_loss: 0.3386 275/500 [===============>..............] - ETA: 56s - loss: 1.8765 - regression_loss: 1.5382 - classification_loss: 0.3383 276/500 [===============>..............] 
- ETA: 55s - loss: 1.8762 - regression_loss: 1.5381 - classification_loss: 0.3381 277/500 [===============>..............] - ETA: 55s - loss: 1.8749 - regression_loss: 1.5373 - classification_loss: 0.3376 278/500 [===============>..............] - ETA: 55s - loss: 1.8751 - regression_loss: 1.5376 - classification_loss: 0.3375 279/500 [===============>..............] - ETA: 55s - loss: 1.8790 - regression_loss: 1.5407 - classification_loss: 0.3383 280/500 [===============>..............] - ETA: 54s - loss: 1.8776 - regression_loss: 1.5397 - classification_loss: 0.3379 281/500 [===============>..............] - ETA: 54s - loss: 1.8785 - regression_loss: 1.5404 - classification_loss: 0.3380 282/500 [===============>..............] - ETA: 54s - loss: 1.8788 - regression_loss: 1.5408 - classification_loss: 0.3380 283/500 [===============>..............] - ETA: 54s - loss: 1.8791 - regression_loss: 1.5411 - classification_loss: 0.3381 284/500 [================>.............] - ETA: 53s - loss: 1.8821 - regression_loss: 1.5435 - classification_loss: 0.3386 285/500 [================>.............] - ETA: 53s - loss: 1.8824 - regression_loss: 1.5440 - classification_loss: 0.3385 286/500 [================>.............] - ETA: 53s - loss: 1.8804 - regression_loss: 1.5423 - classification_loss: 0.3381 287/500 [================>.............] - ETA: 53s - loss: 1.8811 - regression_loss: 1.5431 - classification_loss: 0.3381 288/500 [================>.............] - ETA: 52s - loss: 1.8817 - regression_loss: 1.5436 - classification_loss: 0.3380 289/500 [================>.............] - ETA: 52s - loss: 1.8786 - regression_loss: 1.5414 - classification_loss: 0.3373 290/500 [================>.............] - ETA: 52s - loss: 1.8762 - regression_loss: 1.5395 - classification_loss: 0.3367 291/500 [================>.............] - ETA: 52s - loss: 1.8774 - regression_loss: 1.5404 - classification_loss: 0.3370 292/500 [================>.............] 
- ETA: 51s - loss: 1.8766 - regression_loss: 1.5398 - classification_loss: 0.3368 293/500 [================>.............] - ETA: 51s - loss: 1.8766 - regression_loss: 1.5398 - classification_loss: 0.3368 294/500 [================>.............] - ETA: 51s - loss: 1.8779 - regression_loss: 1.5403 - classification_loss: 0.3376 295/500 [================>.............] - ETA: 51s - loss: 1.8754 - regression_loss: 1.5384 - classification_loss: 0.3370 296/500 [================>.............] - ETA: 50s - loss: 1.8762 - regression_loss: 1.5390 - classification_loss: 0.3373 297/500 [================>.............] - ETA: 50s - loss: 1.8751 - regression_loss: 1.5379 - classification_loss: 0.3372 298/500 [================>.............] - ETA: 50s - loss: 1.8753 - regression_loss: 1.5379 - classification_loss: 0.3374 299/500 [================>.............] - ETA: 50s - loss: 1.8757 - regression_loss: 1.5383 - classification_loss: 0.3374 300/500 [=================>............] - ETA: 49s - loss: 1.8747 - regression_loss: 1.5378 - classification_loss: 0.3369 301/500 [=================>............] - ETA: 49s - loss: 1.8759 - regression_loss: 1.5383 - classification_loss: 0.3376 302/500 [=================>............] - ETA: 49s - loss: 1.8758 - regression_loss: 1.5382 - classification_loss: 0.3377 303/500 [=================>............] - ETA: 49s - loss: 1.8739 - regression_loss: 1.5364 - classification_loss: 0.3375 304/500 [=================>............] - ETA: 48s - loss: 1.8764 - regression_loss: 1.5381 - classification_loss: 0.3383 305/500 [=================>............] - ETA: 48s - loss: 1.8744 - regression_loss: 1.5366 - classification_loss: 0.3378 306/500 [=================>............] - ETA: 48s - loss: 1.8741 - regression_loss: 1.5361 - classification_loss: 0.3379 307/500 [=================>............] - ETA: 48s - loss: 1.8723 - regression_loss: 1.5349 - classification_loss: 0.3374 308/500 [=================>............] 
- ETA: 47s - loss: 1.8735 - regression_loss: 1.5359 - classification_loss: 0.3377 309/500 [=================>............] - ETA: 47s - loss: 1.8746 - regression_loss: 1.5366 - classification_loss: 0.3380 310/500 [=================>............] - ETA: 47s - loss: 1.8747 - regression_loss: 1.5370 - classification_loss: 0.3377 311/500 [=================>............] - ETA: 47s - loss: 1.8760 - regression_loss: 1.5381 - classification_loss: 0.3379 312/500 [=================>............] - ETA: 46s - loss: 1.8776 - regression_loss: 1.5396 - classification_loss: 0.3379 313/500 [=================>............] - ETA: 46s - loss: 1.8799 - regression_loss: 1.5415 - classification_loss: 0.3384 314/500 [=================>............] - ETA: 46s - loss: 1.8791 - regression_loss: 1.5408 - classification_loss: 0.3383 315/500 [=================>............] - ETA: 46s - loss: 1.8821 - regression_loss: 1.5431 - classification_loss: 0.3390 316/500 [=================>............] - ETA: 45s - loss: 1.8818 - regression_loss: 1.5429 - classification_loss: 0.3389 317/500 [==================>...........] - ETA: 45s - loss: 1.8823 - regression_loss: 1.5434 - classification_loss: 0.3390 318/500 [==================>...........] - ETA: 45s - loss: 1.8839 - regression_loss: 1.5447 - classification_loss: 0.3393 319/500 [==================>...........] - ETA: 45s - loss: 1.8828 - regression_loss: 1.5438 - classification_loss: 0.3391 320/500 [==================>...........] - ETA: 44s - loss: 1.8876 - regression_loss: 1.5478 - classification_loss: 0.3398 321/500 [==================>...........] - ETA: 44s - loss: 1.8873 - regression_loss: 1.5478 - classification_loss: 0.3395 322/500 [==================>...........] - ETA: 44s - loss: 1.8882 - regression_loss: 1.5485 - classification_loss: 0.3397 323/500 [==================>...........] - ETA: 44s - loss: 1.8888 - regression_loss: 1.5490 - classification_loss: 0.3398 324/500 [==================>...........] 
- ETA: 43s - loss: 1.8884 - regression_loss: 1.5489 - classification_loss: 0.3396 325/500 [==================>...........] - ETA: 43s - loss: 1.8889 - regression_loss: 1.5495 - classification_loss: 0.3394 326/500 [==================>...........] - ETA: 43s - loss: 1.8884 - regression_loss: 1.5494 - classification_loss: 0.3391 327/500 [==================>...........] - ETA: 43s - loss: 1.8897 - regression_loss: 1.5506 - classification_loss: 0.3390 328/500 [==================>...........] - ETA: 42s - loss: 1.8899 - regression_loss: 1.5510 - classification_loss: 0.3390 329/500 [==================>...........] - ETA: 42s - loss: 1.8870 - regression_loss: 1.5483 - classification_loss: 0.3387 330/500 [==================>...........] - ETA: 42s - loss: 1.8875 - regression_loss: 1.5488 - classification_loss: 0.3387 331/500 [==================>...........] - ETA: 42s - loss: 1.8876 - regression_loss: 1.5490 - classification_loss: 0.3387 332/500 [==================>...........] - ETA: 41s - loss: 1.8951 - regression_loss: 1.5494 - classification_loss: 0.3456 333/500 [==================>...........] - ETA: 41s - loss: 1.8962 - regression_loss: 1.5502 - classification_loss: 0.3460 334/500 [===================>..........] - ETA: 41s - loss: 1.8957 - regression_loss: 1.5498 - classification_loss: 0.3459 335/500 [===================>..........] - ETA: 41s - loss: 1.8949 - regression_loss: 1.5494 - classification_loss: 0.3455 336/500 [===================>..........] - ETA: 40s - loss: 1.8962 - regression_loss: 1.5505 - classification_loss: 0.3457 337/500 [===================>..........] - ETA: 40s - loss: 1.8969 - regression_loss: 1.5510 - classification_loss: 0.3459 338/500 [===================>..........] - ETA: 40s - loss: 1.8974 - regression_loss: 1.5515 - classification_loss: 0.3458 339/500 [===================>..........] - ETA: 40s - loss: 1.8981 - regression_loss: 1.5523 - classification_loss: 0.3458 340/500 [===================>..........] 
- ETA: 39s - loss: 1.8979 - regression_loss: 1.5522 - classification_loss: 0.3457 341/500 [===================>..........] - ETA: 39s - loss: 1.8972 - regression_loss: 1.5521 - classification_loss: 0.3451 342/500 [===================>..........] - ETA: 39s - loss: 1.8963 - regression_loss: 1.5513 - classification_loss: 0.3450 343/500 [===================>..........] - ETA: 39s - loss: 1.8964 - regression_loss: 1.5514 - classification_loss: 0.3450 344/500 [===================>..........] - ETA: 38s - loss: 1.8941 - regression_loss: 1.5496 - classification_loss: 0.3445 345/500 [===================>..........] - ETA: 38s - loss: 1.8933 - regression_loss: 1.5491 - classification_loss: 0.3443 346/500 [===================>..........] - ETA: 38s - loss: 1.8947 - regression_loss: 1.5501 - classification_loss: 0.3445 347/500 [===================>..........] - ETA: 38s - loss: 1.8961 - regression_loss: 1.5515 - classification_loss: 0.3446 348/500 [===================>..........] - ETA: 37s - loss: 1.8971 - regression_loss: 1.5522 - classification_loss: 0.3449 349/500 [===================>..........] - ETA: 37s - loss: 1.8975 - regression_loss: 1.5527 - classification_loss: 0.3449 350/500 [====================>.........] - ETA: 37s - loss: 1.8975 - regression_loss: 1.5523 - classification_loss: 0.3451 351/500 [====================>.........] - ETA: 37s - loss: 1.8978 - regression_loss: 1.5528 - classification_loss: 0.3450 352/500 [====================>.........] - ETA: 36s - loss: 1.8980 - regression_loss: 1.5532 - classification_loss: 0.3448 353/500 [====================>.........] - ETA: 36s - loss: 1.8984 - regression_loss: 1.5536 - classification_loss: 0.3448 354/500 [====================>.........] - ETA: 36s - loss: 1.8989 - regression_loss: 1.5542 - classification_loss: 0.3447 355/500 [====================>.........] - ETA: 36s - loss: 1.9005 - regression_loss: 1.5555 - classification_loss: 0.3450 356/500 [====================>.........] 
- ETA: 35s - loss: 1.9012 - regression_loss: 1.5556 - classification_loss: 0.3456 357/500 [====================>.........] - ETA: 35s - loss: 1.9017 - regression_loss: 1.5559 - classification_loss: 0.3458 358/500 [====================>.........] - ETA: 35s - loss: 1.9014 - regression_loss: 1.5557 - classification_loss: 0.3457 359/500 [====================>.........] - ETA: 35s - loss: 1.9009 - regression_loss: 1.5555 - classification_loss: 0.3454 360/500 [====================>.........] - ETA: 34s - loss: 1.9008 - regression_loss: 1.5555 - classification_loss: 0.3452 361/500 [====================>.........] - ETA: 34s - loss: 1.9002 - regression_loss: 1.5552 - classification_loss: 0.3450 362/500 [====================>.........] - ETA: 34s - loss: 1.9013 - regression_loss: 1.5561 - classification_loss: 0.3452 363/500 [====================>.........] - ETA: 34s - loss: 1.8998 - regression_loss: 1.5550 - classification_loss: 0.3449 364/500 [====================>.........] - ETA: 33s - loss: 1.9009 - regression_loss: 1.5557 - classification_loss: 0.3452 365/500 [====================>.........] - ETA: 33s - loss: 1.9022 - regression_loss: 1.5569 - classification_loss: 0.3453 366/500 [====================>.........] - ETA: 33s - loss: 1.9038 - regression_loss: 1.5587 - classification_loss: 0.3451 367/500 [=====================>........] - ETA: 33s - loss: 1.9024 - regression_loss: 1.5576 - classification_loss: 0.3449 368/500 [=====================>........] - ETA: 32s - loss: 1.9028 - regression_loss: 1.5579 - classification_loss: 0.3449 369/500 [=====================>........] - ETA: 32s - loss: 1.9039 - regression_loss: 1.5583 - classification_loss: 0.3456 370/500 [=====================>........] - ETA: 32s - loss: 1.9029 - regression_loss: 1.5575 - classification_loss: 0.3455 371/500 [=====================>........] - ETA: 32s - loss: 1.9036 - regression_loss: 1.5581 - classification_loss: 0.3455 372/500 [=====================>........] 
- ETA: 31s - loss: 1.9046 - regression_loss: 1.5590 - classification_loss: 0.3455 373/500 [=====================>........] - ETA: 31s - loss: 1.9034 - regression_loss: 1.5581 - classification_loss: 0.3453 374/500 [=====================>........] - ETA: 31s - loss: 1.9020 - regression_loss: 1.5569 - classification_loss: 0.3451 375/500 [=====================>........] - ETA: 31s - loss: 1.9027 - regression_loss: 1.5574 - classification_loss: 0.3453 376/500 [=====================>........] - ETA: 30s - loss: 1.9020 - regression_loss: 1.5570 - classification_loss: 0.3450 377/500 [=====================>........] - ETA: 30s - loss: 1.9029 - regression_loss: 1.5576 - classification_loss: 0.3453 378/500 [=====================>........] - ETA: 30s - loss: 1.9009 - regression_loss: 1.5561 - classification_loss: 0.3448 379/500 [=====================>........] - ETA: 30s - loss: 1.9015 - regression_loss: 1.5568 - classification_loss: 0.3447 380/500 [=====================>........] - ETA: 29s - loss: 1.9015 - regression_loss: 1.5568 - classification_loss: 0.3447 381/500 [=====================>........] - ETA: 29s - loss: 1.9004 - regression_loss: 1.5560 - classification_loss: 0.3444 382/500 [=====================>........] - ETA: 29s - loss: 1.8984 - regression_loss: 1.5544 - classification_loss: 0.3441 383/500 [=====================>........] - ETA: 29s - loss: 1.8996 - regression_loss: 1.5554 - classification_loss: 0.3442 384/500 [======================>.......] - ETA: 28s - loss: 1.9001 - regression_loss: 1.5556 - classification_loss: 0.3445 385/500 [======================>.......] - ETA: 28s - loss: 1.8982 - regression_loss: 1.5536 - classification_loss: 0.3447 386/500 [======================>.......] - ETA: 28s - loss: 1.8981 - regression_loss: 1.5533 - classification_loss: 0.3448 387/500 [======================>.......] - ETA: 28s - loss: 1.9004 - regression_loss: 1.5549 - classification_loss: 0.3456 388/500 [======================>.......] 
- ETA: 27s - loss: 1.9011 - regression_loss: 1.5554 - classification_loss: 0.3456 389/500 [======================>.......] - ETA: 27s - loss: 1.9014 - regression_loss: 1.5558 - classification_loss: 0.3456 390/500 [======================>.......] - ETA: 27s - loss: 1.9007 - regression_loss: 1.5553 - classification_loss: 0.3454 391/500 [======================>.......] - ETA: 27s - loss: 1.8999 - regression_loss: 1.5548 - classification_loss: 0.3451 392/500 [======================>.......] - ETA: 26s - loss: 1.8987 - regression_loss: 1.5541 - classification_loss: 0.3447 393/500 [======================>.......] - ETA: 26s - loss: 1.8992 - regression_loss: 1.5543 - classification_loss: 0.3449 394/500 [======================>.......] - ETA: 26s - loss: 1.8981 - regression_loss: 1.5533 - classification_loss: 0.3447 395/500 [======================>.......] - ETA: 26s - loss: 1.8987 - regression_loss: 1.5539 - classification_loss: 0.3449 396/500 [======================>.......] - ETA: 25s - loss: 1.8983 - regression_loss: 1.5536 - classification_loss: 0.3447 397/500 [======================>.......] - ETA: 25s - loss: 1.8993 - regression_loss: 1.5544 - classification_loss: 0.3449 398/500 [======================>.......] - ETA: 25s - loss: 1.9025 - regression_loss: 1.5570 - classification_loss: 0.3455 399/500 [======================>.......] - ETA: 25s - loss: 1.9032 - regression_loss: 1.5574 - classification_loss: 0.3457 400/500 [=======================>......] - ETA: 24s - loss: 1.9003 - regression_loss: 1.5552 - classification_loss: 0.3451 401/500 [=======================>......] - ETA: 24s - loss: 1.9008 - regression_loss: 1.5555 - classification_loss: 0.3452 402/500 [=======================>......] - ETA: 24s - loss: 1.9005 - regression_loss: 1.5553 - classification_loss: 0.3452 403/500 [=======================>......] - ETA: 24s - loss: 1.9021 - regression_loss: 1.5565 - classification_loss: 0.3457 404/500 [=======================>......] 
- ETA: 23s - loss: 1.9019 - regression_loss: 1.5564 - classification_loss: 0.3455
[per-batch progress for steps 405-499 elided: loss held between ~1.88 and ~1.91, regression_loss ~1.55, classification_loss ~0.34]
500/500 [==============================] - 125s 250ms/step - loss: 1.8836 - regression_loss: 1.5442 - classification_loss: 0.3394
1172 instances of class plum with average precision: 0.5809
mAP: 0.5809
Epoch 00051: saving model to ./training/snapshots/resnet50_pascal_51.h5
Epoch 52/150
[per-batch progress for steps 1-237 elided: loss ranged from ~1.33 early on to ~1.89, settling near 1.86 by step 237, with regression_loss ~1.53 and classification_loss ~0.32]
238/500 [=============>................] 
- ETA: 1:05 - loss: 1.8647 - regression_loss: 1.5439 - classification_loss: 0.3209 239/500 [=============>................] - ETA: 1:05 - loss: 1.8637 - regression_loss: 1.5431 - classification_loss: 0.3206 240/500 [=============>................] - ETA: 1:04 - loss: 1.8641 - regression_loss: 1.5433 - classification_loss: 0.3207 241/500 [=============>................] - ETA: 1:04 - loss: 1.8640 - regression_loss: 1.5430 - classification_loss: 0.3210 242/500 [=============>................] - ETA: 1:04 - loss: 1.8660 - regression_loss: 1.5447 - classification_loss: 0.3213 243/500 [=============>................] - ETA: 1:04 - loss: 1.8664 - regression_loss: 1.5454 - classification_loss: 0.3211 244/500 [=============>................] - ETA: 1:03 - loss: 1.8663 - regression_loss: 1.5454 - classification_loss: 0.3209 245/500 [=============>................] - ETA: 1:03 - loss: 1.8663 - regression_loss: 1.5453 - classification_loss: 0.3210 246/500 [=============>................] - ETA: 1:03 - loss: 1.8662 - regression_loss: 1.5454 - classification_loss: 0.3208 247/500 [=============>................] - ETA: 1:03 - loss: 1.8633 - regression_loss: 1.5431 - classification_loss: 0.3201 248/500 [=============>................] - ETA: 1:02 - loss: 1.8645 - regression_loss: 1.5443 - classification_loss: 0.3203 249/500 [=============>................] - ETA: 1:02 - loss: 1.8628 - regression_loss: 1.5424 - classification_loss: 0.3204 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8638 - regression_loss: 1.5432 - classification_loss: 0.3206 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8641 - regression_loss: 1.5436 - classification_loss: 0.3205 252/500 [==============>...............] - ETA: 1:01 - loss: 1.8642 - regression_loss: 1.5437 - classification_loss: 0.3205 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8648 - regression_loss: 1.5442 - classification_loss: 0.3206 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.8652 - regression_loss: 1.5444 - classification_loss: 0.3208 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8659 - regression_loss: 1.5451 - classification_loss: 0.3208 256/500 [==============>...............] - ETA: 1:00 - loss: 1.8680 - regression_loss: 1.5471 - classification_loss: 0.3209 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8720 - regression_loss: 1.5504 - classification_loss: 0.3216 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8726 - regression_loss: 1.5510 - classification_loss: 0.3216 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8746 - regression_loss: 1.5509 - classification_loss: 0.3238 260/500 [==============>...............] - ETA: 59s - loss: 1.8734 - regression_loss: 1.5500 - classification_loss: 0.3233  261/500 [==============>...............] - ETA: 59s - loss: 1.8730 - regression_loss: 1.5500 - classification_loss: 0.3230 262/500 [==============>...............] - ETA: 59s - loss: 1.8702 - regression_loss: 1.5474 - classification_loss: 0.3228 263/500 [==============>...............] - ETA: 59s - loss: 1.8696 - regression_loss: 1.5473 - classification_loss: 0.3223 264/500 [==============>...............] - ETA: 58s - loss: 1.8698 - regression_loss: 1.5477 - classification_loss: 0.3222 265/500 [==============>...............] - ETA: 58s - loss: 1.8706 - regression_loss: 1.5481 - classification_loss: 0.3225 266/500 [==============>...............] - ETA: 58s - loss: 1.8708 - regression_loss: 1.5483 - classification_loss: 0.3225 267/500 [===============>..............] - ETA: 58s - loss: 1.8724 - regression_loss: 1.5494 - classification_loss: 0.3230 268/500 [===============>..............] - ETA: 57s - loss: 1.8742 - regression_loss: 1.5508 - classification_loss: 0.3234 269/500 [===============>..............] - ETA: 57s - loss: 1.8743 - regression_loss: 1.5510 - classification_loss: 0.3233 270/500 [===============>..............] 
- ETA: 57s - loss: 1.8749 - regression_loss: 1.5514 - classification_loss: 0.3235 271/500 [===============>..............] - ETA: 57s - loss: 1.8745 - regression_loss: 1.5513 - classification_loss: 0.3232 272/500 [===============>..............] - ETA: 56s - loss: 1.8766 - regression_loss: 1.5526 - classification_loss: 0.3240 273/500 [===============>..............] - ETA: 56s - loss: 1.8763 - regression_loss: 1.5525 - classification_loss: 0.3238 274/500 [===============>..............] - ETA: 56s - loss: 1.8747 - regression_loss: 1.5511 - classification_loss: 0.3236 275/500 [===============>..............] - ETA: 56s - loss: 1.8728 - regression_loss: 1.5495 - classification_loss: 0.3234 276/500 [===============>..............] - ETA: 55s - loss: 1.8718 - regression_loss: 1.5487 - classification_loss: 0.3231 277/500 [===============>..............] - ETA: 55s - loss: 1.8704 - regression_loss: 1.5471 - classification_loss: 0.3233 278/500 [===============>..............] - ETA: 55s - loss: 1.8680 - regression_loss: 1.5450 - classification_loss: 0.3229 279/500 [===============>..............] - ETA: 55s - loss: 1.8665 - regression_loss: 1.5432 - classification_loss: 0.3233 280/500 [===============>..............] - ETA: 54s - loss: 1.8671 - regression_loss: 1.5436 - classification_loss: 0.3235 281/500 [===============>..............] - ETA: 54s - loss: 1.8681 - regression_loss: 1.5443 - classification_loss: 0.3238 282/500 [===============>..............] - ETA: 54s - loss: 1.8698 - regression_loss: 1.5456 - classification_loss: 0.3243 283/500 [===============>..............] - ETA: 54s - loss: 1.8694 - regression_loss: 1.5454 - classification_loss: 0.3240 284/500 [================>.............] - ETA: 53s - loss: 1.8702 - regression_loss: 1.5463 - classification_loss: 0.3240 285/500 [================>.............] - ETA: 53s - loss: 1.8706 - regression_loss: 1.5467 - classification_loss: 0.3239 286/500 [================>.............] 
- ETA: 53s - loss: 1.8714 - regression_loss: 1.5475 - classification_loss: 0.3239 287/500 [================>.............] - ETA: 53s - loss: 1.8730 - regression_loss: 1.5487 - classification_loss: 0.3243 288/500 [================>.............] - ETA: 52s - loss: 1.8727 - regression_loss: 1.5486 - classification_loss: 0.3241 289/500 [================>.............] - ETA: 52s - loss: 1.8694 - regression_loss: 1.5460 - classification_loss: 0.3234 290/500 [================>.............] - ETA: 52s - loss: 1.8705 - regression_loss: 1.5468 - classification_loss: 0.3237 291/500 [================>.............] - ETA: 52s - loss: 1.8722 - regression_loss: 1.5479 - classification_loss: 0.3242 292/500 [================>.............] - ETA: 51s - loss: 1.8706 - regression_loss: 1.5467 - classification_loss: 0.3238 293/500 [================>.............] - ETA: 51s - loss: 1.8674 - regression_loss: 1.5441 - classification_loss: 0.3232 294/500 [================>.............] - ETA: 51s - loss: 1.8682 - regression_loss: 1.5450 - classification_loss: 0.3232 295/500 [================>.............] - ETA: 51s - loss: 1.8686 - regression_loss: 1.5453 - classification_loss: 0.3233 296/500 [================>.............] - ETA: 50s - loss: 1.8684 - regression_loss: 1.5451 - classification_loss: 0.3233 297/500 [================>.............] - ETA: 50s - loss: 1.8682 - regression_loss: 1.5444 - classification_loss: 0.3238 298/500 [================>.............] - ETA: 50s - loss: 1.8673 - regression_loss: 1.5432 - classification_loss: 0.3242 299/500 [================>.............] - ETA: 50s - loss: 1.8672 - regression_loss: 1.5432 - classification_loss: 0.3240 300/500 [=================>............] - ETA: 49s - loss: 1.8687 - regression_loss: 1.5440 - classification_loss: 0.3247 301/500 [=================>............] - ETA: 49s - loss: 1.8693 - regression_loss: 1.5446 - classification_loss: 0.3246 302/500 [=================>............] 
- ETA: 49s - loss: 1.8695 - regression_loss: 1.5448 - classification_loss: 0.3247 303/500 [=================>............] - ETA: 49s - loss: 1.8685 - regression_loss: 1.5440 - classification_loss: 0.3245 304/500 [=================>............] - ETA: 48s - loss: 1.8683 - regression_loss: 1.5436 - classification_loss: 0.3246 305/500 [=================>............] - ETA: 48s - loss: 1.8687 - regression_loss: 1.5443 - classification_loss: 0.3245 306/500 [=================>............] - ETA: 48s - loss: 1.8700 - regression_loss: 1.5450 - classification_loss: 0.3249 307/500 [=================>............] - ETA: 48s - loss: 1.8664 - regression_loss: 1.5422 - classification_loss: 0.3242 308/500 [=================>............] - ETA: 48s - loss: 1.8649 - regression_loss: 1.5410 - classification_loss: 0.3239 309/500 [=================>............] - ETA: 47s - loss: 1.8660 - regression_loss: 1.5418 - classification_loss: 0.3242 310/500 [=================>............] - ETA: 47s - loss: 1.8665 - regression_loss: 1.5423 - classification_loss: 0.3243 311/500 [=================>............] - ETA: 47s - loss: 1.8671 - regression_loss: 1.5425 - classification_loss: 0.3246 312/500 [=================>............] - ETA: 47s - loss: 1.8667 - regression_loss: 1.5421 - classification_loss: 0.3246 313/500 [=================>............] - ETA: 46s - loss: 1.8695 - regression_loss: 1.5441 - classification_loss: 0.3254 314/500 [=================>............] - ETA: 46s - loss: 1.8699 - regression_loss: 1.5444 - classification_loss: 0.3256 315/500 [=================>............] - ETA: 46s - loss: 1.8700 - regression_loss: 1.5443 - classification_loss: 0.3257 316/500 [=================>............] - ETA: 45s - loss: 1.8722 - regression_loss: 1.5462 - classification_loss: 0.3259 317/500 [==================>...........] - ETA: 45s - loss: 1.8689 - regression_loss: 1.5437 - classification_loss: 0.3252 318/500 [==================>...........] 
- ETA: 45s - loss: 1.8669 - regression_loss: 1.5420 - classification_loss: 0.3248 319/500 [==================>...........] - ETA: 45s - loss: 1.8653 - regression_loss: 1.5408 - classification_loss: 0.3244 320/500 [==================>...........] - ETA: 44s - loss: 1.8616 - regression_loss: 1.5377 - classification_loss: 0.3239 321/500 [==================>...........] - ETA: 44s - loss: 1.8597 - regression_loss: 1.5360 - classification_loss: 0.3236 322/500 [==================>...........] - ETA: 44s - loss: 1.8598 - regression_loss: 1.5362 - classification_loss: 0.3236 323/500 [==================>...........] - ETA: 44s - loss: 1.8606 - regression_loss: 1.5367 - classification_loss: 0.3238 324/500 [==================>...........] - ETA: 43s - loss: 1.8615 - regression_loss: 1.5374 - classification_loss: 0.3241 325/500 [==================>...........] - ETA: 43s - loss: 1.8625 - regression_loss: 1.5383 - classification_loss: 0.3242 326/500 [==================>...........] - ETA: 43s - loss: 1.8628 - regression_loss: 1.5385 - classification_loss: 0.3243 327/500 [==================>...........] - ETA: 43s - loss: 1.8614 - regression_loss: 1.5375 - classification_loss: 0.3239 328/500 [==================>...........] - ETA: 42s - loss: 1.8610 - regression_loss: 1.5373 - classification_loss: 0.3237 329/500 [==================>...........] - ETA: 42s - loss: 1.8604 - regression_loss: 1.5374 - classification_loss: 0.3230 330/500 [==================>...........] - ETA: 42s - loss: 1.8603 - regression_loss: 1.5375 - classification_loss: 0.3228 331/500 [==================>...........] - ETA: 42s - loss: 1.8620 - regression_loss: 1.5389 - classification_loss: 0.3231 332/500 [==================>...........] - ETA: 41s - loss: 1.8616 - regression_loss: 1.5384 - classification_loss: 0.3232 333/500 [==================>...........] - ETA: 41s - loss: 1.8627 - regression_loss: 1.5390 - classification_loss: 0.3237 334/500 [===================>..........] 
- ETA: 41s - loss: 1.8629 - regression_loss: 1.5391 - classification_loss: 0.3238 335/500 [===================>..........] - ETA: 41s - loss: 1.8653 - regression_loss: 1.5405 - classification_loss: 0.3248 336/500 [===================>..........] - ETA: 40s - loss: 1.8650 - regression_loss: 1.5399 - classification_loss: 0.3251 337/500 [===================>..........] - ETA: 40s - loss: 1.8672 - regression_loss: 1.5410 - classification_loss: 0.3263 338/500 [===================>..........] - ETA: 40s - loss: 1.8680 - regression_loss: 1.5415 - classification_loss: 0.3265 339/500 [===================>..........] - ETA: 40s - loss: 1.8692 - regression_loss: 1.5426 - classification_loss: 0.3266 340/500 [===================>..........] - ETA: 39s - loss: 1.8682 - regression_loss: 1.5420 - classification_loss: 0.3262 341/500 [===================>..........] - ETA: 39s - loss: 1.8657 - regression_loss: 1.5397 - classification_loss: 0.3260 342/500 [===================>..........] - ETA: 39s - loss: 1.8648 - regression_loss: 1.5391 - classification_loss: 0.3258 343/500 [===================>..........] - ETA: 39s - loss: 1.8654 - regression_loss: 1.5394 - classification_loss: 0.3260 344/500 [===================>..........] - ETA: 38s - loss: 1.8656 - regression_loss: 1.5396 - classification_loss: 0.3261 345/500 [===================>..........] - ETA: 38s - loss: 1.8653 - regression_loss: 1.5393 - classification_loss: 0.3260 346/500 [===================>..........] - ETA: 38s - loss: 1.8633 - regression_loss: 1.5378 - classification_loss: 0.3255 347/500 [===================>..........] - ETA: 38s - loss: 1.8598 - regression_loss: 1.5349 - classification_loss: 0.3249 348/500 [===================>..........] - ETA: 37s - loss: 1.8596 - regression_loss: 1.5349 - classification_loss: 0.3247 349/500 [===================>..........] - ETA: 37s - loss: 1.8574 - regression_loss: 1.5332 - classification_loss: 0.3241 350/500 [====================>.........] 
- ETA: 37s - loss: 1.8579 - regression_loss: 1.5338 - classification_loss: 0.3241 351/500 [====================>.........] - ETA: 37s - loss: 1.8557 - regression_loss: 1.5320 - classification_loss: 0.3237 352/500 [====================>.........] - ETA: 36s - loss: 1.8556 - regression_loss: 1.5321 - classification_loss: 0.3235 353/500 [====================>.........] - ETA: 36s - loss: 1.8559 - regression_loss: 1.5323 - classification_loss: 0.3236 354/500 [====================>.........] - ETA: 36s - loss: 1.8554 - regression_loss: 1.5322 - classification_loss: 0.3232 355/500 [====================>.........] - ETA: 36s - loss: 1.8566 - regression_loss: 1.5329 - classification_loss: 0.3237 356/500 [====================>.........] - ETA: 35s - loss: 1.8569 - regression_loss: 1.5334 - classification_loss: 0.3235 357/500 [====================>.........] - ETA: 35s - loss: 1.8585 - regression_loss: 1.5347 - classification_loss: 0.3239 358/500 [====================>.........] - ETA: 35s - loss: 1.8583 - regression_loss: 1.5344 - classification_loss: 0.3239 359/500 [====================>.........] - ETA: 35s - loss: 1.8589 - regression_loss: 1.5348 - classification_loss: 0.3242 360/500 [====================>.........] - ETA: 34s - loss: 1.8600 - regression_loss: 1.5355 - classification_loss: 0.3244 361/500 [====================>.........] - ETA: 34s - loss: 1.8604 - regression_loss: 1.5359 - classification_loss: 0.3245 362/500 [====================>.........] - ETA: 34s - loss: 1.8607 - regression_loss: 1.5361 - classification_loss: 0.3247 363/500 [====================>.........] - ETA: 34s - loss: 1.8598 - regression_loss: 1.5353 - classification_loss: 0.3245 364/500 [====================>.........] - ETA: 33s - loss: 1.8611 - regression_loss: 1.5363 - classification_loss: 0.3249 365/500 [====================>.........] - ETA: 33s - loss: 1.8612 - regression_loss: 1.5364 - classification_loss: 0.3249 366/500 [====================>.........] 
- ETA: 33s - loss: 1.8595 - regression_loss: 1.5348 - classification_loss: 0.3247 367/500 [=====================>........] - ETA: 33s - loss: 1.8622 - regression_loss: 1.5373 - classification_loss: 0.3249 368/500 [=====================>........] - ETA: 32s - loss: 1.8652 - regression_loss: 1.5399 - classification_loss: 0.3253 369/500 [=====================>........] - ETA: 32s - loss: 1.8654 - regression_loss: 1.5402 - classification_loss: 0.3252 370/500 [=====================>........] - ETA: 32s - loss: 1.8629 - regression_loss: 1.5380 - classification_loss: 0.3249 371/500 [=====================>........] - ETA: 32s - loss: 1.8640 - regression_loss: 1.5387 - classification_loss: 0.3253 372/500 [=====================>........] - ETA: 31s - loss: 1.8631 - regression_loss: 1.5378 - classification_loss: 0.3253 373/500 [=====================>........] - ETA: 31s - loss: 1.8633 - regression_loss: 1.5379 - classification_loss: 0.3254 374/500 [=====================>........] - ETA: 31s - loss: 1.8623 - regression_loss: 1.5371 - classification_loss: 0.3252 375/500 [=====================>........] - ETA: 31s - loss: 1.8594 - regression_loss: 1.5347 - classification_loss: 0.3247 376/500 [=====================>........] - ETA: 30s - loss: 1.8588 - regression_loss: 1.5341 - classification_loss: 0.3247 377/500 [=====================>........] - ETA: 30s - loss: 1.8592 - regression_loss: 1.5346 - classification_loss: 0.3247 378/500 [=====================>........] - ETA: 30s - loss: 1.8579 - regression_loss: 1.5334 - classification_loss: 0.3245 379/500 [=====================>........] - ETA: 30s - loss: 1.8585 - regression_loss: 1.5338 - classification_loss: 0.3247 380/500 [=====================>........] - ETA: 29s - loss: 1.8578 - regression_loss: 1.5333 - classification_loss: 0.3245 381/500 [=====================>........] - ETA: 29s - loss: 1.8579 - regression_loss: 1.5334 - classification_loss: 0.3245 382/500 [=====================>........] 
- ETA: 29s - loss: 1.8572 - regression_loss: 1.5326 - classification_loss: 0.3246 383/500 [=====================>........] - ETA: 29s - loss: 1.8576 - regression_loss: 1.5329 - classification_loss: 0.3247 384/500 [======================>.......] - ETA: 28s - loss: 1.8585 - regression_loss: 1.5337 - classification_loss: 0.3248 385/500 [======================>.......] - ETA: 28s - loss: 1.8567 - regression_loss: 1.5319 - classification_loss: 0.3247 386/500 [======================>.......] - ETA: 28s - loss: 1.8597 - regression_loss: 1.5339 - classification_loss: 0.3258 387/500 [======================>.......] - ETA: 28s - loss: 1.8596 - regression_loss: 1.5333 - classification_loss: 0.3262 388/500 [======================>.......] - ETA: 27s - loss: 1.8599 - regression_loss: 1.5337 - classification_loss: 0.3263 389/500 [======================>.......] - ETA: 27s - loss: 1.8588 - regression_loss: 1.5326 - classification_loss: 0.3262 390/500 [======================>.......] - ETA: 27s - loss: 1.8582 - regression_loss: 1.5321 - classification_loss: 0.3261 391/500 [======================>.......] - ETA: 27s - loss: 1.8567 - regression_loss: 1.5309 - classification_loss: 0.3258 392/500 [======================>.......] - ETA: 26s - loss: 1.8570 - regression_loss: 1.5311 - classification_loss: 0.3259 393/500 [======================>.......] - ETA: 26s - loss: 1.8581 - regression_loss: 1.5320 - classification_loss: 0.3261 394/500 [======================>.......] - ETA: 26s - loss: 1.8584 - regression_loss: 1.5322 - classification_loss: 0.3262 395/500 [======================>.......] - ETA: 26s - loss: 1.8590 - regression_loss: 1.5326 - classification_loss: 0.3264 396/500 [======================>.......] - ETA: 25s - loss: 1.8615 - regression_loss: 1.5349 - classification_loss: 0.3267 397/500 [======================>.......] - ETA: 25s - loss: 1.8619 - regression_loss: 1.5354 - classification_loss: 0.3265 398/500 [======================>.......] 
- ETA: 25s - loss: 1.8623 - regression_loss: 1.5353 - classification_loss: 0.3269 399/500 [======================>.......] - ETA: 25s - loss: 1.8616 - regression_loss: 1.5348 - classification_loss: 0.3267 400/500 [=======================>......] - ETA: 24s - loss: 1.8607 - regression_loss: 1.5340 - classification_loss: 0.3267 401/500 [=======================>......] - ETA: 24s - loss: 1.8620 - regression_loss: 1.5352 - classification_loss: 0.3268 402/500 [=======================>......] - ETA: 24s - loss: 1.8623 - regression_loss: 1.5356 - classification_loss: 0.3268 403/500 [=======================>......] - ETA: 24s - loss: 1.8615 - regression_loss: 1.5349 - classification_loss: 0.3265 404/500 [=======================>......] - ETA: 23s - loss: 1.8627 - regression_loss: 1.5359 - classification_loss: 0.3268 405/500 [=======================>......] - ETA: 23s - loss: 1.8626 - regression_loss: 1.5358 - classification_loss: 0.3268 406/500 [=======================>......] - ETA: 23s - loss: 1.8636 - regression_loss: 1.5367 - classification_loss: 0.3269 407/500 [=======================>......] - ETA: 23s - loss: 1.8625 - regression_loss: 1.5359 - classification_loss: 0.3267 408/500 [=======================>......] - ETA: 22s - loss: 1.8625 - regression_loss: 1.5357 - classification_loss: 0.3267 409/500 [=======================>......] - ETA: 22s - loss: 1.8618 - regression_loss: 1.5352 - classification_loss: 0.3266 410/500 [=======================>......] - ETA: 22s - loss: 1.8629 - regression_loss: 1.5360 - classification_loss: 0.3268 411/500 [=======================>......] - ETA: 22s - loss: 1.8628 - regression_loss: 1.5360 - classification_loss: 0.3267 412/500 [=======================>......] - ETA: 21s - loss: 1.8630 - regression_loss: 1.5361 - classification_loss: 0.3269 413/500 [=======================>......] - ETA: 21s - loss: 1.8638 - regression_loss: 1.5368 - classification_loss: 0.3269 414/500 [=======================>......] 
- ETA: 21s - loss: 1.8634 - regression_loss: 1.5367 - classification_loss: 0.3268 415/500 [=======================>......] - ETA: 21s - loss: 1.8630 - regression_loss: 1.5362 - classification_loss: 0.3268 416/500 [=======================>......] - ETA: 20s - loss: 1.8620 - regression_loss: 1.5354 - classification_loss: 0.3266 417/500 [========================>.....] - ETA: 20s - loss: 1.8618 - regression_loss: 1.5352 - classification_loss: 0.3265 418/500 [========================>.....] - ETA: 20s - loss: 1.8601 - regression_loss: 1.5339 - classification_loss: 0.3262 419/500 [========================>.....] - ETA: 20s - loss: 1.8601 - regression_loss: 1.5339 - classification_loss: 0.3263 420/500 [========================>.....] - ETA: 19s - loss: 1.8591 - regression_loss: 1.5331 - classification_loss: 0.3260 421/500 [========================>.....] - ETA: 19s - loss: 1.8596 - regression_loss: 1.5334 - classification_loss: 0.3262 422/500 [========================>.....] - ETA: 19s - loss: 1.8584 - regression_loss: 1.5325 - classification_loss: 0.3259 423/500 [========================>.....] - ETA: 19s - loss: 1.8574 - regression_loss: 1.5318 - classification_loss: 0.3257 424/500 [========================>.....] - ETA: 18s - loss: 1.8551 - regression_loss: 1.5299 - classification_loss: 0.3252 425/500 [========================>.....] - ETA: 18s - loss: 1.8526 - regression_loss: 1.5279 - classification_loss: 0.3247 426/500 [========================>.....] - ETA: 18s - loss: 1.8523 - regression_loss: 1.5275 - classification_loss: 0.3248 427/500 [========================>.....] - ETA: 18s - loss: 1.8528 - regression_loss: 1.5277 - classification_loss: 0.3251 428/500 [========================>.....] - ETA: 17s - loss: 1.8511 - regression_loss: 1.5265 - classification_loss: 0.3246 429/500 [========================>.....] - ETA: 17s - loss: 1.8520 - regression_loss: 1.5270 - classification_loss: 0.3249 430/500 [========================>.....] 
- ETA: 17s - loss: 1.8510 - regression_loss: 1.5263 - classification_loss: 0.3247 431/500 [========================>.....] - ETA: 17s - loss: 1.8515 - regression_loss: 1.5268 - classification_loss: 0.3248 432/500 [========================>.....] - ETA: 16s - loss: 1.8534 - regression_loss: 1.5283 - classification_loss: 0.3250 433/500 [========================>.....] - ETA: 16s - loss: 1.8537 - regression_loss: 1.5287 - classification_loss: 0.3250 434/500 [=========================>....] - ETA: 16s - loss: 1.8541 - regression_loss: 1.5291 - classification_loss: 0.3250 435/500 [=========================>....] - ETA: 16s - loss: 1.8543 - regression_loss: 1.5293 - classification_loss: 0.3249 436/500 [=========================>....] - ETA: 15s - loss: 1.8539 - regression_loss: 1.5292 - classification_loss: 0.3246 437/500 [=========================>....] - ETA: 15s - loss: 1.8546 - regression_loss: 1.5300 - classification_loss: 0.3246 438/500 [=========================>....] - ETA: 15s - loss: 1.8547 - regression_loss: 1.5301 - classification_loss: 0.3246 439/500 [=========================>....] - ETA: 15s - loss: 1.8551 - regression_loss: 1.5303 - classification_loss: 0.3247 440/500 [=========================>....] - ETA: 14s - loss: 1.8581 - regression_loss: 1.5330 - classification_loss: 0.3251 441/500 [=========================>....] - ETA: 14s - loss: 1.8580 - regression_loss: 1.5328 - classification_loss: 0.3252 442/500 [=========================>....] - ETA: 14s - loss: 1.8579 - regression_loss: 1.5327 - classification_loss: 0.3252 443/500 [=========================>....] - ETA: 14s - loss: 1.8566 - regression_loss: 1.5311 - classification_loss: 0.3255 444/500 [=========================>....] - ETA: 13s - loss: 1.8570 - regression_loss: 1.5315 - classification_loss: 0.3255 445/500 [=========================>....] - ETA: 13s - loss: 1.8570 - regression_loss: 1.5315 - classification_loss: 0.3255 446/500 [=========================>....] 
- ETA: 13s - loss: 1.8574 - regression_loss: 1.5320 - classification_loss: 0.3254
[epoch 52, batches 447-499 condensed: loss held near 1.86-1.87 throughout]
500/500 [==============================] - 125s 250ms/step - loss: 1.8718 - regression_loss: 1.5424 - classification_loss: 0.3294
1172 instances of class plum with average precision: 0.5857
mAP: 0.5857
Epoch 00052: saving model to ./training/snapshots/resnet50_pascal_52.h5
Epoch 53/150
1/500 [..............................] - ETA: 1:48 - loss: 1.7522 - regression_loss: 1.4720 - classification_loss: 0.2803
[epoch 53, batches 2-8 condensed: loss climbed to ~2.24 by batch 3, then eased to ~2.14]
9/500 [..............................]
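The end-of-epoch summary lines in a log like this (final losses, per-class average precision, mAP, snapshot path) are easy to pull out programmatically rather than by scrolling. A minimal sketch, assuming the Keras progress-bar format shown above; `parse_epoch_summary` and its regexes are illustrative helpers, not part of keras-retinanet:

```python
import re

# Final-batch summary line, e.g.:
# "500/500 [=====] - 125s 250ms/step - loss: 1.8718 - regression_loss: 1.5424 - classification_loss: 0.3294"
SUMMARY_RE = re.compile(
    r"500/500 \[=+\].*?loss: (?P<loss>[\d.]+) "
    r"- regression_loss: (?P<reg>[\d.]+) "
    r"- classification_loss: (?P<cls>[\d.]+)"
)
# Evaluation callback output, e.g. "mAP: 0.5857"
MAP_RE = re.compile(r"mAP: (?P<map>[\d.]+)")

def parse_epoch_summary(log_text: str) -> dict:
    """Return the completed epoch's final losses and mAP found in log_text."""
    summary = {}
    m = SUMMARY_RE.search(log_text)
    if m:
        summary.update(
            loss=float(m.group("loss")),
            regression_loss=float(m.group("reg")),
            classification_loss=float(m.group("cls")),
        )
    m = MAP_RE.search(log_text)
    if m:
        summary["mAP"] = float(m.group("map"))
    return summary

sample = (
    "500/500 [==============================] - 125s 250ms/step "
    "- loss: 1.8718 - regression_loss: 1.5424 - classification_loss: 0.3294\n"
    "1172 instances of class plum with average precision: 0.5857\n"
    "mAP: 0.5857\n"
)
print(parse_epoch_summary(sample))
```

Splitting the full log on `"Epoch "` headers first and applying this per chunk would give a per-epoch metrics table for plotting training curves.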
- ETA: 2:01 - loss: 2.0346 - regression_loss: 1.6374 - classification_loss: 0.3972
[epoch 53, batches 10-280 condensed: loss peaked around 2.01 near batch 33, then settled into the 1.87-1.95 range]
281/500 [===============>..............]
- ETA: 54s - loss: 1.8864 - regression_loss: 1.5456 - classification_loss: 0.3408 282/500 [===============>..............] - ETA: 54s - loss: 1.8858 - regression_loss: 1.5451 - classification_loss: 0.3407 283/500 [===============>..............] - ETA: 53s - loss: 1.8843 - regression_loss: 1.5440 - classification_loss: 0.3403 284/500 [================>.............] - ETA: 53s - loss: 1.8849 - regression_loss: 1.5448 - classification_loss: 0.3402 285/500 [================>.............] - ETA: 53s - loss: 1.8846 - regression_loss: 1.5445 - classification_loss: 0.3401 286/500 [================>.............] - ETA: 53s - loss: 1.8838 - regression_loss: 1.5438 - classification_loss: 0.3400 287/500 [================>.............] - ETA: 52s - loss: 1.8839 - regression_loss: 1.5442 - classification_loss: 0.3397 288/500 [================>.............] - ETA: 52s - loss: 1.8833 - regression_loss: 1.5437 - classification_loss: 0.3396 289/500 [================>.............] - ETA: 52s - loss: 1.8841 - regression_loss: 1.5441 - classification_loss: 0.3400 290/500 [================>.............] - ETA: 52s - loss: 1.8849 - regression_loss: 1.5442 - classification_loss: 0.3407 291/500 [================>.............] - ETA: 51s - loss: 1.8855 - regression_loss: 1.5448 - classification_loss: 0.3407 292/500 [================>.............] - ETA: 51s - loss: 1.8863 - regression_loss: 1.5454 - classification_loss: 0.3409 293/500 [================>.............] - ETA: 51s - loss: 1.8865 - regression_loss: 1.5456 - classification_loss: 0.3409 294/500 [================>.............] - ETA: 51s - loss: 1.8862 - regression_loss: 1.5454 - classification_loss: 0.3408 295/500 [================>.............] - ETA: 50s - loss: 1.8852 - regression_loss: 1.5447 - classification_loss: 0.3405 296/500 [================>.............] - ETA: 50s - loss: 1.8839 - regression_loss: 1.5435 - classification_loss: 0.3404 297/500 [================>.............] 
- ETA: 50s - loss: 1.8844 - regression_loss: 1.5440 - classification_loss: 0.3404 298/500 [================>.............] - ETA: 50s - loss: 1.8833 - regression_loss: 1.5433 - classification_loss: 0.3400 299/500 [================>.............] - ETA: 49s - loss: 1.8842 - regression_loss: 1.5444 - classification_loss: 0.3398 300/500 [=================>............] - ETA: 49s - loss: 1.8800 - regression_loss: 1.5407 - classification_loss: 0.3392 301/500 [=================>............] - ETA: 49s - loss: 1.8768 - regression_loss: 1.5380 - classification_loss: 0.3388 302/500 [=================>............] - ETA: 49s - loss: 1.8732 - regression_loss: 1.5352 - classification_loss: 0.3380 303/500 [=================>............] - ETA: 49s - loss: 1.8730 - regression_loss: 1.5350 - classification_loss: 0.3379 304/500 [=================>............] - ETA: 48s - loss: 1.8728 - regression_loss: 1.5348 - classification_loss: 0.3381 305/500 [=================>............] - ETA: 48s - loss: 1.8729 - regression_loss: 1.5347 - classification_loss: 0.3382 306/500 [=================>............] - ETA: 48s - loss: 1.8734 - regression_loss: 1.5350 - classification_loss: 0.3384 307/500 [=================>............] - ETA: 48s - loss: 1.8708 - regression_loss: 1.5330 - classification_loss: 0.3378 308/500 [=================>............] - ETA: 47s - loss: 1.8713 - regression_loss: 1.5333 - classification_loss: 0.3379 309/500 [=================>............] - ETA: 47s - loss: 1.8723 - regression_loss: 1.5345 - classification_loss: 0.3379 310/500 [=================>............] - ETA: 47s - loss: 1.8715 - regression_loss: 1.5340 - classification_loss: 0.3375 311/500 [=================>............] - ETA: 47s - loss: 1.8712 - regression_loss: 1.5339 - classification_loss: 0.3372 312/500 [=================>............] - ETA: 46s - loss: 1.8724 - regression_loss: 1.5349 - classification_loss: 0.3375 313/500 [=================>............] 
- ETA: 46s - loss: 1.8748 - regression_loss: 1.5370 - classification_loss: 0.3378 314/500 [=================>............] - ETA: 46s - loss: 1.8724 - regression_loss: 1.5355 - classification_loss: 0.3369 315/500 [=================>............] - ETA: 46s - loss: 1.8697 - regression_loss: 1.5332 - classification_loss: 0.3365 316/500 [=================>............] - ETA: 45s - loss: 1.8700 - regression_loss: 1.5336 - classification_loss: 0.3364 317/500 [==================>...........] - ETA: 45s - loss: 1.8712 - regression_loss: 1.5346 - classification_loss: 0.3366 318/500 [==================>...........] - ETA: 45s - loss: 1.8716 - regression_loss: 1.5350 - classification_loss: 0.3366 319/500 [==================>...........] - ETA: 45s - loss: 1.8698 - regression_loss: 1.5335 - classification_loss: 0.3363 320/500 [==================>...........] - ETA: 44s - loss: 1.8694 - regression_loss: 1.5333 - classification_loss: 0.3361 321/500 [==================>...........] - ETA: 44s - loss: 1.8713 - regression_loss: 1.5339 - classification_loss: 0.3374 322/500 [==================>...........] - ETA: 44s - loss: 1.8717 - regression_loss: 1.5344 - classification_loss: 0.3373 323/500 [==================>...........] - ETA: 44s - loss: 1.8735 - regression_loss: 1.5356 - classification_loss: 0.3379 324/500 [==================>...........] - ETA: 43s - loss: 1.8715 - regression_loss: 1.5340 - classification_loss: 0.3375 325/500 [==================>...........] - ETA: 43s - loss: 1.8711 - regression_loss: 1.5338 - classification_loss: 0.3373 326/500 [==================>...........] - ETA: 43s - loss: 1.8702 - regression_loss: 1.5333 - classification_loss: 0.3370 327/500 [==================>...........] - ETA: 43s - loss: 1.8696 - regression_loss: 1.5329 - classification_loss: 0.3367 328/500 [==================>...........] - ETA: 42s - loss: 1.8701 - regression_loss: 1.5334 - classification_loss: 0.3367 329/500 [==================>...........] 
- ETA: 42s - loss: 1.8689 - regression_loss: 1.5325 - classification_loss: 0.3364 330/500 [==================>...........] - ETA: 42s - loss: 1.8681 - regression_loss: 1.5319 - classification_loss: 0.3362 331/500 [==================>...........] - ETA: 42s - loss: 1.8664 - regression_loss: 1.5308 - classification_loss: 0.3356 332/500 [==================>...........] - ETA: 41s - loss: 1.8663 - regression_loss: 1.5307 - classification_loss: 0.3356 333/500 [==================>...........] - ETA: 41s - loss: 1.8672 - regression_loss: 1.5315 - classification_loss: 0.3356 334/500 [===================>..........] - ETA: 41s - loss: 1.8652 - regression_loss: 1.5300 - classification_loss: 0.3352 335/500 [===================>..........] - ETA: 41s - loss: 1.8647 - regression_loss: 1.5294 - classification_loss: 0.3353 336/500 [===================>..........] - ETA: 40s - loss: 1.8663 - regression_loss: 1.5300 - classification_loss: 0.3363 337/500 [===================>..........] - ETA: 40s - loss: 1.8656 - regression_loss: 1.5298 - classification_loss: 0.3358 338/500 [===================>..........] - ETA: 40s - loss: 1.8671 - regression_loss: 1.5310 - classification_loss: 0.3361 339/500 [===================>..........] - ETA: 40s - loss: 1.8656 - regression_loss: 1.5297 - classification_loss: 0.3358 340/500 [===================>..........] - ETA: 39s - loss: 1.8641 - regression_loss: 1.5283 - classification_loss: 0.3358 341/500 [===================>..........] - ETA: 39s - loss: 1.8629 - regression_loss: 1.5275 - classification_loss: 0.3354 342/500 [===================>..........] - ETA: 39s - loss: 1.8642 - regression_loss: 1.5289 - classification_loss: 0.3353 343/500 [===================>..........] - ETA: 39s - loss: 1.8634 - regression_loss: 1.5282 - classification_loss: 0.3352 344/500 [===================>..........] - ETA: 38s - loss: 1.8636 - regression_loss: 1.5284 - classification_loss: 0.3352 345/500 [===================>..........] 
- ETA: 38s - loss: 1.8622 - regression_loss: 1.5273 - classification_loss: 0.3349 346/500 [===================>..........] - ETA: 38s - loss: 1.8615 - regression_loss: 1.5269 - classification_loss: 0.3346 347/500 [===================>..........] - ETA: 38s - loss: 1.8621 - regression_loss: 1.5275 - classification_loss: 0.3346 348/500 [===================>..........] - ETA: 37s - loss: 1.8656 - regression_loss: 1.5307 - classification_loss: 0.3349 349/500 [===================>..........] - ETA: 37s - loss: 1.8635 - regression_loss: 1.5288 - classification_loss: 0.3348 350/500 [====================>.........] - ETA: 37s - loss: 1.8636 - regression_loss: 1.5290 - classification_loss: 0.3346 351/500 [====================>.........] - ETA: 37s - loss: 1.8637 - regression_loss: 1.5293 - classification_loss: 0.3344 352/500 [====================>.........] - ETA: 36s - loss: 1.8627 - regression_loss: 1.5284 - classification_loss: 0.3343 353/500 [====================>.........] - ETA: 36s - loss: 1.8620 - regression_loss: 1.5280 - classification_loss: 0.3340 354/500 [====================>.........] - ETA: 36s - loss: 1.8599 - regression_loss: 1.5258 - classification_loss: 0.3341 355/500 [====================>.........] - ETA: 36s - loss: 1.8625 - regression_loss: 1.5282 - classification_loss: 0.3343 356/500 [====================>.........] - ETA: 35s - loss: 1.8627 - regression_loss: 1.5285 - classification_loss: 0.3342 357/500 [====================>.........] - ETA: 35s - loss: 1.8596 - regression_loss: 1.5260 - classification_loss: 0.3336 358/500 [====================>.........] - ETA: 35s - loss: 1.8586 - regression_loss: 1.5252 - classification_loss: 0.3335 359/500 [====================>.........] - ETA: 35s - loss: 1.8578 - regression_loss: 1.5246 - classification_loss: 0.3332 360/500 [====================>.........] - ETA: 34s - loss: 1.8590 - regression_loss: 1.5256 - classification_loss: 0.3334 361/500 [====================>.........] 
- ETA: 34s - loss: 1.8609 - regression_loss: 1.5273 - classification_loss: 0.3336 362/500 [====================>.........] - ETA: 34s - loss: 1.8632 - regression_loss: 1.5290 - classification_loss: 0.3342 363/500 [====================>.........] - ETA: 34s - loss: 1.8620 - regression_loss: 1.5282 - classification_loss: 0.3338 364/500 [====================>.........] - ETA: 33s - loss: 1.8609 - regression_loss: 1.5274 - classification_loss: 0.3335 365/500 [====================>.........] - ETA: 33s - loss: 1.8611 - regression_loss: 1.5276 - classification_loss: 0.3335 366/500 [====================>.........] - ETA: 33s - loss: 1.8629 - regression_loss: 1.5292 - classification_loss: 0.3338 367/500 [=====================>........] - ETA: 33s - loss: 1.8639 - regression_loss: 1.5300 - classification_loss: 0.3339 368/500 [=====================>........] - ETA: 32s - loss: 1.8640 - regression_loss: 1.5301 - classification_loss: 0.3339 369/500 [=====================>........] - ETA: 32s - loss: 1.8649 - regression_loss: 1.5311 - classification_loss: 0.3338 370/500 [=====================>........] - ETA: 32s - loss: 1.8667 - regression_loss: 1.5328 - classification_loss: 0.3339 371/500 [=====================>........] - ETA: 32s - loss: 1.8693 - regression_loss: 1.5342 - classification_loss: 0.3351 372/500 [=====================>........] - ETA: 31s - loss: 1.8698 - regression_loss: 1.5343 - classification_loss: 0.3356 373/500 [=====================>........] - ETA: 31s - loss: 1.8695 - regression_loss: 1.5339 - classification_loss: 0.3356 374/500 [=====================>........] - ETA: 31s - loss: 1.8697 - regression_loss: 1.5340 - classification_loss: 0.3357 375/500 [=====================>........] - ETA: 31s - loss: 1.8701 - regression_loss: 1.5344 - classification_loss: 0.3357 376/500 [=====================>........] - ETA: 30s - loss: 1.8689 - regression_loss: 1.5331 - classification_loss: 0.3358 377/500 [=====================>........] 
- ETA: 30s - loss: 1.8694 - regression_loss: 1.5336 - classification_loss: 0.3358 378/500 [=====================>........] - ETA: 30s - loss: 1.8680 - regression_loss: 1.5325 - classification_loss: 0.3355 379/500 [=====================>........] - ETA: 30s - loss: 1.8684 - regression_loss: 1.5329 - classification_loss: 0.3355 380/500 [=====================>........] - ETA: 29s - loss: 1.8683 - regression_loss: 1.5330 - classification_loss: 0.3352 381/500 [=====================>........] - ETA: 29s - loss: 1.8676 - regression_loss: 1.5326 - classification_loss: 0.3350 382/500 [=====================>........] - ETA: 29s - loss: 1.8675 - regression_loss: 1.5326 - classification_loss: 0.3350 383/500 [=====================>........] - ETA: 29s - loss: 1.8671 - regression_loss: 1.5322 - classification_loss: 0.3349 384/500 [======================>.......] - ETA: 28s - loss: 1.8663 - regression_loss: 1.5317 - classification_loss: 0.3346 385/500 [======================>.......] - ETA: 28s - loss: 1.8672 - regression_loss: 1.5317 - classification_loss: 0.3355 386/500 [======================>.......] - ETA: 28s - loss: 1.8681 - regression_loss: 1.5324 - classification_loss: 0.3357 387/500 [======================>.......] - ETA: 28s - loss: 1.8690 - regression_loss: 1.5332 - classification_loss: 0.3359 388/500 [======================>.......] - ETA: 27s - loss: 1.8698 - regression_loss: 1.5338 - classification_loss: 0.3360 389/500 [======================>.......] - ETA: 27s - loss: 1.8709 - regression_loss: 1.5345 - classification_loss: 0.3364 390/500 [======================>.......] - ETA: 27s - loss: 1.8735 - regression_loss: 1.5369 - classification_loss: 0.3366 391/500 [======================>.......] - ETA: 27s - loss: 1.8738 - regression_loss: 1.5372 - classification_loss: 0.3365 392/500 [======================>.......] - ETA: 26s - loss: 1.8728 - regression_loss: 1.5363 - classification_loss: 0.3365 393/500 [======================>.......] 
- ETA: 26s - loss: 1.8712 - regression_loss: 1.5350 - classification_loss: 0.3362 394/500 [======================>.......] - ETA: 26s - loss: 1.8720 - regression_loss: 1.5358 - classification_loss: 0.3362 395/500 [======================>.......] - ETA: 26s - loss: 1.8720 - regression_loss: 1.5359 - classification_loss: 0.3361 396/500 [======================>.......] - ETA: 25s - loss: 1.8712 - regression_loss: 1.5349 - classification_loss: 0.3363 397/500 [======================>.......] - ETA: 25s - loss: 1.8713 - regression_loss: 1.5350 - classification_loss: 0.3363 398/500 [======================>.......] - ETA: 25s - loss: 1.8722 - regression_loss: 1.5357 - classification_loss: 0.3365 399/500 [======================>.......] - ETA: 25s - loss: 1.8726 - regression_loss: 1.5359 - classification_loss: 0.3366 400/500 [=======================>......] - ETA: 24s - loss: 1.8738 - regression_loss: 1.5368 - classification_loss: 0.3370 401/500 [=======================>......] - ETA: 24s - loss: 1.8734 - regression_loss: 1.5366 - classification_loss: 0.3369 402/500 [=======================>......] - ETA: 24s - loss: 1.8732 - regression_loss: 1.5364 - classification_loss: 0.3367 403/500 [=======================>......] - ETA: 24s - loss: 1.8735 - regression_loss: 1.5369 - classification_loss: 0.3366 404/500 [=======================>......] - ETA: 23s - loss: 1.8722 - regression_loss: 1.5358 - classification_loss: 0.3364 405/500 [=======================>......] - ETA: 23s - loss: 1.8722 - regression_loss: 1.5361 - classification_loss: 0.3361 406/500 [=======================>......] - ETA: 23s - loss: 1.8701 - regression_loss: 1.5344 - classification_loss: 0.3357 407/500 [=======================>......] - ETA: 23s - loss: 1.8699 - regression_loss: 1.5343 - classification_loss: 0.3356 408/500 [=======================>......] - ETA: 22s - loss: 1.8710 - regression_loss: 1.5354 - classification_loss: 0.3356 409/500 [=======================>......] 
- ETA: 22s - loss: 1.8698 - regression_loss: 1.5346 - classification_loss: 0.3353 410/500 [=======================>......] - ETA: 22s - loss: 1.8709 - regression_loss: 1.5353 - classification_loss: 0.3355 411/500 [=======================>......] - ETA: 22s - loss: 1.8705 - regression_loss: 1.5348 - classification_loss: 0.3357 412/500 [=======================>......] - ETA: 21s - loss: 1.8695 - regression_loss: 1.5341 - classification_loss: 0.3355 413/500 [=======================>......] - ETA: 21s - loss: 1.8739 - regression_loss: 1.5384 - classification_loss: 0.3356 414/500 [=======================>......] - ETA: 21s - loss: 1.8716 - regression_loss: 1.5363 - classification_loss: 0.3353 415/500 [=======================>......] - ETA: 21s - loss: 1.8725 - regression_loss: 1.5370 - classification_loss: 0.3355 416/500 [=======================>......] - ETA: 20s - loss: 1.8728 - regression_loss: 1.5374 - classification_loss: 0.3354 417/500 [========================>.....] - ETA: 20s - loss: 1.8712 - regression_loss: 1.5362 - classification_loss: 0.3350 418/500 [========================>.....] - ETA: 20s - loss: 1.8711 - regression_loss: 1.5360 - classification_loss: 0.3350 419/500 [========================>.....] - ETA: 20s - loss: 1.8702 - regression_loss: 1.5353 - classification_loss: 0.3349 420/500 [========================>.....] - ETA: 19s - loss: 1.8707 - regression_loss: 1.5356 - classification_loss: 0.3350 421/500 [========================>.....] - ETA: 19s - loss: 1.8711 - regression_loss: 1.5362 - classification_loss: 0.3350 422/500 [========================>.....] - ETA: 19s - loss: 1.8715 - regression_loss: 1.5363 - classification_loss: 0.3352 423/500 [========================>.....] - ETA: 19s - loss: 1.8701 - regression_loss: 1.5351 - classification_loss: 0.3350 424/500 [========================>.....] - ETA: 18s - loss: 1.8700 - regression_loss: 1.5352 - classification_loss: 0.3348 425/500 [========================>.....] 
- ETA: 18s - loss: 1.8703 - regression_loss: 1.5356 - classification_loss: 0.3347 426/500 [========================>.....] - ETA: 18s - loss: 1.8705 - regression_loss: 1.5359 - classification_loss: 0.3346 427/500 [========================>.....] - ETA: 18s - loss: 1.8704 - regression_loss: 1.5358 - classification_loss: 0.3345 428/500 [========================>.....] - ETA: 17s - loss: 1.8711 - regression_loss: 1.5366 - classification_loss: 0.3345 429/500 [========================>.....] - ETA: 17s - loss: 1.8714 - regression_loss: 1.5368 - classification_loss: 0.3346 430/500 [========================>.....] - ETA: 17s - loss: 1.8716 - regression_loss: 1.5369 - classification_loss: 0.3347 431/500 [========================>.....] - ETA: 17s - loss: 1.8724 - regression_loss: 1.5377 - classification_loss: 0.3347 432/500 [========================>.....] - ETA: 16s - loss: 1.8706 - regression_loss: 1.5363 - classification_loss: 0.3343 433/500 [========================>.....] - ETA: 16s - loss: 1.8708 - regression_loss: 1.5366 - classification_loss: 0.3342 434/500 [=========================>....] - ETA: 16s - loss: 1.8714 - regression_loss: 1.5373 - classification_loss: 0.3341 435/500 [=========================>....] - ETA: 16s - loss: 1.8715 - regression_loss: 1.5375 - classification_loss: 0.3340 436/500 [=========================>....] - ETA: 15s - loss: 1.8713 - regression_loss: 1.5374 - classification_loss: 0.3339 437/500 [=========================>....] - ETA: 15s - loss: 1.8717 - regression_loss: 1.5379 - classification_loss: 0.3338 438/500 [=========================>....] - ETA: 15s - loss: 1.8706 - regression_loss: 1.5366 - classification_loss: 0.3340 439/500 [=========================>....] - ETA: 15s - loss: 1.8719 - regression_loss: 1.5378 - classification_loss: 0.3341 440/500 [=========================>....] - ETA: 14s - loss: 1.8717 - regression_loss: 1.5377 - classification_loss: 0.3340 441/500 [=========================>....] 
- ETA: 14s - loss: 1.8714 - regression_loss: 1.5376 - classification_loss: 0.3338 442/500 [=========================>....] - ETA: 14s - loss: 1.8706 - regression_loss: 1.5371 - classification_loss: 0.3335 443/500 [=========================>....] - ETA: 14s - loss: 1.8714 - regression_loss: 1.5379 - classification_loss: 0.3335 444/500 [=========================>....] - ETA: 13s - loss: 1.8684 - regression_loss: 1.5353 - classification_loss: 0.3331 445/500 [=========================>....] - ETA: 13s - loss: 1.8708 - regression_loss: 1.5374 - classification_loss: 0.3334 446/500 [=========================>....] - ETA: 13s - loss: 1.8708 - regression_loss: 1.5376 - classification_loss: 0.3332 447/500 [=========================>....] - ETA: 13s - loss: 1.8715 - regression_loss: 1.5383 - classification_loss: 0.3332 448/500 [=========================>....] - ETA: 12s - loss: 1.8711 - regression_loss: 1.5380 - classification_loss: 0.3331 449/500 [=========================>....] - ETA: 12s - loss: 1.8725 - regression_loss: 1.5387 - classification_loss: 0.3338 450/500 [==========================>...] - ETA: 12s - loss: 1.8730 - regression_loss: 1.5392 - classification_loss: 0.3338 451/500 [==========================>...] - ETA: 12s - loss: 1.8742 - regression_loss: 1.5402 - classification_loss: 0.3340 452/500 [==========================>...] - ETA: 11s - loss: 1.8731 - regression_loss: 1.5393 - classification_loss: 0.3337 453/500 [==========================>...] - ETA: 11s - loss: 1.8738 - regression_loss: 1.5401 - classification_loss: 0.3338 454/500 [==========================>...] - ETA: 11s - loss: 1.8744 - regression_loss: 1.5405 - classification_loss: 0.3339 455/500 [==========================>...] - ETA: 11s - loss: 1.8743 - regression_loss: 1.5405 - classification_loss: 0.3337 456/500 [==========================>...] - ETA: 10s - loss: 1.8749 - regression_loss: 1.5410 - classification_loss: 0.3339 457/500 [==========================>...] 
- ETA: 10s - loss: 1.8765 - regression_loss: 1.5424 - classification_loss: 0.3342 458/500 [==========================>...] - ETA: 10s - loss: 1.8753 - regression_loss: 1.5413 - classification_loss: 0.3340 459/500 [==========================>...] - ETA: 10s - loss: 1.8754 - regression_loss: 1.5415 - classification_loss: 0.3339 460/500 [==========================>...] - ETA: 9s - loss: 1.8750 - regression_loss: 1.5413 - classification_loss: 0.3337  461/500 [==========================>...] - ETA: 9s - loss: 1.8751 - regression_loss: 1.5415 - classification_loss: 0.3336 462/500 [==========================>...] - ETA: 9s - loss: 1.8750 - regression_loss: 1.5413 - classification_loss: 0.3337 463/500 [==========================>...] - ETA: 9s - loss: 1.8758 - regression_loss: 1.5421 - classification_loss: 0.3337 464/500 [==========================>...] - ETA: 8s - loss: 1.8772 - regression_loss: 1.5431 - classification_loss: 0.3341 465/500 [==========================>...] - ETA: 8s - loss: 1.8760 - regression_loss: 1.5421 - classification_loss: 0.3339 466/500 [==========================>...] - ETA: 8s - loss: 1.8755 - regression_loss: 1.5408 - classification_loss: 0.3346 467/500 [===========================>..] - ETA: 8s - loss: 1.8765 - regression_loss: 1.5416 - classification_loss: 0.3349 468/500 [===========================>..] - ETA: 7s - loss: 1.8759 - regression_loss: 1.5412 - classification_loss: 0.3347 469/500 [===========================>..] - ETA: 7s - loss: 1.8764 - regression_loss: 1.5418 - classification_loss: 0.3345 470/500 [===========================>..] - ETA: 7s - loss: 1.8766 - regression_loss: 1.5420 - classification_loss: 0.3346 471/500 [===========================>..] - ETA: 7s - loss: 1.8771 - regression_loss: 1.5424 - classification_loss: 0.3348 472/500 [===========================>..] - ETA: 6s - loss: 1.8775 - regression_loss: 1.5427 - classification_loss: 0.3347 473/500 [===========================>..] 
- ETA: 6s - loss: 1.8785 - regression_loss: 1.5436 - classification_loss: 0.3349 474/500 [===========================>..] - ETA: 6s - loss: 1.8784 - regression_loss: 1.5436 - classification_loss: 0.3348 475/500 [===========================>..] - ETA: 6s - loss: 1.8786 - regression_loss: 1.5437 - classification_loss: 0.3348 476/500 [===========================>..] - ETA: 5s - loss: 1.8775 - regression_loss: 1.5429 - classification_loss: 0.3346 477/500 [===========================>..] - ETA: 5s - loss: 1.8785 - regression_loss: 1.5437 - classification_loss: 0.3347 478/500 [===========================>..] - ETA: 5s - loss: 1.8768 - regression_loss: 1.5425 - classification_loss: 0.3343 479/500 [===========================>..] - ETA: 5s - loss: 1.8771 - regression_loss: 1.5425 - classification_loss: 0.3345 480/500 [===========================>..] - ETA: 4s - loss: 1.8780 - regression_loss: 1.5432 - classification_loss: 0.3348 481/500 [===========================>..] - ETA: 4s - loss: 1.8791 - regression_loss: 1.5442 - classification_loss: 0.3349 482/500 [===========================>..] - ETA: 4s - loss: 1.8803 - regression_loss: 1.5449 - classification_loss: 0.3354 483/500 [===========================>..] - ETA: 4s - loss: 1.8791 - regression_loss: 1.5441 - classification_loss: 0.3350 484/500 [============================>.] - ETA: 3s - loss: 1.8787 - regression_loss: 1.5437 - classification_loss: 0.3350 485/500 [============================>.] - ETA: 3s - loss: 1.8783 - regression_loss: 1.5435 - classification_loss: 0.3348 486/500 [============================>.] - ETA: 3s - loss: 1.8786 - regression_loss: 1.5438 - classification_loss: 0.3349 487/500 [============================>.] - ETA: 3s - loss: 1.8784 - regression_loss: 1.5435 - classification_loss: 0.3348 488/500 [============================>.] - ETA: 2s - loss: 1.8787 - regression_loss: 1.5438 - classification_loss: 0.3349 489/500 [============================>.] 
500/500 [==============================] - 125s 249ms/step - loss: 1.8803 - regression_loss: 1.5451 - classification_loss: 0.3352
1172 instances of class plum with average precision: 0.5722
mAP: 0.5722
Epoch 00053: saving model to ./training/snapshots/resnet50_pascal_53.h5
Epoch 54/150
[... per-batch progress updates for steps 1-322 omitted; last complete reading below ...]
323/500 [==================>...........] - ETA: 44s - loss: 1.8687 - regression_loss: 1.5408 - classification_loss: 0.3279
- ETA: 44s - loss: 1.8659 - regression_loss: 1.5384 - classification_loss: 0.3274 325/500 [==================>...........] - ETA: 43s - loss: 1.8661 - regression_loss: 1.5386 - classification_loss: 0.3275 326/500 [==================>...........] - ETA: 43s - loss: 1.8658 - regression_loss: 1.5384 - classification_loss: 0.3273 327/500 [==================>...........] - ETA: 43s - loss: 1.8654 - regression_loss: 1.5383 - classification_loss: 0.3271 328/500 [==================>...........] - ETA: 43s - loss: 1.8651 - regression_loss: 1.5375 - classification_loss: 0.3276 329/500 [==================>...........] - ETA: 42s - loss: 1.8654 - regression_loss: 1.5378 - classification_loss: 0.3276 330/500 [==================>...........] - ETA: 42s - loss: 1.8640 - regression_loss: 1.5367 - classification_loss: 0.3273 331/500 [==================>...........] - ETA: 42s - loss: 1.8639 - regression_loss: 1.5366 - classification_loss: 0.3273 332/500 [==================>...........] - ETA: 42s - loss: 1.8640 - regression_loss: 1.5368 - classification_loss: 0.3272 333/500 [==================>...........] - ETA: 41s - loss: 1.8615 - regression_loss: 1.5349 - classification_loss: 0.3266 334/500 [===================>..........] - ETA: 41s - loss: 1.8601 - regression_loss: 1.5335 - classification_loss: 0.3266 335/500 [===================>..........] - ETA: 41s - loss: 1.8636 - regression_loss: 1.5357 - classification_loss: 0.3279 336/500 [===================>..........] - ETA: 41s - loss: 1.8654 - regression_loss: 1.5372 - classification_loss: 0.3282 337/500 [===================>..........] - ETA: 40s - loss: 1.8637 - regression_loss: 1.5359 - classification_loss: 0.3277 338/500 [===================>..........] - ETA: 40s - loss: 1.8631 - regression_loss: 1.5356 - classification_loss: 0.3275 339/500 [===================>..........] - ETA: 40s - loss: 1.8640 - regression_loss: 1.5364 - classification_loss: 0.3277 340/500 [===================>..........] 
- ETA: 40s - loss: 1.8632 - regression_loss: 1.5360 - classification_loss: 0.3273 341/500 [===================>..........] - ETA: 39s - loss: 1.8628 - regression_loss: 1.5356 - classification_loss: 0.3272 342/500 [===================>..........] - ETA: 39s - loss: 1.8656 - regression_loss: 1.5380 - classification_loss: 0.3276 343/500 [===================>..........] - ETA: 39s - loss: 1.8664 - regression_loss: 1.5388 - classification_loss: 0.3276 344/500 [===================>..........] - ETA: 39s - loss: 1.8647 - regression_loss: 1.5375 - classification_loss: 0.3272 345/500 [===================>..........] - ETA: 38s - loss: 1.8657 - regression_loss: 1.5379 - classification_loss: 0.3278 346/500 [===================>..........] - ETA: 38s - loss: 1.8655 - regression_loss: 1.5380 - classification_loss: 0.3276 347/500 [===================>..........] - ETA: 38s - loss: 1.8669 - regression_loss: 1.5388 - classification_loss: 0.3281 348/500 [===================>..........] - ETA: 38s - loss: 1.8678 - regression_loss: 1.5396 - classification_loss: 0.3282 349/500 [===================>..........] - ETA: 37s - loss: 1.8675 - regression_loss: 1.5394 - classification_loss: 0.3281 350/500 [====================>.........] - ETA: 37s - loss: 1.8668 - regression_loss: 1.5389 - classification_loss: 0.3279 351/500 [====================>.........] - ETA: 37s - loss: 1.8678 - regression_loss: 1.5395 - classification_loss: 0.3283 352/500 [====================>.........] - ETA: 37s - loss: 1.8696 - regression_loss: 1.5410 - classification_loss: 0.3286 353/500 [====================>.........] - ETA: 36s - loss: 1.8723 - regression_loss: 1.5431 - classification_loss: 0.3292 354/500 [====================>.........] - ETA: 36s - loss: 1.8691 - regression_loss: 1.5405 - classification_loss: 0.3286 355/500 [====================>.........] - ETA: 36s - loss: 1.8702 - regression_loss: 1.5411 - classification_loss: 0.3291 356/500 [====================>.........] 
- ETA: 36s - loss: 1.8706 - regression_loss: 1.5414 - classification_loss: 0.3291 357/500 [====================>.........] - ETA: 35s - loss: 1.8709 - regression_loss: 1.5419 - classification_loss: 0.3290 358/500 [====================>.........] - ETA: 35s - loss: 1.8711 - regression_loss: 1.5421 - classification_loss: 0.3290 359/500 [====================>.........] - ETA: 35s - loss: 1.8701 - regression_loss: 1.5411 - classification_loss: 0.3289 360/500 [====================>.........] - ETA: 35s - loss: 1.8713 - regression_loss: 1.5423 - classification_loss: 0.3290 361/500 [====================>.........] - ETA: 34s - loss: 1.8717 - regression_loss: 1.5428 - classification_loss: 0.3289 362/500 [====================>.........] - ETA: 34s - loss: 1.8697 - regression_loss: 1.5410 - classification_loss: 0.3287 363/500 [====================>.........] - ETA: 34s - loss: 1.8703 - regression_loss: 1.5413 - classification_loss: 0.3291 364/500 [====================>.........] - ETA: 34s - loss: 1.8709 - regression_loss: 1.5418 - classification_loss: 0.3291 365/500 [====================>.........] - ETA: 33s - loss: 1.8712 - regression_loss: 1.5420 - classification_loss: 0.3292 366/500 [====================>.........] - ETA: 33s - loss: 1.8685 - regression_loss: 1.5401 - classification_loss: 0.3285 367/500 [=====================>........] - ETA: 33s - loss: 1.8672 - regression_loss: 1.5391 - classification_loss: 0.3281 368/500 [=====================>........] - ETA: 33s - loss: 1.8679 - regression_loss: 1.5396 - classification_loss: 0.3283 369/500 [=====================>........] - ETA: 32s - loss: 1.8656 - regression_loss: 1.5376 - classification_loss: 0.3280 370/500 [=====================>........] - ETA: 32s - loss: 1.8659 - regression_loss: 1.5379 - classification_loss: 0.3280 371/500 [=====================>........] - ETA: 32s - loss: 1.8655 - regression_loss: 1.5377 - classification_loss: 0.3278 372/500 [=====================>........] 
- ETA: 32s - loss: 1.8657 - regression_loss: 1.5379 - classification_loss: 0.3278 373/500 [=====================>........] - ETA: 31s - loss: 1.8660 - regression_loss: 1.5383 - classification_loss: 0.3277 374/500 [=====================>........] - ETA: 31s - loss: 1.8632 - regression_loss: 1.5359 - classification_loss: 0.3273 375/500 [=====================>........] - ETA: 31s - loss: 1.8659 - regression_loss: 1.5379 - classification_loss: 0.3280 376/500 [=====================>........] - ETA: 31s - loss: 1.8660 - regression_loss: 1.5380 - classification_loss: 0.3280 377/500 [=====================>........] - ETA: 30s - loss: 1.8659 - regression_loss: 1.5377 - classification_loss: 0.3282 378/500 [=====================>........] - ETA: 30s - loss: 1.8668 - regression_loss: 1.5385 - classification_loss: 0.3283 379/500 [=====================>........] - ETA: 30s - loss: 1.8669 - regression_loss: 1.5386 - classification_loss: 0.3282 380/500 [=====================>........] - ETA: 30s - loss: 1.8655 - regression_loss: 1.5375 - classification_loss: 0.3279 381/500 [=====================>........] - ETA: 29s - loss: 1.8649 - regression_loss: 1.5372 - classification_loss: 0.3277 382/500 [=====================>........] - ETA: 29s - loss: 1.8634 - regression_loss: 1.5361 - classification_loss: 0.3273 383/500 [=====================>........] - ETA: 29s - loss: 1.8629 - regression_loss: 1.5359 - classification_loss: 0.3270 384/500 [======================>.......] - ETA: 29s - loss: 1.8666 - regression_loss: 1.5383 - classification_loss: 0.3282 385/500 [======================>.......] - ETA: 28s - loss: 1.8672 - regression_loss: 1.5389 - classification_loss: 0.3283 386/500 [======================>.......] - ETA: 28s - loss: 1.8685 - regression_loss: 1.5400 - classification_loss: 0.3284 387/500 [======================>.......] - ETA: 28s - loss: 1.8677 - regression_loss: 1.5393 - classification_loss: 0.3284 388/500 [======================>.......] 
- ETA: 28s - loss: 1.8692 - regression_loss: 1.5406 - classification_loss: 0.3286 389/500 [======================>.......] - ETA: 27s - loss: 1.8662 - regression_loss: 1.5382 - classification_loss: 0.3280 390/500 [======================>.......] - ETA: 27s - loss: 1.8668 - regression_loss: 1.5387 - classification_loss: 0.3281 391/500 [======================>.......] - ETA: 27s - loss: 1.8669 - regression_loss: 1.5389 - classification_loss: 0.3280 392/500 [======================>.......] - ETA: 27s - loss: 1.8668 - regression_loss: 1.5389 - classification_loss: 0.3279 393/500 [======================>.......] - ETA: 26s - loss: 1.8686 - regression_loss: 1.5404 - classification_loss: 0.3282 394/500 [======================>.......] - ETA: 26s - loss: 1.8682 - regression_loss: 1.5401 - classification_loss: 0.3282 395/500 [======================>.......] - ETA: 26s - loss: 1.8681 - regression_loss: 1.5400 - classification_loss: 0.3280 396/500 [======================>.......] - ETA: 26s - loss: 1.8696 - regression_loss: 1.5409 - classification_loss: 0.3287 397/500 [======================>.......] - ETA: 25s - loss: 1.8700 - regression_loss: 1.5413 - classification_loss: 0.3288 398/500 [======================>.......] - ETA: 25s - loss: 1.8712 - regression_loss: 1.5422 - classification_loss: 0.3290 399/500 [======================>.......] - ETA: 25s - loss: 1.8723 - regression_loss: 1.5430 - classification_loss: 0.3293 400/500 [=======================>......] - ETA: 25s - loss: 1.8704 - regression_loss: 1.5413 - classification_loss: 0.3291 401/500 [=======================>......] - ETA: 24s - loss: 1.8710 - regression_loss: 1.5417 - classification_loss: 0.3294 402/500 [=======================>......] - ETA: 24s - loss: 1.8696 - regression_loss: 1.5404 - classification_loss: 0.3292 403/500 [=======================>......] - ETA: 24s - loss: 1.8700 - regression_loss: 1.5408 - classification_loss: 0.3292 404/500 [=======================>......] 
- ETA: 24s - loss: 1.8705 - regression_loss: 1.5412 - classification_loss: 0.3292 405/500 [=======================>......] - ETA: 23s - loss: 1.8713 - regression_loss: 1.5419 - classification_loss: 0.3294 406/500 [=======================>......] - ETA: 23s - loss: 1.8691 - regression_loss: 1.5402 - classification_loss: 0.3289 407/500 [=======================>......] - ETA: 23s - loss: 1.8686 - regression_loss: 1.5399 - classification_loss: 0.3287 408/500 [=======================>......] - ETA: 23s - loss: 1.8693 - regression_loss: 1.5405 - classification_loss: 0.3288 409/500 [=======================>......] - ETA: 22s - loss: 1.8684 - regression_loss: 1.5399 - classification_loss: 0.3286 410/500 [=======================>......] - ETA: 22s - loss: 1.8682 - regression_loss: 1.5397 - classification_loss: 0.3285 411/500 [=======================>......] - ETA: 22s - loss: 1.8672 - regression_loss: 1.5389 - classification_loss: 0.3283 412/500 [=======================>......] - ETA: 22s - loss: 1.8667 - regression_loss: 1.5387 - classification_loss: 0.3281 413/500 [=======================>......] - ETA: 21s - loss: 1.8669 - regression_loss: 1.5389 - classification_loss: 0.3281 414/500 [=======================>......] - ETA: 21s - loss: 1.8678 - regression_loss: 1.5395 - classification_loss: 0.3283 415/500 [=======================>......] - ETA: 21s - loss: 1.8677 - regression_loss: 1.5395 - classification_loss: 0.3283 416/500 [=======================>......] - ETA: 21s - loss: 1.8681 - regression_loss: 1.5399 - classification_loss: 0.3282 417/500 [========================>.....] - ETA: 20s - loss: 1.8683 - regression_loss: 1.5400 - classification_loss: 0.3283 418/500 [========================>.....] - ETA: 20s - loss: 1.8684 - regression_loss: 1.5402 - classification_loss: 0.3282 419/500 [========================>.....] - ETA: 20s - loss: 1.8694 - regression_loss: 1.5411 - classification_loss: 0.3283 420/500 [========================>.....] 
- ETA: 20s - loss: 1.8698 - regression_loss: 1.5414 - classification_loss: 0.3284 421/500 [========================>.....] - ETA: 19s - loss: 1.8679 - regression_loss: 1.5398 - classification_loss: 0.3280 422/500 [========================>.....] - ETA: 19s - loss: 1.8672 - regression_loss: 1.5393 - classification_loss: 0.3279 423/500 [========================>.....] - ETA: 19s - loss: 1.8685 - regression_loss: 1.5402 - classification_loss: 0.3283 424/500 [========================>.....] - ETA: 19s - loss: 1.8693 - regression_loss: 1.5411 - classification_loss: 0.3282 425/500 [========================>.....] - ETA: 18s - loss: 1.8690 - regression_loss: 1.5410 - classification_loss: 0.3280 426/500 [========================>.....] - ETA: 18s - loss: 1.8685 - regression_loss: 1.5406 - classification_loss: 0.3279 427/500 [========================>.....] - ETA: 18s - loss: 1.8661 - regression_loss: 1.5388 - classification_loss: 0.3273 428/500 [========================>.....] - ETA: 18s - loss: 1.8656 - regression_loss: 1.5383 - classification_loss: 0.3273 429/500 [========================>.....] - ETA: 17s - loss: 1.8653 - regression_loss: 1.5381 - classification_loss: 0.3272 430/500 [========================>.....] - ETA: 17s - loss: 1.8653 - regression_loss: 1.5381 - classification_loss: 0.3273 431/500 [========================>.....] - ETA: 17s - loss: 1.8642 - regression_loss: 1.5369 - classification_loss: 0.3273 432/500 [========================>.....] - ETA: 17s - loss: 1.8643 - regression_loss: 1.5369 - classification_loss: 0.3274 433/500 [========================>.....] - ETA: 16s - loss: 1.8637 - regression_loss: 1.5365 - classification_loss: 0.3272 434/500 [=========================>....] - ETA: 16s - loss: 1.8635 - regression_loss: 1.5364 - classification_loss: 0.3271 435/500 [=========================>....] - ETA: 16s - loss: 1.8610 - regression_loss: 1.5345 - classification_loss: 0.3265 436/500 [=========================>....] 
- ETA: 16s - loss: 1.8596 - regression_loss: 1.5334 - classification_loss: 0.3262 437/500 [=========================>....] - ETA: 15s - loss: 1.8609 - regression_loss: 1.5344 - classification_loss: 0.3265 438/500 [=========================>....] - ETA: 15s - loss: 1.8604 - regression_loss: 1.5340 - classification_loss: 0.3264 439/500 [=========================>....] - ETA: 15s - loss: 1.8596 - regression_loss: 1.5334 - classification_loss: 0.3262 440/500 [=========================>....] - ETA: 15s - loss: 1.8599 - regression_loss: 1.5336 - classification_loss: 0.3263 441/500 [=========================>....] - ETA: 14s - loss: 1.8608 - regression_loss: 1.5345 - classification_loss: 0.3263 442/500 [=========================>....] - ETA: 14s - loss: 1.8615 - regression_loss: 1.5351 - classification_loss: 0.3264 443/500 [=========================>....] - ETA: 14s - loss: 1.8617 - regression_loss: 1.5354 - classification_loss: 0.3264 444/500 [=========================>....] - ETA: 14s - loss: 1.8618 - regression_loss: 1.5357 - classification_loss: 0.3262 445/500 [=========================>....] - ETA: 13s - loss: 1.8626 - regression_loss: 1.5364 - classification_loss: 0.3262 446/500 [=========================>....] - ETA: 13s - loss: 1.8626 - regression_loss: 1.5365 - classification_loss: 0.3261 447/500 [=========================>....] - ETA: 13s - loss: 1.8646 - regression_loss: 1.5379 - classification_loss: 0.3266 448/500 [=========================>....] - ETA: 13s - loss: 1.8648 - regression_loss: 1.5380 - classification_loss: 0.3267 449/500 [=========================>....] - ETA: 12s - loss: 1.8656 - regression_loss: 1.5386 - classification_loss: 0.3270 450/500 [==========================>...] - ETA: 12s - loss: 1.8644 - regression_loss: 1.5379 - classification_loss: 0.3265 451/500 [==========================>...] - ETA: 12s - loss: 1.8653 - regression_loss: 1.5385 - classification_loss: 0.3268 452/500 [==========================>...] 
- ETA: 12s - loss: 1.8639 - regression_loss: 1.5373 - classification_loss: 0.3266 453/500 [==========================>...] - ETA: 11s - loss: 1.8636 - regression_loss: 1.5372 - classification_loss: 0.3264 454/500 [==========================>...] - ETA: 11s - loss: 1.8635 - regression_loss: 1.5372 - classification_loss: 0.3263 455/500 [==========================>...] - ETA: 11s - loss: 1.8632 - regression_loss: 1.5371 - classification_loss: 0.3261 456/500 [==========================>...] - ETA: 11s - loss: 1.8629 - regression_loss: 1.5371 - classification_loss: 0.3258 457/500 [==========================>...] - ETA: 10s - loss: 1.8628 - regression_loss: 1.5371 - classification_loss: 0.3257 458/500 [==========================>...] - ETA: 10s - loss: 1.8629 - regression_loss: 1.5373 - classification_loss: 0.3256 459/500 [==========================>...] - ETA: 10s - loss: 1.8621 - regression_loss: 1.5367 - classification_loss: 0.3254 460/500 [==========================>...] - ETA: 10s - loss: 1.8593 - regression_loss: 1.5341 - classification_loss: 0.3252 461/500 [==========================>...] - ETA: 9s - loss: 1.8578 - regression_loss: 1.5330 - classification_loss: 0.3248  462/500 [==========================>...] - ETA: 9s - loss: 1.8577 - regression_loss: 1.5330 - classification_loss: 0.3247 463/500 [==========================>...] - ETA: 9s - loss: 1.8574 - regression_loss: 1.5328 - classification_loss: 0.3246 464/500 [==========================>...] - ETA: 9s - loss: 1.8581 - regression_loss: 1.5334 - classification_loss: 0.3247 465/500 [==========================>...] - ETA: 8s - loss: 1.8586 - regression_loss: 1.5337 - classification_loss: 0.3248 466/500 [==========================>...] - ETA: 8s - loss: 1.8594 - regression_loss: 1.5345 - classification_loss: 0.3249 467/500 [===========================>..] - ETA: 8s - loss: 1.8584 - regression_loss: 1.5333 - classification_loss: 0.3252 468/500 [===========================>..] 
- ETA: 8s - loss: 1.8592 - regression_loss: 1.5340 - classification_loss: 0.3253 469/500 [===========================>..] - ETA: 7s - loss: 1.8583 - regression_loss: 1.5332 - classification_loss: 0.3250 470/500 [===========================>..] - ETA: 7s - loss: 1.8585 - regression_loss: 1.5336 - classification_loss: 0.3249 471/500 [===========================>..] - ETA: 7s - loss: 1.8584 - regression_loss: 1.5336 - classification_loss: 0.3248 472/500 [===========================>..] - ETA: 7s - loss: 1.8567 - regression_loss: 1.5322 - classification_loss: 0.3245 473/500 [===========================>..] - ETA: 6s - loss: 1.8586 - regression_loss: 1.5339 - classification_loss: 0.3247 474/500 [===========================>..] - ETA: 6s - loss: 1.8581 - regression_loss: 1.5335 - classification_loss: 0.3246 475/500 [===========================>..] - ETA: 6s - loss: 1.8579 - regression_loss: 1.5333 - classification_loss: 0.3246 476/500 [===========================>..] - ETA: 6s - loss: 1.8590 - regression_loss: 1.5344 - classification_loss: 0.3246 477/500 [===========================>..] - ETA: 5s - loss: 1.8574 - regression_loss: 1.5331 - classification_loss: 0.3242 478/500 [===========================>..] - ETA: 5s - loss: 1.8571 - regression_loss: 1.5330 - classification_loss: 0.3241 479/500 [===========================>..] - ETA: 5s - loss: 1.8582 - regression_loss: 1.5338 - classification_loss: 0.3244 480/500 [===========================>..] - ETA: 5s - loss: 1.8583 - regression_loss: 1.5338 - classification_loss: 0.3245 481/500 [===========================>..] - ETA: 4s - loss: 1.8588 - regression_loss: 1.5343 - classification_loss: 0.3245 482/500 [===========================>..] - ETA: 4s - loss: 1.8576 - regression_loss: 1.5328 - classification_loss: 0.3247 483/500 [===========================>..] - ETA: 4s - loss: 1.8577 - regression_loss: 1.5330 - classification_loss: 0.3247 484/500 [============================>.] 
- ETA: 4s - loss: 1.8586 - regression_loss: 1.5338 - classification_loss: 0.3248 485/500 [============================>.] - ETA: 3s - loss: 1.8591 - regression_loss: 1.5344 - classification_loss: 0.3246 486/500 [============================>.] - ETA: 3s - loss: 1.8594 - regression_loss: 1.5347 - classification_loss: 0.3247 487/500 [============================>.] - ETA: 3s - loss: 1.8575 - regression_loss: 1.5332 - classification_loss: 0.3243 488/500 [============================>.] - ETA: 3s - loss: 1.8574 - regression_loss: 1.5333 - classification_loss: 0.3241 489/500 [============================>.] - ETA: 2s - loss: 1.8567 - regression_loss: 1.5328 - classification_loss: 0.3239 490/500 [============================>.] - ETA: 2s - loss: 1.8568 - regression_loss: 1.5329 - classification_loss: 0.3239 491/500 [============================>.] - ETA: 2s - loss: 1.8579 - regression_loss: 1.5339 - classification_loss: 0.3240 492/500 [============================>.] - ETA: 2s - loss: 1.8581 - regression_loss: 1.5342 - classification_loss: 0.3239 493/500 [============================>.] - ETA: 1s - loss: 1.8586 - regression_loss: 1.5346 - classification_loss: 0.3240 494/500 [============================>.] - ETA: 1s - loss: 1.8582 - regression_loss: 1.5342 - classification_loss: 0.3239 495/500 [============================>.] - ETA: 1s - loss: 1.8586 - regression_loss: 1.5346 - classification_loss: 0.3240 496/500 [============================>.] - ETA: 1s - loss: 1.8592 - regression_loss: 1.5352 - classification_loss: 0.3240 497/500 [============================>.] - ETA: 0s - loss: 1.8592 - regression_loss: 1.5351 - classification_loss: 0.3241 498/500 [============================>.] - ETA: 0s - loss: 1.8575 - regression_loss: 1.5337 - classification_loss: 0.3238 499/500 [============================>.] 
- ETA: 0s - loss: 1.8569 - regression_loss: 1.5331 - classification_loss: 0.3238 500/500 [==============================] - 125s 250ms/step - loss: 1.8578 - regression_loss: 1.5338 - classification_loss: 0.3240 1172 instances of class plum with average precision: 0.5671 mAP: 0.5671 Epoch 00054: saving model to ./training/snapshots/resnet50_pascal_54.h5 Epoch 55/150 1/500 [..............................] - ETA: 1:58 - loss: 1.9703 - regression_loss: 1.6047 - classification_loss: 0.3656 2/500 [..............................] - ETA: 2:01 - loss: 2.2910 - regression_loss: 1.8517 - classification_loss: 0.4393 3/500 [..............................] - ETA: 2:02 - loss: 2.3007 - regression_loss: 1.8498 - classification_loss: 0.4510 4/500 [..............................] - ETA: 2:02 - loss: 2.1911 - regression_loss: 1.7688 - classification_loss: 0.4223 5/500 [..............................] - ETA: 2:03 - loss: 2.0923 - regression_loss: 1.6793 - classification_loss: 0.4129 6/500 [..............................] - ETA: 2:03 - loss: 2.1296 - regression_loss: 1.7197 - classification_loss: 0.4100 7/500 [..............................] - ETA: 2:03 - loss: 2.1707 - regression_loss: 1.7395 - classification_loss: 0.4312 8/500 [..............................] - ETA: 2:03 - loss: 2.1681 - regression_loss: 1.7437 - classification_loss: 0.4243 9/500 [..............................] - ETA: 2:03 - loss: 2.1699 - regression_loss: 1.7531 - classification_loss: 0.4168 10/500 [..............................] - ETA: 2:02 - loss: 2.1612 - regression_loss: 1.7496 - classification_loss: 0.4116 11/500 [..............................] - ETA: 2:02 - loss: 2.1329 - regression_loss: 1.7362 - classification_loss: 0.3967 12/500 [..............................] - ETA: 2:02 - loss: 2.0257 - regression_loss: 1.6470 - classification_loss: 0.3787 13/500 [..............................] - ETA: 2:02 - loss: 2.0371 - regression_loss: 1.6612 - classification_loss: 0.3759 14/500 [..............................] 
- ETA: 2:01 - loss: 1.9809 - regression_loss: 1.6177 - classification_loss: 0.3632
[... per-step progress output for steps 15-93 of epoch 55 trimmed ...]
94/500 [====>.........................]
- ETA: 1:42 - loss: 1.8781 - regression_loss: 1.5478 - classification_loss: 0.3304 95/500 [====>.........................] - ETA: 1:41 - loss: 1.8679 - regression_loss: 1.5398 - classification_loss: 0.3281 96/500 [====>.........................] - ETA: 1:41 - loss: 1.8723 - regression_loss: 1.5431 - classification_loss: 0.3292 97/500 [====>.........................] - ETA: 1:41 - loss: 1.8689 - regression_loss: 1.5403 - classification_loss: 0.3286 98/500 [====>.........................] - ETA: 1:41 - loss: 1.8583 - regression_loss: 1.5309 - classification_loss: 0.3275 99/500 [====>.........................] - ETA: 1:40 - loss: 1.8605 - regression_loss: 1.5325 - classification_loss: 0.3281 100/500 [=====>........................] - ETA: 1:40 - loss: 1.8606 - regression_loss: 1.5325 - classification_loss: 0.3281 101/500 [=====>........................] - ETA: 1:40 - loss: 1.8647 - regression_loss: 1.5346 - classification_loss: 0.3301 102/500 [=====>........................] - ETA: 1:40 - loss: 1.8583 - regression_loss: 1.5285 - classification_loss: 0.3298 103/500 [=====>........................] - ETA: 1:40 - loss: 1.8576 - regression_loss: 1.5277 - classification_loss: 0.3299 104/500 [=====>........................] - ETA: 1:39 - loss: 1.8572 - regression_loss: 1.5274 - classification_loss: 0.3298 105/500 [=====>........................] - ETA: 1:39 - loss: 1.8615 - regression_loss: 1.5302 - classification_loss: 0.3313 106/500 [=====>........................] - ETA: 1:38 - loss: 1.8528 - regression_loss: 1.5241 - classification_loss: 0.3287 107/500 [=====>........................] - ETA: 1:38 - loss: 1.8632 - regression_loss: 1.5323 - classification_loss: 0.3309 108/500 [=====>........................] - ETA: 1:38 - loss: 1.8674 - regression_loss: 1.5351 - classification_loss: 0.3323 109/500 [=====>........................] - ETA: 1:38 - loss: 1.8652 - regression_loss: 1.5345 - classification_loss: 0.3308 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.8630 - regression_loss: 1.5326 - classification_loss: 0.3304 111/500 [=====>........................] - ETA: 1:37 - loss: 1.8629 - regression_loss: 1.5321 - classification_loss: 0.3308 112/500 [=====>........................] - ETA: 1:37 - loss: 1.8642 - regression_loss: 1.5334 - classification_loss: 0.3308 113/500 [=====>........................] - ETA: 1:37 - loss: 1.8637 - regression_loss: 1.5330 - classification_loss: 0.3307 114/500 [=====>........................] - ETA: 1:36 - loss: 1.8663 - regression_loss: 1.5355 - classification_loss: 0.3307 115/500 [=====>........................] - ETA: 1:36 - loss: 1.8652 - regression_loss: 1.5346 - classification_loss: 0.3306 116/500 [=====>........................] - ETA: 1:36 - loss: 1.8675 - regression_loss: 1.5362 - classification_loss: 0.3313 117/500 [======>.......................] - ETA: 1:36 - loss: 1.8692 - regression_loss: 1.5380 - classification_loss: 0.3312 118/500 [======>.......................] - ETA: 1:35 - loss: 1.8632 - regression_loss: 1.5333 - classification_loss: 0.3299 119/500 [======>.......................] - ETA: 1:35 - loss: 1.8679 - regression_loss: 1.5372 - classification_loss: 0.3307 120/500 [======>.......................] - ETA: 1:35 - loss: 1.8668 - regression_loss: 1.5356 - classification_loss: 0.3312 121/500 [======>.......................] - ETA: 1:35 - loss: 1.8677 - regression_loss: 1.5362 - classification_loss: 0.3315 122/500 [======>.......................] - ETA: 1:34 - loss: 1.8668 - regression_loss: 1.5357 - classification_loss: 0.3310 123/500 [======>.......................] - ETA: 1:34 - loss: 1.8677 - regression_loss: 1.5368 - classification_loss: 0.3309 124/500 [======>.......................] - ETA: 1:34 - loss: 1.8756 - regression_loss: 1.5437 - classification_loss: 0.3319 125/500 [======>.......................] - ETA: 1:34 - loss: 1.8748 - regression_loss: 1.5432 - classification_loss: 0.3316 126/500 [======>.......................] 
- ETA: 1:34 - loss: 1.8776 - regression_loss: 1.5454 - classification_loss: 0.3322 127/500 [======>.......................] - ETA: 1:33 - loss: 1.8777 - regression_loss: 1.5445 - classification_loss: 0.3332 128/500 [======>.......................] - ETA: 1:33 - loss: 1.8795 - regression_loss: 1.5459 - classification_loss: 0.3336 129/500 [======>.......................] - ETA: 1:33 - loss: 1.8770 - regression_loss: 1.5444 - classification_loss: 0.3326 130/500 [======>.......................] - ETA: 1:33 - loss: 1.8803 - regression_loss: 1.5466 - classification_loss: 0.3338 131/500 [======>.......................] - ETA: 1:32 - loss: 1.8766 - regression_loss: 1.5436 - classification_loss: 0.3331 132/500 [======>.......................] - ETA: 1:32 - loss: 1.8783 - regression_loss: 1.5450 - classification_loss: 0.3333 133/500 [======>.......................] - ETA: 1:32 - loss: 1.8797 - regression_loss: 1.5462 - classification_loss: 0.3335 134/500 [=======>......................] - ETA: 1:32 - loss: 1.8786 - regression_loss: 1.5457 - classification_loss: 0.3328 135/500 [=======>......................] - ETA: 1:31 - loss: 1.8806 - regression_loss: 1.5472 - classification_loss: 0.3334 136/500 [=======>......................] - ETA: 1:31 - loss: 1.8847 - regression_loss: 1.5504 - classification_loss: 0.3343 137/500 [=======>......................] - ETA: 1:31 - loss: 1.8832 - regression_loss: 1.5496 - classification_loss: 0.3337 138/500 [=======>......................] - ETA: 1:30 - loss: 1.8806 - regression_loss: 1.5475 - classification_loss: 0.3332 139/500 [=======>......................] - ETA: 1:30 - loss: 1.8779 - regression_loss: 1.5455 - classification_loss: 0.3324 140/500 [=======>......................] - ETA: 1:30 - loss: 1.8792 - regression_loss: 1.5465 - classification_loss: 0.3327 141/500 [=======>......................] - ETA: 1:30 - loss: 1.8738 - regression_loss: 1.5419 - classification_loss: 0.3319 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.8750 - regression_loss: 1.5430 - classification_loss: 0.3320 143/500 [=======>......................] - ETA: 1:29 - loss: 1.8784 - regression_loss: 1.5457 - classification_loss: 0.3327 144/500 [=======>......................] - ETA: 1:29 - loss: 1.8788 - regression_loss: 1.5460 - classification_loss: 0.3328 145/500 [=======>......................] - ETA: 1:29 - loss: 1.8791 - regression_loss: 1.5466 - classification_loss: 0.3325 146/500 [=======>......................] - ETA: 1:28 - loss: 1.8782 - regression_loss: 1.5460 - classification_loss: 0.3322 147/500 [=======>......................] - ETA: 1:28 - loss: 1.8744 - regression_loss: 1.5427 - classification_loss: 0.3318 148/500 [=======>......................] - ETA: 1:28 - loss: 1.8809 - regression_loss: 1.5478 - classification_loss: 0.3331 149/500 [=======>......................] - ETA: 1:28 - loss: 1.8804 - regression_loss: 1.5476 - classification_loss: 0.3328 150/500 [========>.....................] - ETA: 1:27 - loss: 1.8773 - regression_loss: 1.5454 - classification_loss: 0.3319 151/500 [========>.....................] - ETA: 1:27 - loss: 1.8835 - regression_loss: 1.5512 - classification_loss: 0.3323 152/500 [========>.....................] - ETA: 1:27 - loss: 1.8853 - regression_loss: 1.5526 - classification_loss: 0.3327 153/500 [========>.....................] - ETA: 1:27 - loss: 1.8861 - regression_loss: 1.5521 - classification_loss: 0.3340 154/500 [========>.....................] - ETA: 1:26 - loss: 1.8883 - regression_loss: 1.5539 - classification_loss: 0.3343 155/500 [========>.....................] - ETA: 1:26 - loss: 1.8808 - regression_loss: 1.5481 - classification_loss: 0.3327 156/500 [========>.....................] - ETA: 1:26 - loss: 1.8825 - regression_loss: 1.5493 - classification_loss: 0.3332 157/500 [========>.....................] - ETA: 1:26 - loss: 1.8813 - regression_loss: 1.5485 - classification_loss: 0.3328 158/500 [========>.....................] 
- ETA: 1:26 - loss: 1.8804 - regression_loss: 1.5477 - classification_loss: 0.3326 159/500 [========>.....................] - ETA: 1:25 - loss: 1.8766 - regression_loss: 1.5448 - classification_loss: 0.3318 160/500 [========>.....................] - ETA: 1:25 - loss: 1.8760 - regression_loss: 1.5435 - classification_loss: 0.3325 161/500 [========>.....................] - ETA: 1:25 - loss: 1.8731 - regression_loss: 1.5413 - classification_loss: 0.3318 162/500 [========>.....................] - ETA: 1:25 - loss: 1.8663 - regression_loss: 1.5356 - classification_loss: 0.3307 163/500 [========>.....................] - ETA: 1:24 - loss: 1.8656 - regression_loss: 1.5353 - classification_loss: 0.3303 164/500 [========>.....................] - ETA: 1:24 - loss: 1.8620 - regression_loss: 1.5323 - classification_loss: 0.3297 165/500 [========>.....................] - ETA: 1:24 - loss: 1.8611 - regression_loss: 1.5314 - classification_loss: 0.3297 166/500 [========>.....................] - ETA: 1:24 - loss: 1.8630 - regression_loss: 1.5326 - classification_loss: 0.3304 167/500 [=========>....................] - ETA: 1:23 - loss: 1.8589 - regression_loss: 1.5292 - classification_loss: 0.3297 168/500 [=========>....................] - ETA: 1:23 - loss: 1.8591 - regression_loss: 1.5296 - classification_loss: 0.3295 169/500 [=========>....................] - ETA: 1:23 - loss: 1.8594 - regression_loss: 1.5294 - classification_loss: 0.3299 170/500 [=========>....................] - ETA: 1:23 - loss: 1.8553 - regression_loss: 1.5256 - classification_loss: 0.3296 171/500 [=========>....................] - ETA: 1:22 - loss: 1.8529 - regression_loss: 1.5239 - classification_loss: 0.3290 172/500 [=========>....................] - ETA: 1:22 - loss: 1.8548 - regression_loss: 1.5254 - classification_loss: 0.3294 173/500 [=========>....................] - ETA: 1:22 - loss: 1.8551 - regression_loss: 1.5259 - classification_loss: 0.3292 174/500 [=========>....................] 
- ETA: 1:22 - loss: 1.8615 - regression_loss: 1.5312 - classification_loss: 0.3304 175/500 [=========>....................] - ETA: 1:21 - loss: 1.8588 - regression_loss: 1.5292 - classification_loss: 0.3296 176/500 [=========>....................] - ETA: 1:21 - loss: 1.8590 - regression_loss: 1.5297 - classification_loss: 0.3293 177/500 [=========>....................] - ETA: 1:21 - loss: 1.8600 - regression_loss: 1.5305 - classification_loss: 0.3295 178/500 [=========>....................] - ETA: 1:21 - loss: 1.8566 - regression_loss: 1.5281 - classification_loss: 0.3285 179/500 [=========>....................] - ETA: 1:20 - loss: 1.8574 - regression_loss: 1.5295 - classification_loss: 0.3278 180/500 [=========>....................] - ETA: 1:20 - loss: 1.8592 - regression_loss: 1.5311 - classification_loss: 0.3280 181/500 [=========>....................] - ETA: 1:20 - loss: 1.8620 - regression_loss: 1.5334 - classification_loss: 0.3286 182/500 [=========>....................] - ETA: 1:20 - loss: 1.8625 - regression_loss: 1.5341 - classification_loss: 0.3284 183/500 [=========>....................] - ETA: 1:19 - loss: 1.8629 - regression_loss: 1.5347 - classification_loss: 0.3282 184/500 [==========>...................] - ETA: 1:19 - loss: 1.8642 - regression_loss: 1.5359 - classification_loss: 0.3283 185/500 [==========>...................] - ETA: 1:19 - loss: 1.8639 - regression_loss: 1.5356 - classification_loss: 0.3283 186/500 [==========>...................] - ETA: 1:19 - loss: 1.8638 - regression_loss: 1.5359 - classification_loss: 0.3280 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8623 - regression_loss: 1.5349 - classification_loss: 0.3275 188/500 [==========>...................] - ETA: 1:18 - loss: 1.8616 - regression_loss: 1.5346 - classification_loss: 0.3270 189/500 [==========>...................] - ETA: 1:18 - loss: 1.8627 - regression_loss: 1.5350 - classification_loss: 0.3277 190/500 [==========>...................] 
- ETA: 1:18 - loss: 1.8590 - regression_loss: 1.5320 - classification_loss: 0.3270 191/500 [==========>...................] - ETA: 1:17 - loss: 1.8608 - regression_loss: 1.5332 - classification_loss: 0.3276 192/500 [==========>...................] - ETA: 1:17 - loss: 1.8589 - regression_loss: 1.5318 - classification_loss: 0.3271 193/500 [==========>...................] - ETA: 1:17 - loss: 1.8595 - regression_loss: 1.5323 - classification_loss: 0.3272 194/500 [==========>...................] - ETA: 1:17 - loss: 1.8603 - regression_loss: 1.5329 - classification_loss: 0.3274 195/500 [==========>...................] - ETA: 1:16 - loss: 1.8589 - regression_loss: 1.5318 - classification_loss: 0.3271 196/500 [==========>...................] - ETA: 1:16 - loss: 1.8583 - regression_loss: 1.5317 - classification_loss: 0.3267 197/500 [==========>...................] - ETA: 1:16 - loss: 1.8577 - regression_loss: 1.5313 - classification_loss: 0.3264 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8577 - regression_loss: 1.5316 - classification_loss: 0.3262 199/500 [==========>...................] - ETA: 1:15 - loss: 1.8629 - regression_loss: 1.5361 - classification_loss: 0.3268 200/500 [===========>..................] - ETA: 1:15 - loss: 1.8593 - regression_loss: 1.5333 - classification_loss: 0.3260 201/500 [===========>..................] - ETA: 1:15 - loss: 1.8586 - regression_loss: 1.5328 - classification_loss: 0.3259 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8612 - regression_loss: 1.5344 - classification_loss: 0.3267 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8640 - regression_loss: 1.5369 - classification_loss: 0.3271 204/500 [===========>..................] - ETA: 1:14 - loss: 1.8629 - regression_loss: 1.5362 - classification_loss: 0.3268 205/500 [===========>..................] - ETA: 1:14 - loss: 1.8631 - regression_loss: 1.5367 - classification_loss: 0.3263 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.8653 - regression_loss: 1.5387 - classification_loss: 0.3266 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8675 - regression_loss: 1.5403 - classification_loss: 0.3273 208/500 [===========>..................] - ETA: 1:13 - loss: 1.8685 - regression_loss: 1.5405 - classification_loss: 0.3280 209/500 [===========>..................] - ETA: 1:13 - loss: 1.8665 - regression_loss: 1.5382 - classification_loss: 0.3283 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8686 - regression_loss: 1.5400 - classification_loss: 0.3286 211/500 [===========>..................] - ETA: 1:12 - loss: 1.8716 - regression_loss: 1.5423 - classification_loss: 0.3294 212/500 [===========>..................] - ETA: 1:12 - loss: 1.8708 - regression_loss: 1.5420 - classification_loss: 0.3288 213/500 [===========>..................] - ETA: 1:12 - loss: 1.8721 - regression_loss: 1.5430 - classification_loss: 0.3291 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8734 - regression_loss: 1.5442 - classification_loss: 0.3292 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8748 - regression_loss: 1.5453 - classification_loss: 0.3295 216/500 [===========>..................] - ETA: 1:11 - loss: 1.8696 - regression_loss: 1.5402 - classification_loss: 0.3293 217/500 [============>.................] - ETA: 1:11 - loss: 1.8698 - regression_loss: 1.5408 - classification_loss: 0.3290 218/500 [============>.................] - ETA: 1:10 - loss: 1.8714 - regression_loss: 1.5423 - classification_loss: 0.3291 219/500 [============>.................] - ETA: 1:10 - loss: 1.8743 - regression_loss: 1.5446 - classification_loss: 0.3296 220/500 [============>.................] - ETA: 1:10 - loss: 1.8740 - regression_loss: 1.5443 - classification_loss: 0.3297 221/500 [============>.................] - ETA: 1:10 - loss: 1.8705 - regression_loss: 1.5418 - classification_loss: 0.3287 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.8676 - regression_loss: 1.5394 - classification_loss: 0.3283 223/500 [============>.................] - ETA: 1:09 - loss: 1.8649 - regression_loss: 1.5365 - classification_loss: 0.3284 224/500 [============>.................] - ETA: 1:09 - loss: 1.8664 - regression_loss: 1.5366 - classification_loss: 0.3298 225/500 [============>.................] - ETA: 1:09 - loss: 1.8683 - regression_loss: 1.5384 - classification_loss: 0.3300 226/500 [============>.................] - ETA: 1:08 - loss: 1.8689 - regression_loss: 1.5392 - classification_loss: 0.3297 227/500 [============>.................] - ETA: 1:08 - loss: 1.8683 - regression_loss: 1.5388 - classification_loss: 0.3295 228/500 [============>.................] - ETA: 1:08 - loss: 1.8698 - regression_loss: 1.5399 - classification_loss: 0.3299 229/500 [============>.................] - ETA: 1:08 - loss: 1.8704 - regression_loss: 1.5405 - classification_loss: 0.3299 230/500 [============>.................] - ETA: 1:07 - loss: 1.8707 - regression_loss: 1.5409 - classification_loss: 0.3298 231/500 [============>.................] - ETA: 1:07 - loss: 1.8706 - regression_loss: 1.5409 - classification_loss: 0.3296 232/500 [============>.................] - ETA: 1:07 - loss: 1.8696 - regression_loss: 1.5399 - classification_loss: 0.3298 233/500 [============>.................] - ETA: 1:07 - loss: 1.8703 - regression_loss: 1.5402 - classification_loss: 0.3302 234/500 [=============>................] - ETA: 1:06 - loss: 1.8703 - regression_loss: 1.5403 - classification_loss: 0.3301 235/500 [=============>................] - ETA: 1:06 - loss: 1.8694 - regression_loss: 1.5395 - classification_loss: 0.3298 236/500 [=============>................] - ETA: 1:06 - loss: 1.8671 - regression_loss: 1.5377 - classification_loss: 0.3294 237/500 [=============>................] - ETA: 1:06 - loss: 1.8655 - regression_loss: 1.5365 - classification_loss: 0.3289 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.8714 - regression_loss: 1.5406 - classification_loss: 0.3308 239/500 [=============>................] - ETA: 1:05 - loss: 1.8727 - regression_loss: 1.5410 - classification_loss: 0.3317 240/500 [=============>................] - ETA: 1:05 - loss: 1.8739 - regression_loss: 1.5418 - classification_loss: 0.3320 241/500 [=============>................] - ETA: 1:05 - loss: 1.8765 - regression_loss: 1.5443 - classification_loss: 0.3323 242/500 [=============>................] - ETA: 1:04 - loss: 1.8769 - regression_loss: 1.5448 - classification_loss: 0.3321 243/500 [=============>................] - ETA: 1:04 - loss: 1.8775 - regression_loss: 1.5455 - classification_loss: 0.3320 244/500 [=============>................] - ETA: 1:04 - loss: 1.8763 - regression_loss: 1.5450 - classification_loss: 0.3314 245/500 [=============>................] - ETA: 1:04 - loss: 1.8770 - regression_loss: 1.5457 - classification_loss: 0.3314 246/500 [=============>................] - ETA: 1:03 - loss: 1.8765 - regression_loss: 1.5455 - classification_loss: 0.3310 247/500 [=============>................] - ETA: 1:03 - loss: 1.8742 - regression_loss: 1.5437 - classification_loss: 0.3305 248/500 [=============>................] - ETA: 1:03 - loss: 1.8733 - regression_loss: 1.5431 - classification_loss: 0.3302 249/500 [=============>................] - ETA: 1:03 - loss: 1.8745 - regression_loss: 1.5442 - classification_loss: 0.3303 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8738 - regression_loss: 1.5438 - classification_loss: 0.3300 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8764 - regression_loss: 1.5462 - classification_loss: 0.3302 252/500 [==============>...............] - ETA: 1:02 - loss: 1.8764 - regression_loss: 1.5459 - classification_loss: 0.3304 253/500 [==============>...............] - ETA: 1:02 - loss: 1.8726 - regression_loss: 1.5430 - classification_loss: 0.3296 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.8744 - regression_loss: 1.5443 - classification_loss: 0.3301 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8716 - regression_loss: 1.5421 - classification_loss: 0.3295 256/500 [==============>...............] - ETA: 1:01 - loss: 1.8676 - regression_loss: 1.5388 - classification_loss: 0.3287 257/500 [==============>...............] - ETA: 1:01 - loss: 1.8671 - regression_loss: 1.5385 - classification_loss: 0.3286 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8677 - regression_loss: 1.5390 - classification_loss: 0.3287 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8683 - regression_loss: 1.5397 - classification_loss: 0.3286 260/500 [==============>...............] - ETA: 1:00 - loss: 1.8663 - regression_loss: 1.5378 - classification_loss: 0.3285 261/500 [==============>...............] - ETA: 1:00 - loss: 1.8661 - regression_loss: 1.5377 - classification_loss: 0.3284 262/500 [==============>...............] - ETA: 59s - loss: 1.8659 - regression_loss: 1.5377 - classification_loss: 0.3283  263/500 [==============>...............] - ETA: 59s - loss: 1.8667 - regression_loss: 1.5385 - classification_loss: 0.3282 264/500 [==============>...............] - ETA: 59s - loss: 1.8671 - regression_loss: 1.5383 - classification_loss: 0.3287 265/500 [==============>...............] - ETA: 59s - loss: 1.8658 - regression_loss: 1.5374 - classification_loss: 0.3284 266/500 [==============>...............] - ETA: 58s - loss: 1.8646 - regression_loss: 1.5365 - classification_loss: 0.3281 267/500 [===============>..............] - ETA: 58s - loss: 1.8655 - regression_loss: 1.5375 - classification_loss: 0.3280 268/500 [===============>..............] - ETA: 58s - loss: 1.8661 - regression_loss: 1.5381 - classification_loss: 0.3280 269/500 [===============>..............] - ETA: 58s - loss: 1.8680 - regression_loss: 1.5392 - classification_loss: 0.3288 270/500 [===============>..............] 
- ETA: 57s - loss: 1.8681 - regression_loss: 1.5395 - classification_loss: 0.3287 271/500 [===============>..............] - ETA: 57s - loss: 1.8701 - regression_loss: 1.5410 - classification_loss: 0.3291 272/500 [===============>..............] - ETA: 57s - loss: 1.8722 - regression_loss: 1.5431 - classification_loss: 0.3292 273/500 [===============>..............] - ETA: 57s - loss: 1.8694 - regression_loss: 1.5404 - classification_loss: 0.3290 274/500 [===============>..............] - ETA: 56s - loss: 1.8694 - regression_loss: 1.5406 - classification_loss: 0.3288 275/500 [===============>..............] - ETA: 56s - loss: 1.8684 - regression_loss: 1.5398 - classification_loss: 0.3286 276/500 [===============>..............] - ETA: 56s - loss: 1.8694 - regression_loss: 1.5405 - classification_loss: 0.3289 277/500 [===============>..............] - ETA: 56s - loss: 1.8700 - regression_loss: 1.5410 - classification_loss: 0.3290 278/500 [===============>..............] - ETA: 55s - loss: 1.8713 - regression_loss: 1.5422 - classification_loss: 0.3291 279/500 [===============>..............] - ETA: 55s - loss: 1.8714 - regression_loss: 1.5424 - classification_loss: 0.3290 280/500 [===============>..............] - ETA: 55s - loss: 1.8710 - regression_loss: 1.5418 - classification_loss: 0.3292 281/500 [===============>..............] - ETA: 55s - loss: 1.8719 - regression_loss: 1.5423 - classification_loss: 0.3295 282/500 [===============>..............] - ETA: 54s - loss: 1.8734 - regression_loss: 1.5436 - classification_loss: 0.3297 283/500 [===============>..............] - ETA: 54s - loss: 1.8723 - regression_loss: 1.5429 - classification_loss: 0.3294 284/500 [================>.............] - ETA: 54s - loss: 1.8729 - regression_loss: 1.5434 - classification_loss: 0.3295 285/500 [================>.............] - ETA: 53s - loss: 1.8714 - regression_loss: 1.5418 - classification_loss: 0.3296 286/500 [================>.............] 
- ETA: 53s - loss: 1.8699 - regression_loss: 1.5406 - classification_loss: 0.3293 287/500 [================>.............] - ETA: 53s - loss: 1.8693 - regression_loss: 1.5399 - classification_loss: 0.3294 288/500 [================>.............] - ETA: 53s - loss: 1.8711 - regression_loss: 1.5415 - classification_loss: 0.3296 289/500 [================>.............] - ETA: 52s - loss: 1.8717 - regression_loss: 1.5423 - classification_loss: 0.3294 290/500 [================>.............] - ETA: 52s - loss: 1.8733 - regression_loss: 1.5432 - classification_loss: 0.3300 291/500 [================>.............] - ETA: 52s - loss: 1.8733 - regression_loss: 1.5434 - classification_loss: 0.3299 292/500 [================>.............] - ETA: 52s - loss: 1.8723 - regression_loss: 1.5427 - classification_loss: 0.3296 293/500 [================>.............] - ETA: 51s - loss: 1.8711 - regression_loss: 1.5415 - classification_loss: 0.3295 294/500 [================>.............] - ETA: 51s - loss: 1.8735 - regression_loss: 1.5433 - classification_loss: 0.3301 295/500 [================>.............] - ETA: 51s - loss: 1.8704 - regression_loss: 1.5409 - classification_loss: 0.3294 296/500 [================>.............] - ETA: 51s - loss: 1.8701 - regression_loss: 1.5409 - classification_loss: 0.3292 297/500 [================>.............] - ETA: 50s - loss: 1.8712 - regression_loss: 1.5420 - classification_loss: 0.3292 298/500 [================>.............] - ETA: 50s - loss: 1.8713 - regression_loss: 1.5419 - classification_loss: 0.3294 299/500 [================>.............] - ETA: 50s - loss: 1.8722 - regression_loss: 1.5426 - classification_loss: 0.3296 300/500 [=================>............] - ETA: 50s - loss: 1.8739 - regression_loss: 1.5440 - classification_loss: 0.3298 301/500 [=================>............] - ETA: 49s - loss: 1.8755 - regression_loss: 1.5451 - classification_loss: 0.3304 302/500 [=================>............] 
500/500 [==============================] - 125s 251ms/step - loss: 1.8623 - regression_loss: 1.5332 - classification_loss: 0.3292
1172 instances of class plum with average precision: 0.5697
mAP: 0.5697
Epoch 00055: saving model to ./training/snapshots/resnet50_pascal_55.h5
Epoch 56/150
- ETA: 1:31 - loss: 1.8433 - regression_loss: 1.5033 - classification_loss: 0.3399 138/500 [=======>......................] - ETA: 1:31 - loss: 1.8440 - regression_loss: 1.5034 - classification_loss: 0.3407 139/500 [=======>......................] - ETA: 1:30 - loss: 1.8445 - regression_loss: 1.5042 - classification_loss: 0.3404 140/500 [=======>......................] - ETA: 1:30 - loss: 1.8365 - regression_loss: 1.4974 - classification_loss: 0.3391 141/500 [=======>......................] - ETA: 1:30 - loss: 1.8336 - regression_loss: 1.4955 - classification_loss: 0.3381 142/500 [=======>......................] - ETA: 1:30 - loss: 1.8317 - regression_loss: 1.4944 - classification_loss: 0.3373 143/500 [=======>......................] - ETA: 1:29 - loss: 1.8346 - regression_loss: 1.4974 - classification_loss: 0.3372 144/500 [=======>......................] - ETA: 1:29 - loss: 1.8413 - regression_loss: 1.5021 - classification_loss: 0.3392 145/500 [=======>......................] - ETA: 1:29 - loss: 1.8417 - regression_loss: 1.5026 - classification_loss: 0.3391 146/500 [=======>......................] - ETA: 1:29 - loss: 1.8454 - regression_loss: 1.5055 - classification_loss: 0.3400 147/500 [=======>......................] - ETA: 1:28 - loss: 1.8416 - regression_loss: 1.5024 - classification_loss: 0.3392 148/500 [=======>......................] - ETA: 1:28 - loss: 1.8400 - regression_loss: 1.5011 - classification_loss: 0.3389 149/500 [=======>......................] - ETA: 1:28 - loss: 1.8402 - regression_loss: 1.5012 - classification_loss: 0.3390 150/500 [========>.....................] - ETA: 1:28 - loss: 1.8446 - regression_loss: 1.5042 - classification_loss: 0.3404 151/500 [========>.....................] - ETA: 1:27 - loss: 1.8417 - regression_loss: 1.5021 - classification_loss: 0.3396 152/500 [========>.....................] - ETA: 1:27 - loss: 1.8448 - regression_loss: 1.5024 - classification_loss: 0.3424 153/500 [========>.....................] 
- ETA: 1:27 - loss: 1.8467 - regression_loss: 1.5044 - classification_loss: 0.3423 154/500 [========>.....................] - ETA: 1:27 - loss: 1.8475 - regression_loss: 1.5053 - classification_loss: 0.3422 155/500 [========>.....................] - ETA: 1:26 - loss: 1.8463 - regression_loss: 1.5044 - classification_loss: 0.3419 156/500 [========>.....................] - ETA: 1:26 - loss: 1.8467 - regression_loss: 1.5043 - classification_loss: 0.3423 157/500 [========>.....................] - ETA: 1:26 - loss: 1.8452 - regression_loss: 1.5033 - classification_loss: 0.3419 158/500 [========>.....................] - ETA: 1:26 - loss: 1.8430 - regression_loss: 1.5019 - classification_loss: 0.3411 159/500 [========>.....................] - ETA: 1:25 - loss: 1.8378 - regression_loss: 1.4981 - classification_loss: 0.3397 160/500 [========>.....................] - ETA: 1:25 - loss: 1.8396 - regression_loss: 1.4999 - classification_loss: 0.3397 161/500 [========>.....................] - ETA: 1:25 - loss: 1.8406 - regression_loss: 1.5007 - classification_loss: 0.3399 162/500 [========>.....................] - ETA: 1:25 - loss: 1.8409 - regression_loss: 1.5012 - classification_loss: 0.3397 163/500 [========>.....................] - ETA: 1:24 - loss: 1.8402 - regression_loss: 1.5006 - classification_loss: 0.3396 164/500 [========>.....................] - ETA: 1:24 - loss: 1.8393 - regression_loss: 1.4998 - classification_loss: 0.3394 165/500 [========>.....................] - ETA: 1:24 - loss: 1.8377 - regression_loss: 1.4987 - classification_loss: 0.3390 166/500 [========>.....................] - ETA: 1:24 - loss: 1.8409 - regression_loss: 1.5010 - classification_loss: 0.3399 167/500 [=========>....................] - ETA: 1:23 - loss: 1.8353 - regression_loss: 1.4971 - classification_loss: 0.3383 168/500 [=========>....................] - ETA: 1:23 - loss: 1.8410 - regression_loss: 1.5013 - classification_loss: 0.3397 169/500 [=========>....................] 
- ETA: 1:23 - loss: 1.8381 - regression_loss: 1.4991 - classification_loss: 0.3390 170/500 [=========>....................] - ETA: 1:23 - loss: 1.8366 - regression_loss: 1.4982 - classification_loss: 0.3384 171/500 [=========>....................] - ETA: 1:22 - loss: 1.8359 - regression_loss: 1.4980 - classification_loss: 0.3379 172/500 [=========>....................] - ETA: 1:22 - loss: 1.8366 - regression_loss: 1.4989 - classification_loss: 0.3377 173/500 [=========>....................] - ETA: 1:22 - loss: 1.8401 - regression_loss: 1.5016 - classification_loss: 0.3385 174/500 [=========>....................] - ETA: 1:22 - loss: 1.8390 - regression_loss: 1.5005 - classification_loss: 0.3385 175/500 [=========>....................] - ETA: 1:21 - loss: 1.8392 - regression_loss: 1.4996 - classification_loss: 0.3396 176/500 [=========>....................] - ETA: 1:21 - loss: 1.8366 - regression_loss: 1.4977 - classification_loss: 0.3389 177/500 [=========>....................] - ETA: 1:21 - loss: 1.8426 - regression_loss: 1.5034 - classification_loss: 0.3392 178/500 [=========>....................] - ETA: 1:21 - loss: 1.8443 - regression_loss: 1.5052 - classification_loss: 0.3390 179/500 [=========>....................] - ETA: 1:20 - loss: 1.8465 - regression_loss: 1.5072 - classification_loss: 0.3393 180/500 [=========>....................] - ETA: 1:20 - loss: 1.8457 - regression_loss: 1.5067 - classification_loss: 0.3390 181/500 [=========>....................] - ETA: 1:20 - loss: 1.8442 - regression_loss: 1.5058 - classification_loss: 0.3384 182/500 [=========>....................] - ETA: 1:19 - loss: 1.8456 - regression_loss: 1.5073 - classification_loss: 0.3383 183/500 [=========>....................] - ETA: 1:19 - loss: 1.8451 - regression_loss: 1.5072 - classification_loss: 0.3379 184/500 [==========>...................] - ETA: 1:19 - loss: 1.8485 - regression_loss: 1.5101 - classification_loss: 0.3384 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.8485 - regression_loss: 1.5108 - classification_loss: 0.3377 186/500 [==========>...................] - ETA: 1:18 - loss: 1.8507 - regression_loss: 1.5126 - classification_loss: 0.3381 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8518 - regression_loss: 1.5131 - classification_loss: 0.3387 188/500 [==========>...................] - ETA: 1:18 - loss: 1.8524 - regression_loss: 1.5131 - classification_loss: 0.3393 189/500 [==========>...................] - ETA: 1:18 - loss: 1.8553 - regression_loss: 1.5148 - classification_loss: 0.3405 190/500 [==========>...................] - ETA: 1:17 - loss: 1.8557 - regression_loss: 1.5155 - classification_loss: 0.3401 191/500 [==========>...................] - ETA: 1:17 - loss: 1.8525 - regression_loss: 1.5131 - classification_loss: 0.3395 192/500 [==========>...................] - ETA: 1:17 - loss: 1.8480 - regression_loss: 1.5094 - classification_loss: 0.3386 193/500 [==========>...................] - ETA: 1:17 - loss: 1.8506 - regression_loss: 1.5117 - classification_loss: 0.3390 194/500 [==========>...................] - ETA: 1:16 - loss: 1.8531 - regression_loss: 1.5144 - classification_loss: 0.3387 195/500 [==========>...................] - ETA: 1:16 - loss: 1.8555 - regression_loss: 1.5170 - classification_loss: 0.3385 196/500 [==========>...................] - ETA: 1:16 - loss: 1.8564 - regression_loss: 1.5178 - classification_loss: 0.3386 197/500 [==========>...................] - ETA: 1:16 - loss: 1.8558 - regression_loss: 1.5170 - classification_loss: 0.3388 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8569 - regression_loss: 1.5182 - classification_loss: 0.3387 199/500 [==========>...................] - ETA: 1:15 - loss: 1.8513 - regression_loss: 1.5135 - classification_loss: 0.3378 200/500 [===========>..................] - ETA: 1:15 - loss: 1.8512 - regression_loss: 1.5134 - classification_loss: 0.3378 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.8556 - regression_loss: 1.5174 - classification_loss: 0.3382 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8584 - regression_loss: 1.5203 - classification_loss: 0.3380 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8576 - regression_loss: 1.5195 - classification_loss: 0.3381 204/500 [===========>..................] - ETA: 1:14 - loss: 1.8584 - regression_loss: 1.5204 - classification_loss: 0.3379 205/500 [===========>..................] - ETA: 1:14 - loss: 1.8595 - regression_loss: 1.5213 - classification_loss: 0.3382 206/500 [===========>..................] - ETA: 1:13 - loss: 1.8597 - regression_loss: 1.5215 - classification_loss: 0.3382 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8543 - regression_loss: 1.5170 - classification_loss: 0.3373 208/500 [===========>..................] - ETA: 1:13 - loss: 1.8557 - regression_loss: 1.5181 - classification_loss: 0.3376 209/500 [===========>..................] - ETA: 1:13 - loss: 1.8577 - regression_loss: 1.5198 - classification_loss: 0.3380 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8575 - regression_loss: 1.5188 - classification_loss: 0.3386 211/500 [===========>..................] - ETA: 1:12 - loss: 1.8587 - regression_loss: 1.5199 - classification_loss: 0.3388 212/500 [===========>..................] - ETA: 1:12 - loss: 1.8564 - regression_loss: 1.5181 - classification_loss: 0.3383 213/500 [===========>..................] - ETA: 1:12 - loss: 1.8571 - regression_loss: 1.5186 - classification_loss: 0.3386 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8552 - regression_loss: 1.5168 - classification_loss: 0.3383 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8552 - regression_loss: 1.5171 - classification_loss: 0.3381 216/500 [===========>..................] - ETA: 1:11 - loss: 1.8571 - regression_loss: 1.5185 - classification_loss: 0.3386 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.8529 - regression_loss: 1.5148 - classification_loss: 0.3381 218/500 [============>.................] - ETA: 1:10 - loss: 1.8526 - regression_loss: 1.5146 - classification_loss: 0.3380 219/500 [============>.................] - ETA: 1:10 - loss: 1.8517 - regression_loss: 1.5135 - classification_loss: 0.3381 220/500 [============>.................] - ETA: 1:10 - loss: 1.8515 - regression_loss: 1.5137 - classification_loss: 0.3378 221/500 [============>.................] - ETA: 1:10 - loss: 1.8460 - regression_loss: 1.5093 - classification_loss: 0.3368 222/500 [============>.................] - ETA: 1:09 - loss: 1.8452 - regression_loss: 1.5089 - classification_loss: 0.3363 223/500 [============>.................] - ETA: 1:09 - loss: 1.8456 - regression_loss: 1.5091 - classification_loss: 0.3366 224/500 [============>.................] - ETA: 1:09 - loss: 1.8465 - regression_loss: 1.5097 - classification_loss: 0.3368 225/500 [============>.................] - ETA: 1:09 - loss: 1.8465 - regression_loss: 1.5100 - classification_loss: 0.3365 226/500 [============>.................] - ETA: 1:08 - loss: 1.8469 - regression_loss: 1.5105 - classification_loss: 0.3364 227/500 [============>.................] - ETA: 1:08 - loss: 1.8472 - regression_loss: 1.5108 - classification_loss: 0.3363 228/500 [============>.................] - ETA: 1:08 - loss: 1.8469 - regression_loss: 1.5109 - classification_loss: 0.3360 229/500 [============>.................] - ETA: 1:08 - loss: 1.8504 - regression_loss: 1.5141 - classification_loss: 0.3363 230/500 [============>.................] - ETA: 1:07 - loss: 1.8514 - regression_loss: 1.5153 - classification_loss: 0.3360 231/500 [============>.................] - ETA: 1:07 - loss: 1.8512 - regression_loss: 1.5154 - classification_loss: 0.3358 232/500 [============>.................] - ETA: 1:07 - loss: 1.8521 - regression_loss: 1.5162 - classification_loss: 0.3359 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.8530 - regression_loss: 1.5168 - classification_loss: 0.3362 234/500 [=============>................] - ETA: 1:06 - loss: 1.8550 - regression_loss: 1.5186 - classification_loss: 0.3364 235/500 [=============>................] - ETA: 1:06 - loss: 1.8542 - regression_loss: 1.5179 - classification_loss: 0.3363 236/500 [=============>................] - ETA: 1:06 - loss: 1.8546 - regression_loss: 1.5181 - classification_loss: 0.3365 237/500 [=============>................] - ETA: 1:06 - loss: 1.8566 - regression_loss: 1.5197 - classification_loss: 0.3369 238/500 [=============>................] - ETA: 1:05 - loss: 1.8575 - regression_loss: 1.5205 - classification_loss: 0.3370 239/500 [=============>................] - ETA: 1:05 - loss: 1.8551 - regression_loss: 1.5184 - classification_loss: 0.3368 240/500 [=============>................] - ETA: 1:05 - loss: 1.8558 - regression_loss: 1.5190 - classification_loss: 0.3368 241/500 [=============>................] - ETA: 1:05 - loss: 1.8573 - regression_loss: 1.5201 - classification_loss: 0.3372 242/500 [=============>................] - ETA: 1:04 - loss: 1.8590 - regression_loss: 1.5217 - classification_loss: 0.3374 243/500 [=============>................] - ETA: 1:04 - loss: 1.8581 - regression_loss: 1.5211 - classification_loss: 0.3370 244/500 [=============>................] - ETA: 1:04 - loss: 1.8573 - regression_loss: 1.5206 - classification_loss: 0.3367 245/500 [=============>................] - ETA: 1:04 - loss: 1.8568 - regression_loss: 1.5203 - classification_loss: 0.3366 246/500 [=============>................] - ETA: 1:03 - loss: 1.8583 - regression_loss: 1.5218 - classification_loss: 0.3365 247/500 [=============>................] - ETA: 1:03 - loss: 1.8583 - regression_loss: 1.5218 - classification_loss: 0.3364 248/500 [=============>................] - ETA: 1:03 - loss: 1.8619 - regression_loss: 1.5241 - classification_loss: 0.3378 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.8594 - regression_loss: 1.5218 - classification_loss: 0.3376 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8581 - regression_loss: 1.5204 - classification_loss: 0.3378 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8591 - regression_loss: 1.5214 - classification_loss: 0.3377 252/500 [==============>...............] - ETA: 1:02 - loss: 1.8544 - regression_loss: 1.5174 - classification_loss: 0.3371 253/500 [==============>...............] - ETA: 1:02 - loss: 1.8530 - regression_loss: 1.5163 - classification_loss: 0.3367 254/500 [==============>...............] - ETA: 1:01 - loss: 1.8540 - regression_loss: 1.5172 - classification_loss: 0.3368 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8539 - regression_loss: 1.5171 - classification_loss: 0.3368 256/500 [==============>...............] - ETA: 1:01 - loss: 1.8560 - regression_loss: 1.5188 - classification_loss: 0.3372 257/500 [==============>...............] - ETA: 1:01 - loss: 1.8545 - regression_loss: 1.5176 - classification_loss: 0.3369 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8557 - regression_loss: 1.5186 - classification_loss: 0.3371 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8551 - regression_loss: 1.5182 - classification_loss: 0.3369 260/500 [==============>...............] - ETA: 1:00 - loss: 1.8565 - regression_loss: 1.5190 - classification_loss: 0.3375 261/500 [==============>...............] - ETA: 1:00 - loss: 1.8582 - regression_loss: 1.5206 - classification_loss: 0.3376 262/500 [==============>...............] - ETA: 59s - loss: 1.8571 - regression_loss: 1.5196 - classification_loss: 0.3376  263/500 [==============>...............] - ETA: 59s - loss: 1.8556 - regression_loss: 1.5184 - classification_loss: 0.3372 264/500 [==============>...............] - ETA: 59s - loss: 1.8535 - regression_loss: 1.5168 - classification_loss: 0.3366 265/500 [==============>...............] 
- ETA: 59s - loss: 1.8530 - regression_loss: 1.5163 - classification_loss: 0.3367 266/500 [==============>...............] - ETA: 58s - loss: 1.8517 - regression_loss: 1.5153 - classification_loss: 0.3364 267/500 [===============>..............] - ETA: 58s - loss: 1.8513 - regression_loss: 1.5151 - classification_loss: 0.3362 268/500 [===============>..............] - ETA: 58s - loss: 1.8528 - regression_loss: 1.5166 - classification_loss: 0.3362 269/500 [===============>..............] - ETA: 58s - loss: 1.8511 - regression_loss: 1.5151 - classification_loss: 0.3360 270/500 [===============>..............] - ETA: 57s - loss: 1.8516 - regression_loss: 1.5156 - classification_loss: 0.3359 271/500 [===============>..............] - ETA: 57s - loss: 1.8523 - regression_loss: 1.5163 - classification_loss: 0.3360 272/500 [===============>..............] - ETA: 57s - loss: 1.8532 - regression_loss: 1.5172 - classification_loss: 0.3361 273/500 [===============>..............] - ETA: 57s - loss: 1.8535 - regression_loss: 1.5176 - classification_loss: 0.3360 274/500 [===============>..............] - ETA: 56s - loss: 1.8539 - regression_loss: 1.5177 - classification_loss: 0.3362 275/500 [===============>..............] - ETA: 56s - loss: 1.8528 - regression_loss: 1.5170 - classification_loss: 0.3358 276/500 [===============>..............] - ETA: 56s - loss: 1.8511 - regression_loss: 1.5156 - classification_loss: 0.3355 277/500 [===============>..............] - ETA: 56s - loss: 1.8521 - regression_loss: 1.5165 - classification_loss: 0.3356 278/500 [===============>..............] - ETA: 55s - loss: 1.8499 - regression_loss: 1.5146 - classification_loss: 0.3353 279/500 [===============>..............] - ETA: 55s - loss: 1.8505 - regression_loss: 1.5152 - classification_loss: 0.3353 280/500 [===============>..............] - ETA: 55s - loss: 1.8523 - regression_loss: 1.5167 - classification_loss: 0.3356 281/500 [===============>..............] 
- ETA: 55s - loss: 1.8533 - regression_loss: 1.5175 - classification_loss: 0.3358 282/500 [===============>..............] - ETA: 54s - loss: 1.8555 - regression_loss: 1.5193 - classification_loss: 0.3362 283/500 [===============>..............] - ETA: 54s - loss: 1.8559 - regression_loss: 1.5199 - classification_loss: 0.3359 284/500 [================>.............] - ETA: 54s - loss: 1.8563 - regression_loss: 1.5206 - classification_loss: 0.3356 285/500 [================>.............] - ETA: 54s - loss: 1.8527 - regression_loss: 1.5171 - classification_loss: 0.3356 286/500 [================>.............] - ETA: 53s - loss: 1.8557 - regression_loss: 1.5195 - classification_loss: 0.3362 287/500 [================>.............] - ETA: 53s - loss: 1.8565 - regression_loss: 1.5202 - classification_loss: 0.3363 288/500 [================>.............] - ETA: 53s - loss: 1.8567 - regression_loss: 1.5206 - classification_loss: 0.3362 289/500 [================>.............] - ETA: 53s - loss: 1.8541 - regression_loss: 1.5184 - classification_loss: 0.3357 290/500 [================>.............] - ETA: 52s - loss: 1.8552 - regression_loss: 1.5192 - classification_loss: 0.3360 291/500 [================>.............] - ETA: 52s - loss: 1.8539 - regression_loss: 1.5183 - classification_loss: 0.3356 292/500 [================>.............] - ETA: 52s - loss: 1.8549 - regression_loss: 1.5191 - classification_loss: 0.3359 293/500 [================>.............] - ETA: 52s - loss: 1.8563 - regression_loss: 1.5200 - classification_loss: 0.3363 294/500 [================>.............] - ETA: 51s - loss: 1.8561 - regression_loss: 1.5198 - classification_loss: 0.3363 295/500 [================>.............] - ETA: 51s - loss: 1.8549 - regression_loss: 1.5189 - classification_loss: 0.3360 296/500 [================>.............] - ETA: 51s - loss: 1.8537 - regression_loss: 1.5181 - classification_loss: 0.3355 297/500 [================>.............] 
- ETA: 51s - loss: 1.8511 - regression_loss: 1.5161 - classification_loss: 0.3350 298/500 [================>.............] - ETA: 50s - loss: 1.8511 - regression_loss: 1.5162 - classification_loss: 0.3349 299/500 [================>.............] - ETA: 50s - loss: 1.8515 - regression_loss: 1.5168 - classification_loss: 0.3348 300/500 [=================>............] - ETA: 50s - loss: 1.8514 - regression_loss: 1.5168 - classification_loss: 0.3346 301/500 [=================>............] - ETA: 50s - loss: 1.8512 - regression_loss: 1.5168 - classification_loss: 0.3344 302/500 [=================>............] - ETA: 49s - loss: 1.8505 - regression_loss: 1.5161 - classification_loss: 0.3344 303/500 [=================>............] - ETA: 49s - loss: 1.8499 - regression_loss: 1.5155 - classification_loss: 0.3344 304/500 [=================>............] - ETA: 49s - loss: 1.8488 - regression_loss: 1.5144 - classification_loss: 0.3345 305/500 [=================>............] - ETA: 49s - loss: 1.8492 - regression_loss: 1.5147 - classification_loss: 0.3345 306/500 [=================>............] - ETA: 48s - loss: 1.8483 - regression_loss: 1.5141 - classification_loss: 0.3342 307/500 [=================>............] - ETA: 48s - loss: 1.8470 - regression_loss: 1.5132 - classification_loss: 0.3338 308/500 [=================>............] - ETA: 48s - loss: 1.8481 - regression_loss: 1.5141 - classification_loss: 0.3340 309/500 [=================>............] - ETA: 48s - loss: 1.8449 - regression_loss: 1.5116 - classification_loss: 0.3332 310/500 [=================>............] - ETA: 47s - loss: 1.8433 - regression_loss: 1.5106 - classification_loss: 0.3327 311/500 [=================>............] - ETA: 47s - loss: 1.8436 - regression_loss: 1.5109 - classification_loss: 0.3327 312/500 [=================>............] - ETA: 47s - loss: 1.8438 - regression_loss: 1.5113 - classification_loss: 0.3325 313/500 [=================>............] 
- ETA: 46s - loss: 1.8449 - regression_loss: 1.5123 - classification_loss: 0.3326 314/500 [=================>............] - ETA: 46s - loss: 1.8460 - regression_loss: 1.5134 - classification_loss: 0.3327 315/500 [=================>............] - ETA: 46s - loss: 1.8466 - regression_loss: 1.5138 - classification_loss: 0.3328 316/500 [=================>............] - ETA: 46s - loss: 1.8471 - regression_loss: 1.5142 - classification_loss: 0.3329 317/500 [==================>...........] - ETA: 45s - loss: 1.8433 - regression_loss: 1.5110 - classification_loss: 0.3323 318/500 [==================>...........] - ETA: 45s - loss: 1.8432 - regression_loss: 1.5109 - classification_loss: 0.3324 319/500 [==================>...........] - ETA: 45s - loss: 1.8391 - regression_loss: 1.5075 - classification_loss: 0.3316 320/500 [==================>...........] - ETA: 45s - loss: 1.8402 - regression_loss: 1.5085 - classification_loss: 0.3317 321/500 [==================>...........] - ETA: 44s - loss: 1.8437 - regression_loss: 1.5110 - classification_loss: 0.3327 322/500 [==================>...........] - ETA: 44s - loss: 1.8429 - regression_loss: 1.5103 - classification_loss: 0.3326 323/500 [==================>...........] - ETA: 44s - loss: 1.8426 - regression_loss: 1.5100 - classification_loss: 0.3326 324/500 [==================>...........] - ETA: 44s - loss: 1.8414 - regression_loss: 1.5091 - classification_loss: 0.3323 325/500 [==================>...........] - ETA: 43s - loss: 1.8421 - regression_loss: 1.5096 - classification_loss: 0.3325 326/500 [==================>...........] - ETA: 43s - loss: 1.8384 - regression_loss: 1.5066 - classification_loss: 0.3318 327/500 [==================>...........] - ETA: 43s - loss: 1.8399 - regression_loss: 1.5082 - classification_loss: 0.3318 328/500 [==================>...........] - ETA: 43s - loss: 1.8396 - regression_loss: 1.5077 - classification_loss: 0.3318 329/500 [==================>...........] 
- ETA: 42s - loss: 1.8407 - regression_loss: 1.5088 - classification_loss: 0.3319 330/500 [==================>...........] - ETA: 42s - loss: 1.8413 - regression_loss: 1.5092 - classification_loss: 0.3320 331/500 [==================>...........] - ETA: 42s - loss: 1.8409 - regression_loss: 1.5090 - classification_loss: 0.3319 332/500 [==================>...........] - ETA: 42s - loss: 1.8426 - regression_loss: 1.5100 - classification_loss: 0.3326 333/500 [==================>...........] - ETA: 41s - loss: 1.8429 - regression_loss: 1.5103 - classification_loss: 0.3326 334/500 [===================>..........] - ETA: 41s - loss: 1.8402 - regression_loss: 1.5079 - classification_loss: 0.3322 335/500 [===================>..........] - ETA: 41s - loss: 1.8409 - regression_loss: 1.5087 - classification_loss: 0.3322 336/500 [===================>..........] - ETA: 41s - loss: 1.8437 - regression_loss: 1.5112 - classification_loss: 0.3325 337/500 [===================>..........] - ETA: 40s - loss: 1.8435 - regression_loss: 1.5111 - classification_loss: 0.3324 338/500 [===================>..........] - ETA: 40s - loss: 1.8435 - regression_loss: 1.5112 - classification_loss: 0.3324 339/500 [===================>..........] - ETA: 40s - loss: 1.8416 - regression_loss: 1.5094 - classification_loss: 0.3321 340/500 [===================>..........] - ETA: 40s - loss: 1.8430 - regression_loss: 1.5107 - classification_loss: 0.3323 341/500 [===================>..........] - ETA: 39s - loss: 1.8409 - regression_loss: 1.5091 - classification_loss: 0.3318 342/500 [===================>..........] - ETA: 39s - loss: 1.8404 - regression_loss: 1.5088 - classification_loss: 0.3315 343/500 [===================>..........] - ETA: 39s - loss: 1.8398 - regression_loss: 1.5086 - classification_loss: 0.3312 344/500 [===================>..........] - ETA: 39s - loss: 1.8382 - regression_loss: 1.5075 - classification_loss: 0.3307 345/500 [===================>..........] 
[Epoch 56 per-step progress output (steps 346–489 of 500) omitted; total loss fluctuated in the 1.82–1.85 range, with regression loss near 1.50 and classification loss near 0.33]
[Epoch 56 per-step progress output (steps 490–500) omitted]
500/500 [==============================] - 125s 251ms/step - loss: 1.8352 - regression_loss: 1.5038 - classification_loss: 0.3314
1172 instances of class plum with average precision: 0.5919
mAP: 0.5919
Epoch 00056: saving model to ./training/snapshots/resnet50_pascal_56.h5
Epoch 57/150
[Epoch 57 per-step progress output (steps 1–180 of 500) omitted; loss settled near 1.86 with classification loss near 0.33]
- ETA: 1:19 - loss: 1.8577 - regression_loss: 1.5289 - classification_loss: 0.3287 181/500 [=========>....................] - ETA: 1:19 - loss: 1.8592 - regression_loss: 1.5303 - classification_loss: 0.3289 182/500 [=========>....................] - ETA: 1:19 - loss: 1.8578 - regression_loss: 1.5293 - classification_loss: 0.3284 183/500 [=========>....................] - ETA: 1:18 - loss: 1.8588 - regression_loss: 1.5304 - classification_loss: 0.3283 184/500 [==========>...................] - ETA: 1:18 - loss: 1.8604 - regression_loss: 1.5318 - classification_loss: 0.3286 185/500 [==========>...................] - ETA: 1:18 - loss: 1.8561 - regression_loss: 1.5284 - classification_loss: 0.3277 186/500 [==========>...................] - ETA: 1:18 - loss: 1.8513 - regression_loss: 1.5244 - classification_loss: 0.3268 187/500 [==========>...................] - ETA: 1:18 - loss: 1.8527 - regression_loss: 1.5256 - classification_loss: 0.3272 188/500 [==========>...................] - ETA: 1:17 - loss: 1.8527 - regression_loss: 1.5264 - classification_loss: 0.3263 189/500 [==========>...................] - ETA: 1:17 - loss: 1.8522 - regression_loss: 1.5259 - classification_loss: 0.3263 190/500 [==========>...................] - ETA: 1:17 - loss: 1.8510 - regression_loss: 1.5251 - classification_loss: 0.3259 191/500 [==========>...................] - ETA: 1:17 - loss: 1.8528 - regression_loss: 1.5263 - classification_loss: 0.3265 192/500 [==========>...................] - ETA: 1:16 - loss: 1.8553 - regression_loss: 1.5284 - classification_loss: 0.3269 193/500 [==========>...................] - ETA: 1:16 - loss: 1.8546 - regression_loss: 1.5281 - classification_loss: 0.3265 194/500 [==========>...................] - ETA: 1:16 - loss: 1.8550 - regression_loss: 1.5285 - classification_loss: 0.3265 195/500 [==========>...................] - ETA: 1:16 - loss: 1.8579 - regression_loss: 1.5307 - classification_loss: 0.3272 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.8536 - regression_loss: 1.5273 - classification_loss: 0.3262 197/500 [==========>...................] - ETA: 1:15 - loss: 1.8540 - regression_loss: 1.5279 - classification_loss: 0.3261 198/500 [==========>...................] - ETA: 1:15 - loss: 1.8508 - regression_loss: 1.5252 - classification_loss: 0.3256 199/500 [==========>...................] - ETA: 1:15 - loss: 1.8595 - regression_loss: 1.5311 - classification_loss: 0.3283 200/500 [===========>..................] - ETA: 1:14 - loss: 1.8585 - regression_loss: 1.5307 - classification_loss: 0.3278 201/500 [===========>..................] - ETA: 1:14 - loss: 1.8597 - regression_loss: 1.5320 - classification_loss: 0.3277 202/500 [===========>..................] - ETA: 1:14 - loss: 1.8603 - regression_loss: 1.5327 - classification_loss: 0.3276 203/500 [===========>..................] - ETA: 1:14 - loss: 1.8611 - regression_loss: 1.5336 - classification_loss: 0.3275 204/500 [===========>..................] - ETA: 1:13 - loss: 1.8615 - regression_loss: 1.5336 - classification_loss: 0.3279 205/500 [===========>..................] - ETA: 1:13 - loss: 1.8625 - regression_loss: 1.5348 - classification_loss: 0.3277 206/500 [===========>..................] - ETA: 1:13 - loss: 1.8655 - regression_loss: 1.5376 - classification_loss: 0.3279 207/500 [===========>..................] - ETA: 1:13 - loss: 1.8632 - regression_loss: 1.5354 - classification_loss: 0.3278 208/500 [===========>..................] - ETA: 1:12 - loss: 1.8662 - regression_loss: 1.5371 - classification_loss: 0.3291 209/500 [===========>..................] - ETA: 1:12 - loss: 1.8669 - regression_loss: 1.5379 - classification_loss: 0.3291 210/500 [===========>..................] - ETA: 1:12 - loss: 1.8680 - regression_loss: 1.5386 - classification_loss: 0.3294 211/500 [===========>..................] - ETA: 1:12 - loss: 1.8682 - regression_loss: 1.5380 - classification_loss: 0.3301 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.8694 - regression_loss: 1.5388 - classification_loss: 0.3306 213/500 [===========>..................] - ETA: 1:11 - loss: 1.8718 - regression_loss: 1.5411 - classification_loss: 0.3306 214/500 [===========>..................] - ETA: 1:11 - loss: 1.8693 - regression_loss: 1.5393 - classification_loss: 0.3300 215/500 [===========>..................] - ETA: 1:11 - loss: 1.8655 - regression_loss: 1.5363 - classification_loss: 0.3292 216/500 [===========>..................] - ETA: 1:10 - loss: 1.8658 - regression_loss: 1.5364 - classification_loss: 0.3294 217/500 [============>.................] - ETA: 1:10 - loss: 1.8663 - regression_loss: 1.5366 - classification_loss: 0.3297 218/500 [============>.................] - ETA: 1:10 - loss: 1.8666 - regression_loss: 1.5366 - classification_loss: 0.3300 219/500 [============>.................] - ETA: 1:10 - loss: 1.8678 - regression_loss: 1.5376 - classification_loss: 0.3303 220/500 [============>.................] - ETA: 1:09 - loss: 1.8676 - regression_loss: 1.5377 - classification_loss: 0.3299 221/500 [============>.................] - ETA: 1:09 - loss: 1.8685 - regression_loss: 1.5381 - classification_loss: 0.3304 222/500 [============>.................] - ETA: 1:09 - loss: 1.8693 - regression_loss: 1.5390 - classification_loss: 0.3303 223/500 [============>.................] - ETA: 1:09 - loss: 1.8684 - regression_loss: 1.5383 - classification_loss: 0.3301 224/500 [============>.................] - ETA: 1:08 - loss: 1.8686 - regression_loss: 1.5386 - classification_loss: 0.3301 225/500 [============>.................] - ETA: 1:08 - loss: 1.8718 - regression_loss: 1.5417 - classification_loss: 0.3301 226/500 [============>.................] - ETA: 1:08 - loss: 1.8746 - regression_loss: 1.5438 - classification_loss: 0.3308 227/500 [============>.................] - ETA: 1:08 - loss: 1.8741 - regression_loss: 1.5436 - classification_loss: 0.3305 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.8706 - regression_loss: 1.5410 - classification_loss: 0.3297 229/500 [============>.................] - ETA: 1:07 - loss: 1.8718 - regression_loss: 1.5420 - classification_loss: 0.3298 230/500 [============>.................] - ETA: 1:07 - loss: 1.8700 - regression_loss: 1.5402 - classification_loss: 0.3298 231/500 [============>.................] - ETA: 1:07 - loss: 1.8699 - regression_loss: 1.5403 - classification_loss: 0.3297 232/500 [============>.................] - ETA: 1:06 - loss: 1.8700 - regression_loss: 1.5404 - classification_loss: 0.3296 233/500 [============>.................] - ETA: 1:06 - loss: 1.8702 - regression_loss: 1.5411 - classification_loss: 0.3291 234/500 [=============>................] - ETA: 1:06 - loss: 1.8702 - regression_loss: 1.5410 - classification_loss: 0.3292 235/500 [=============>................] - ETA: 1:06 - loss: 1.8701 - regression_loss: 1.5413 - classification_loss: 0.3289 236/500 [=============>................] - ETA: 1:05 - loss: 1.8698 - regression_loss: 1.5402 - classification_loss: 0.3296 237/500 [=============>................] - ETA: 1:05 - loss: 1.8686 - regression_loss: 1.5395 - classification_loss: 0.3291 238/500 [=============>................] - ETA: 1:05 - loss: 1.8689 - regression_loss: 1.5397 - classification_loss: 0.3292 239/500 [=============>................] - ETA: 1:05 - loss: 1.8687 - regression_loss: 1.5396 - classification_loss: 0.3292 240/500 [=============>................] - ETA: 1:04 - loss: 1.8705 - regression_loss: 1.5414 - classification_loss: 0.3291 241/500 [=============>................] - ETA: 1:04 - loss: 1.8698 - regression_loss: 1.5398 - classification_loss: 0.3300 242/500 [=============>................] - ETA: 1:04 - loss: 1.8720 - regression_loss: 1.5417 - classification_loss: 0.3303 243/500 [=============>................] - ETA: 1:04 - loss: 1.8705 - regression_loss: 1.5407 - classification_loss: 0.3299 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.8720 - regression_loss: 1.5422 - classification_loss: 0.3298 245/500 [=============>................] - ETA: 1:03 - loss: 1.8712 - regression_loss: 1.5417 - classification_loss: 0.3295 246/500 [=============>................] - ETA: 1:03 - loss: 1.8732 - regression_loss: 1.5432 - classification_loss: 0.3300 247/500 [=============>................] - ETA: 1:03 - loss: 1.8736 - regression_loss: 1.5433 - classification_loss: 0.3303 248/500 [=============>................] - ETA: 1:02 - loss: 1.8744 - regression_loss: 1.5438 - classification_loss: 0.3306 249/500 [=============>................] - ETA: 1:02 - loss: 1.8761 - regression_loss: 1.5450 - classification_loss: 0.3312 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8774 - regression_loss: 1.5461 - classification_loss: 0.3312 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8782 - regression_loss: 1.5470 - classification_loss: 0.3313 252/500 [==============>...............] - ETA: 1:01 - loss: 1.8802 - regression_loss: 1.5484 - classification_loss: 0.3318 253/500 [==============>...............] - ETA: 1:01 - loss: 1.8815 - regression_loss: 1.5494 - classification_loss: 0.3321 254/500 [==============>...............] - ETA: 1:01 - loss: 1.8795 - regression_loss: 1.5479 - classification_loss: 0.3316 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8801 - regression_loss: 1.5481 - classification_loss: 0.3321 256/500 [==============>...............] - ETA: 1:00 - loss: 1.8800 - regression_loss: 1.5478 - classification_loss: 0.3321 257/500 [==============>...............] - ETA: 1:00 - loss: 1.8784 - regression_loss: 1.5456 - classification_loss: 0.3328 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8777 - regression_loss: 1.5454 - classification_loss: 0.3323 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8799 - regression_loss: 1.5469 - classification_loss: 0.3329 260/500 [==============>...............] 
- ETA: 59s - loss: 1.8793 - regression_loss: 1.5469 - classification_loss: 0.3325  261/500 [==============>...............] - ETA: 59s - loss: 1.8798 - regression_loss: 1.5476 - classification_loss: 0.3322 262/500 [==============>...............] - ETA: 59s - loss: 1.8805 - regression_loss: 1.5483 - classification_loss: 0.3323 263/500 [==============>...............] - ETA: 59s - loss: 1.8804 - regression_loss: 1.5484 - classification_loss: 0.3320 264/500 [==============>...............] - ETA: 58s - loss: 1.8808 - regression_loss: 1.5487 - classification_loss: 0.3321 265/500 [==============>...............] - ETA: 58s - loss: 1.8775 - regression_loss: 1.5462 - classification_loss: 0.3313 266/500 [==============>...............] - ETA: 58s - loss: 1.8775 - regression_loss: 1.5464 - classification_loss: 0.3311 267/500 [===============>..............] - ETA: 58s - loss: 1.8745 - regression_loss: 1.5440 - classification_loss: 0.3304 268/500 [===============>..............] - ETA: 57s - loss: 1.8729 - regression_loss: 1.5427 - classification_loss: 0.3302 269/500 [===============>..............] - ETA: 57s - loss: 1.8739 - regression_loss: 1.5435 - classification_loss: 0.3305 270/500 [===============>..............] - ETA: 57s - loss: 1.8743 - regression_loss: 1.5434 - classification_loss: 0.3309 271/500 [===============>..............] - ETA: 57s - loss: 1.8741 - regression_loss: 1.5430 - classification_loss: 0.3310 272/500 [===============>..............] - ETA: 56s - loss: 1.8715 - regression_loss: 1.5409 - classification_loss: 0.3306 273/500 [===============>..............] - ETA: 56s - loss: 1.8743 - regression_loss: 1.5431 - classification_loss: 0.3312 274/500 [===============>..............] - ETA: 56s - loss: 1.8733 - regression_loss: 1.5425 - classification_loss: 0.3309 275/500 [===============>..............] - ETA: 56s - loss: 1.8713 - regression_loss: 1.5410 - classification_loss: 0.3303 276/500 [===============>..............] 
- ETA: 55s - loss: 1.8670 - regression_loss: 1.5374 - classification_loss: 0.3295 277/500 [===============>..............] - ETA: 55s - loss: 1.8660 - regression_loss: 1.5367 - classification_loss: 0.3294 278/500 [===============>..............] - ETA: 55s - loss: 1.8651 - regression_loss: 1.5360 - classification_loss: 0.3291 279/500 [===============>..............] - ETA: 55s - loss: 1.8610 - regression_loss: 1.5325 - classification_loss: 0.3285 280/500 [===============>..............] - ETA: 54s - loss: 1.8616 - regression_loss: 1.5331 - classification_loss: 0.3285 281/500 [===============>..............] - ETA: 54s - loss: 1.8613 - regression_loss: 1.5330 - classification_loss: 0.3283 282/500 [===============>..............] - ETA: 54s - loss: 1.8625 - regression_loss: 1.5339 - classification_loss: 0.3287 283/500 [===============>..............] - ETA: 54s - loss: 1.8623 - regression_loss: 1.5338 - classification_loss: 0.3285 284/500 [================>.............] - ETA: 53s - loss: 1.8622 - regression_loss: 1.5338 - classification_loss: 0.3283 285/500 [================>.............] - ETA: 53s - loss: 1.8623 - regression_loss: 1.5339 - classification_loss: 0.3284 286/500 [================>.............] - ETA: 53s - loss: 1.8640 - regression_loss: 1.5356 - classification_loss: 0.3284 287/500 [================>.............] - ETA: 53s - loss: 1.8649 - regression_loss: 1.5364 - classification_loss: 0.3285 288/500 [================>.............] - ETA: 52s - loss: 1.8646 - regression_loss: 1.5360 - classification_loss: 0.3286 289/500 [================>.............] - ETA: 52s - loss: 1.8647 - regression_loss: 1.5359 - classification_loss: 0.3287 290/500 [================>.............] - ETA: 52s - loss: 1.8663 - regression_loss: 1.5374 - classification_loss: 0.3289 291/500 [================>.............] - ETA: 52s - loss: 1.8647 - regression_loss: 1.5362 - classification_loss: 0.3285 292/500 [================>.............] 
- ETA: 51s - loss: 1.8653 - regression_loss: 1.5366 - classification_loss: 0.3287 293/500 [================>.............] - ETA: 51s - loss: 1.8643 - regression_loss: 1.5360 - classification_loss: 0.3283 294/500 [================>.............] - ETA: 51s - loss: 1.8648 - regression_loss: 1.5365 - classification_loss: 0.3284 295/500 [================>.............] - ETA: 51s - loss: 1.8650 - regression_loss: 1.5368 - classification_loss: 0.3282 296/500 [================>.............] - ETA: 50s - loss: 1.8632 - regression_loss: 1.5356 - classification_loss: 0.3276 297/500 [================>.............] - ETA: 50s - loss: 1.8620 - regression_loss: 1.5346 - classification_loss: 0.3274 298/500 [================>.............] - ETA: 50s - loss: 1.8598 - regression_loss: 1.5330 - classification_loss: 0.3268 299/500 [================>.............] - ETA: 50s - loss: 1.8592 - regression_loss: 1.5325 - classification_loss: 0.3267 300/500 [=================>............] - ETA: 49s - loss: 1.8578 - regression_loss: 1.5315 - classification_loss: 0.3263 301/500 [=================>............] - ETA: 49s - loss: 1.8591 - regression_loss: 1.5328 - classification_loss: 0.3263 302/500 [=================>............] - ETA: 49s - loss: 1.8553 - regression_loss: 1.5295 - classification_loss: 0.3258 303/500 [=================>............] - ETA: 49s - loss: 1.8554 - regression_loss: 1.5298 - classification_loss: 0.3256 304/500 [=================>............] - ETA: 48s - loss: 1.8559 - regression_loss: 1.5302 - classification_loss: 0.3256 305/500 [=================>............] - ETA: 48s - loss: 1.8553 - regression_loss: 1.5298 - classification_loss: 0.3256 306/500 [=================>............] - ETA: 48s - loss: 1.8547 - regression_loss: 1.5295 - classification_loss: 0.3253 307/500 [=================>............] - ETA: 48s - loss: 1.8528 - regression_loss: 1.5277 - classification_loss: 0.3252 308/500 [=================>............] 
- ETA: 47s - loss: 1.8539 - regression_loss: 1.5286 - classification_loss: 0.3253 309/500 [=================>............] - ETA: 47s - loss: 1.8532 - regression_loss: 1.5279 - classification_loss: 0.3253 310/500 [=================>............] - ETA: 47s - loss: 1.8536 - regression_loss: 1.5281 - classification_loss: 0.3255 311/500 [=================>............] - ETA: 47s - loss: 1.8535 - regression_loss: 1.5279 - classification_loss: 0.3256 312/500 [=================>............] - ETA: 46s - loss: 1.8510 - regression_loss: 1.5261 - classification_loss: 0.3249 313/500 [=================>............] - ETA: 46s - loss: 1.8532 - regression_loss: 1.5275 - classification_loss: 0.3257 314/500 [=================>............] - ETA: 46s - loss: 1.8529 - regression_loss: 1.5272 - classification_loss: 0.3257 315/500 [=================>............] - ETA: 46s - loss: 1.8510 - regression_loss: 1.5259 - classification_loss: 0.3251 316/500 [=================>............] - ETA: 45s - loss: 1.8491 - regression_loss: 1.5245 - classification_loss: 0.3246 317/500 [==================>...........] - ETA: 45s - loss: 1.8487 - regression_loss: 1.5244 - classification_loss: 0.3243 318/500 [==================>...........] - ETA: 45s - loss: 1.8496 - regression_loss: 1.5245 - classification_loss: 0.3251 319/500 [==================>...........] - ETA: 45s - loss: 1.8469 - regression_loss: 1.5223 - classification_loss: 0.3246 320/500 [==================>...........] - ETA: 44s - loss: 1.8447 - regression_loss: 1.5204 - classification_loss: 0.3243 321/500 [==================>...........] - ETA: 44s - loss: 1.8456 - regression_loss: 1.5214 - classification_loss: 0.3242 322/500 [==================>...........] - ETA: 44s - loss: 1.8447 - regression_loss: 1.5210 - classification_loss: 0.3237 323/500 [==================>...........] - ETA: 44s - loss: 1.8456 - regression_loss: 1.5217 - classification_loss: 0.3239 324/500 [==================>...........] 
- ETA: 43s - loss: 1.8461 - regression_loss: 1.5221 - classification_loss: 0.3240 325/500 [==================>...........] - ETA: 43s - loss: 1.8446 - regression_loss: 1.5208 - classification_loss: 0.3238 326/500 [==================>...........] - ETA: 43s - loss: 1.8454 - regression_loss: 1.5217 - classification_loss: 0.3237 327/500 [==================>...........] - ETA: 43s - loss: 1.8448 - regression_loss: 1.5213 - classification_loss: 0.3235 328/500 [==================>...........] - ETA: 42s - loss: 1.8458 - regression_loss: 1.5222 - classification_loss: 0.3236 329/500 [==================>...........] - ETA: 42s - loss: 1.8454 - regression_loss: 1.5219 - classification_loss: 0.3235 330/500 [==================>...........] - ETA: 42s - loss: 1.8455 - regression_loss: 1.5217 - classification_loss: 0.3238 331/500 [==================>...........] - ETA: 42s - loss: 1.8474 - regression_loss: 1.5229 - classification_loss: 0.3245 332/500 [==================>...........] - ETA: 41s - loss: 1.8489 - regression_loss: 1.5242 - classification_loss: 0.3247 333/500 [==================>...........] - ETA: 41s - loss: 1.8497 - regression_loss: 1.5249 - classification_loss: 0.3247 334/500 [===================>..........] - ETA: 41s - loss: 1.8490 - regression_loss: 1.5245 - classification_loss: 0.3245 335/500 [===================>..........] - ETA: 41s - loss: 1.8491 - regression_loss: 1.5247 - classification_loss: 0.3243 336/500 [===================>..........] - ETA: 40s - loss: 1.8495 - regression_loss: 1.5247 - classification_loss: 0.3248 337/500 [===================>..........] - ETA: 40s - loss: 1.8489 - regression_loss: 1.5245 - classification_loss: 0.3244 338/500 [===================>..........] - ETA: 40s - loss: 1.8501 - regression_loss: 1.5253 - classification_loss: 0.3247 339/500 [===================>..........] - ETA: 40s - loss: 1.8496 - regression_loss: 1.5251 - classification_loss: 0.3245 340/500 [===================>..........] 
- ETA: 39s - loss: 1.8502 - regression_loss: 1.5258 - classification_loss: 0.3245 341/500 [===================>..........] - ETA: 39s - loss: 1.8501 - regression_loss: 1.5258 - classification_loss: 0.3243 342/500 [===================>..........] - ETA: 39s - loss: 1.8515 - regression_loss: 1.5266 - classification_loss: 0.3249 343/500 [===================>..........] - ETA: 39s - loss: 1.8520 - regression_loss: 1.5268 - classification_loss: 0.3252 344/500 [===================>..........] - ETA: 38s - loss: 1.8515 - regression_loss: 1.5264 - classification_loss: 0.3251 345/500 [===================>..........] - ETA: 38s - loss: 1.8524 - regression_loss: 1.5271 - classification_loss: 0.3253 346/500 [===================>..........] - ETA: 38s - loss: 1.8531 - regression_loss: 1.5275 - classification_loss: 0.3256 347/500 [===================>..........] - ETA: 38s - loss: 1.8531 - regression_loss: 1.5277 - classification_loss: 0.3255 348/500 [===================>..........] - ETA: 37s - loss: 1.8503 - regression_loss: 1.5256 - classification_loss: 0.3247 349/500 [===================>..........] - ETA: 37s - loss: 1.8510 - regression_loss: 1.5262 - classification_loss: 0.3247 350/500 [====================>.........] - ETA: 37s - loss: 1.8487 - regression_loss: 1.5243 - classification_loss: 0.3244 351/500 [====================>.........] - ETA: 37s - loss: 1.8488 - regression_loss: 1.5244 - classification_loss: 0.3244 352/500 [====================>.........] - ETA: 36s - loss: 1.8494 - regression_loss: 1.5249 - classification_loss: 0.3245 353/500 [====================>.........] - ETA: 36s - loss: 1.8483 - regression_loss: 1.5240 - classification_loss: 0.3243 354/500 [====================>.........] - ETA: 36s - loss: 1.8496 - regression_loss: 1.5252 - classification_loss: 0.3244 355/500 [====================>.........] - ETA: 36s - loss: 1.8460 - regression_loss: 1.5224 - classification_loss: 0.3237 356/500 [====================>.........] 
- ETA: 35s - loss: 1.8463 - regression_loss: 1.5227 - classification_loss: 0.3235 357/500 [====================>.........] - ETA: 35s - loss: 1.8469 - regression_loss: 1.5233 - classification_loss: 0.3236 358/500 [====================>.........] - ETA: 35s - loss: 1.8464 - regression_loss: 1.5230 - classification_loss: 0.3234 359/500 [====================>.........] - ETA: 35s - loss: 1.8481 - regression_loss: 1.5244 - classification_loss: 0.3238 360/500 [====================>.........] - ETA: 34s - loss: 1.8480 - regression_loss: 1.5242 - classification_loss: 0.3238 361/500 [====================>.........] - ETA: 34s - loss: 1.8471 - regression_loss: 1.5234 - classification_loss: 0.3236 362/500 [====================>.........] - ETA: 34s - loss: 1.8467 - regression_loss: 1.5233 - classification_loss: 0.3235 363/500 [====================>.........] - ETA: 34s - loss: 1.8463 - regression_loss: 1.5230 - classification_loss: 0.3233 364/500 [====================>.........] - ETA: 33s - loss: 1.8464 - regression_loss: 1.5231 - classification_loss: 0.3233 365/500 [====================>.........] - ETA: 33s - loss: 1.8457 - regression_loss: 1.5226 - classification_loss: 0.3231 366/500 [====================>.........] - ETA: 33s - loss: 1.8432 - regression_loss: 1.5204 - classification_loss: 0.3228 367/500 [=====================>........] - ETA: 33s - loss: 1.8426 - regression_loss: 1.5201 - classification_loss: 0.3225 368/500 [=====================>........] - ETA: 32s - loss: 1.8435 - regression_loss: 1.5209 - classification_loss: 0.3226 369/500 [=====================>........] - ETA: 32s - loss: 1.8430 - regression_loss: 1.5205 - classification_loss: 0.3225 370/500 [=====================>........] - ETA: 32s - loss: 1.8442 - regression_loss: 1.5215 - classification_loss: 0.3227 371/500 [=====================>........] - ETA: 32s - loss: 1.8461 - regression_loss: 1.5231 - classification_loss: 0.3230 372/500 [=====================>........] 
- ETA: 31s - loss: 1.8477 - regression_loss: 1.5243 - classification_loss: 0.3234 373/500 [=====================>........] - ETA: 31s - loss: 1.8494 - regression_loss: 1.5258 - classification_loss: 0.3236 374/500 [=====================>........] - ETA: 31s - loss: 1.8484 - regression_loss: 1.5249 - classification_loss: 0.3235 375/500 [=====================>........] - ETA: 31s - loss: 1.8492 - regression_loss: 1.5255 - classification_loss: 0.3237 376/500 [=====================>........] - ETA: 30s - loss: 1.8464 - regression_loss: 1.5231 - classification_loss: 0.3233 377/500 [=====================>........] - ETA: 30s - loss: 1.8451 - regression_loss: 1.5220 - classification_loss: 0.3231 378/500 [=====================>........] - ETA: 30s - loss: 1.8445 - regression_loss: 1.5215 - classification_loss: 0.3230 379/500 [=====================>........] - ETA: 30s - loss: 1.8454 - regression_loss: 1.5223 - classification_loss: 0.3230 380/500 [=====================>........] - ETA: 29s - loss: 1.8428 - regression_loss: 1.5202 - classification_loss: 0.3226 381/500 [=====================>........] - ETA: 29s - loss: 1.8423 - regression_loss: 1.5198 - classification_loss: 0.3225 382/500 [=====================>........] - ETA: 29s - loss: 1.8438 - regression_loss: 1.5210 - classification_loss: 0.3228 383/500 [=====================>........] - ETA: 29s - loss: 1.8430 - regression_loss: 1.5203 - classification_loss: 0.3227 384/500 [======================>.......] - ETA: 28s - loss: 1.8412 - regression_loss: 1.5188 - classification_loss: 0.3224 385/500 [======================>.......] - ETA: 28s - loss: 1.8395 - regression_loss: 1.5175 - classification_loss: 0.3220 386/500 [======================>.......] - ETA: 28s - loss: 1.8398 - regression_loss: 1.5179 - classification_loss: 0.3220 387/500 [======================>.......] - ETA: 28s - loss: 1.8381 - regression_loss: 1.5160 - classification_loss: 0.3221 388/500 [======================>.......] 
[per-batch progress for steps 389-499 of epoch 57 elided; loss held near 1.84 (regression_loss ~1.52, classification_loss ~0.32) throughout]
500/500 [==============================] - 125s 250ms/step - loss: 1.8378 - regression_loss: 1.5168 - classification_loss: 0.3210
1172 instances of class plum with average precision: 0.5570
mAP: 0.5570
Epoch 00057: saving model to ./training/snapshots/resnet50_pascal_57.h5
Epoch 58/150
[per-batch progress for steps 1-13 of epoch 58 elided; loss fell from 2.87 at step 1 to about 1.95 by step 13]
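Every wrapped progress record above shares one machine-readable shape (`step/total [bar] ... loss: X - regression_loss: Y - classification_loss: Z`). A minimal sketch for recovering the loss curve from such a log; the regex and helper names here are my own, not part of keras-retinanet:

```python
import re

# Matches one Keras progress-bar metric record, e.g.
#   "400/500 [====>...] - ETA: 24s - loss: 1.8419 - regression_loss: 1.5199 - classification_loss: 0.3220"
STEP_RE = re.compile(
    r"(?P<step>\d+)/(?P<total>\d+)\s*\[[=>.]+\]"
    r".*?loss:\s*(?P<loss>\d+\.\d+)"
    r".*?regression_loss:\s*(?P<reg>\d+\.\d+)"
    r".*?classification_loss:\s*(?P<cls>\d+\.\d+)"
)

def parse_steps(log_text):
    """Extract (step, loss, regression_loss, classification_loss) tuples from log text."""
    return [
        (int(m["step"]), float(m["loss"]), float(m["reg"]), float(m["cls"]))
        for m in STEP_RE.finditer(log_text)
    ]

line = ("400/500 [=======================>......] - ETA: 24s - loss: 1.8419 "
        "- regression_loss: 1.5199 - classification_loss: 0.3220")
print(parse_steps(line))
```

Feeding the whole log through `parse_steps` gives a per-step series that can be plotted to inspect convergence instead of scrolling these records.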
[per-batch progress for steps 14-222 of epoch 58 elided; loss gradually settled into the 1.83-1.86 range (regression_loss ~1.52, classification_loss ~0.32)]
- ETA: 1:09 - loss: 1.8355 - regression_loss: 1.5161 - classification_loss: 0.3194 223/500 [============>.................] - ETA: 1:09 - loss: 1.8355 - regression_loss: 1.5165 - classification_loss: 0.3190 224/500 [============>.................] - ETA: 1:09 - loss: 1.8341 - regression_loss: 1.5155 - classification_loss: 0.3187 225/500 [============>.................] - ETA: 1:08 - loss: 1.8304 - regression_loss: 1.5122 - classification_loss: 0.3181 226/500 [============>.................] - ETA: 1:08 - loss: 1.8284 - regression_loss: 1.5102 - classification_loss: 0.3182 227/500 [============>.................] - ETA: 1:08 - loss: 1.8267 - regression_loss: 1.5089 - classification_loss: 0.3177 228/500 [============>.................] - ETA: 1:08 - loss: 1.8272 - regression_loss: 1.5095 - classification_loss: 0.3177 229/500 [============>.................] - ETA: 1:07 - loss: 1.8262 - regression_loss: 1.5082 - classification_loss: 0.3181 230/500 [============>.................] - ETA: 1:07 - loss: 1.8254 - regression_loss: 1.5078 - classification_loss: 0.3177 231/500 [============>.................] - ETA: 1:07 - loss: 1.8249 - regression_loss: 1.5077 - classification_loss: 0.3172 232/500 [============>.................] - ETA: 1:07 - loss: 1.8242 - regression_loss: 1.5072 - classification_loss: 0.3170 233/500 [============>.................] - ETA: 1:07 - loss: 1.8253 - regression_loss: 1.5081 - classification_loss: 0.3172 234/500 [=============>................] - ETA: 1:06 - loss: 1.8256 - regression_loss: 1.5086 - classification_loss: 0.3170 235/500 [=============>................] - ETA: 1:06 - loss: 1.8234 - regression_loss: 1.5066 - classification_loss: 0.3168 236/500 [=============>................] - ETA: 1:06 - loss: 1.8228 - regression_loss: 1.5066 - classification_loss: 0.3162 237/500 [=============>................] - ETA: 1:06 - loss: 1.8249 - regression_loss: 1.5081 - classification_loss: 0.3169 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.8249 - regression_loss: 1.5082 - classification_loss: 0.3167 239/500 [=============>................] - ETA: 1:05 - loss: 1.8209 - regression_loss: 1.5051 - classification_loss: 0.3158 240/500 [=============>................] - ETA: 1:05 - loss: 1.8205 - regression_loss: 1.5049 - classification_loss: 0.3156 241/500 [=============>................] - ETA: 1:05 - loss: 1.8198 - regression_loss: 1.5045 - classification_loss: 0.3153 242/500 [=============>................] - ETA: 1:04 - loss: 1.8172 - regression_loss: 1.5025 - classification_loss: 0.3147 243/500 [=============>................] - ETA: 1:04 - loss: 1.8179 - regression_loss: 1.5025 - classification_loss: 0.3154 244/500 [=============>................] - ETA: 1:04 - loss: 1.8187 - regression_loss: 1.5031 - classification_loss: 0.3156 245/500 [=============>................] - ETA: 1:04 - loss: 1.8175 - regression_loss: 1.5025 - classification_loss: 0.3151 246/500 [=============>................] - ETA: 1:03 - loss: 1.8170 - regression_loss: 1.5021 - classification_loss: 0.3148 247/500 [=============>................] - ETA: 1:03 - loss: 1.8167 - regression_loss: 1.5021 - classification_loss: 0.3146 248/500 [=============>................] - ETA: 1:03 - loss: 1.8156 - regression_loss: 1.5014 - classification_loss: 0.3141 249/500 [=============>................] - ETA: 1:03 - loss: 1.8154 - regression_loss: 1.5011 - classification_loss: 0.3143 250/500 [==============>...............] - ETA: 1:02 - loss: 1.8160 - regression_loss: 1.5017 - classification_loss: 0.3144 251/500 [==============>...............] - ETA: 1:02 - loss: 1.8180 - regression_loss: 1.5033 - classification_loss: 0.3147 252/500 [==============>...............] - ETA: 1:02 - loss: 1.8181 - regression_loss: 1.5036 - classification_loss: 0.3145 253/500 [==============>...............] - ETA: 1:02 - loss: 1.8165 - regression_loss: 1.5022 - classification_loss: 0.3144 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.8152 - regression_loss: 1.5009 - classification_loss: 0.3143 255/500 [==============>...............] - ETA: 1:01 - loss: 1.8114 - regression_loss: 1.4976 - classification_loss: 0.3139 256/500 [==============>...............] - ETA: 1:01 - loss: 1.8086 - regression_loss: 1.4952 - classification_loss: 0.3134 257/500 [==============>...............] - ETA: 1:01 - loss: 1.8088 - regression_loss: 1.4955 - classification_loss: 0.3133 258/500 [==============>...............] - ETA: 1:00 - loss: 1.8083 - regression_loss: 1.4953 - classification_loss: 0.3131 259/500 [==============>...............] - ETA: 1:00 - loss: 1.8098 - regression_loss: 1.4963 - classification_loss: 0.3134 260/500 [==============>...............] - ETA: 1:00 - loss: 1.8105 - regression_loss: 1.4967 - classification_loss: 0.3138 261/500 [==============>...............] - ETA: 1:00 - loss: 1.8135 - regression_loss: 1.4991 - classification_loss: 0.3143 262/500 [==============>...............] - ETA: 59s - loss: 1.8123 - regression_loss: 1.4982 - classification_loss: 0.3141  263/500 [==============>...............] - ETA: 59s - loss: 1.8139 - regression_loss: 1.4995 - classification_loss: 0.3144 264/500 [==============>...............] - ETA: 59s - loss: 1.8150 - regression_loss: 1.5002 - classification_loss: 0.3148 265/500 [==============>...............] - ETA: 59s - loss: 1.8153 - regression_loss: 1.5003 - classification_loss: 0.3149 266/500 [==============>...............] - ETA: 58s - loss: 1.8174 - regression_loss: 1.5021 - classification_loss: 0.3152 267/500 [===============>..............] - ETA: 58s - loss: 1.8151 - regression_loss: 1.5005 - classification_loss: 0.3146 268/500 [===============>..............] - ETA: 58s - loss: 1.8159 - regression_loss: 1.5013 - classification_loss: 0.3145 269/500 [===============>..............] - ETA: 58s - loss: 1.8152 - regression_loss: 1.5010 - classification_loss: 0.3142 270/500 [===============>..............] 
- ETA: 57s - loss: 1.8153 - regression_loss: 1.5011 - classification_loss: 0.3141 271/500 [===============>..............] - ETA: 57s - loss: 1.8121 - regression_loss: 1.4987 - classification_loss: 0.3134 272/500 [===============>..............] - ETA: 57s - loss: 1.8118 - regression_loss: 1.4983 - classification_loss: 0.3135 273/500 [===============>..............] - ETA: 57s - loss: 1.8141 - regression_loss: 1.4995 - classification_loss: 0.3146 274/500 [===============>..............] - ETA: 56s - loss: 1.8148 - regression_loss: 1.5002 - classification_loss: 0.3146 275/500 [===============>..............] - ETA: 56s - loss: 1.8150 - regression_loss: 1.5002 - classification_loss: 0.3148 276/500 [===============>..............] - ETA: 56s - loss: 1.8148 - regression_loss: 1.5001 - classification_loss: 0.3147 277/500 [===============>..............] - ETA: 56s - loss: 1.8152 - regression_loss: 1.5005 - classification_loss: 0.3147 278/500 [===============>..............] - ETA: 55s - loss: 1.8146 - regression_loss: 1.5003 - classification_loss: 0.3144 279/500 [===============>..............] - ETA: 55s - loss: 1.8151 - regression_loss: 1.5009 - classification_loss: 0.3143 280/500 [===============>..............] - ETA: 55s - loss: 1.8167 - regression_loss: 1.5023 - classification_loss: 0.3144 281/500 [===============>..............] - ETA: 55s - loss: 1.8176 - regression_loss: 1.5029 - classification_loss: 0.3147 282/500 [===============>..............] - ETA: 54s - loss: 1.8156 - regression_loss: 1.5011 - classification_loss: 0.3145 283/500 [===============>..............] - ETA: 54s - loss: 1.8126 - regression_loss: 1.4988 - classification_loss: 0.3139 284/500 [================>.............] - ETA: 54s - loss: 1.8145 - regression_loss: 1.5002 - classification_loss: 0.3143 285/500 [================>.............] - ETA: 54s - loss: 1.8138 - regression_loss: 1.4977 - classification_loss: 0.3161 286/500 [================>.............] 
- ETA: 53s - loss: 1.8131 - regression_loss: 1.4973 - classification_loss: 0.3158 287/500 [================>.............] - ETA: 53s - loss: 1.8136 - regression_loss: 1.4979 - classification_loss: 0.3157 288/500 [================>.............] - ETA: 53s - loss: 1.8133 - regression_loss: 1.4976 - classification_loss: 0.3156 289/500 [================>.............] - ETA: 53s - loss: 1.8123 - regression_loss: 1.4970 - classification_loss: 0.3153 290/500 [================>.............] - ETA: 52s - loss: 1.8105 - regression_loss: 1.4955 - classification_loss: 0.3151 291/500 [================>.............] - ETA: 52s - loss: 1.8116 - regression_loss: 1.4965 - classification_loss: 0.3151 292/500 [================>.............] - ETA: 52s - loss: 1.8106 - regression_loss: 1.4955 - classification_loss: 0.3151 293/500 [================>.............] - ETA: 52s - loss: 1.8101 - regression_loss: 1.4952 - classification_loss: 0.3149 294/500 [================>.............] - ETA: 51s - loss: 1.8108 - regression_loss: 1.4959 - classification_loss: 0.3149 295/500 [================>.............] - ETA: 51s - loss: 1.8110 - regression_loss: 1.4959 - classification_loss: 0.3150 296/500 [================>.............] - ETA: 51s - loss: 1.8120 - regression_loss: 1.4968 - classification_loss: 0.3152 297/500 [================>.............] - ETA: 51s - loss: 1.8132 - regression_loss: 1.4977 - classification_loss: 0.3155 298/500 [================>.............] - ETA: 50s - loss: 1.8155 - regression_loss: 1.4995 - classification_loss: 0.3160 299/500 [================>.............] - ETA: 50s - loss: 1.8140 - regression_loss: 1.4981 - classification_loss: 0.3158 300/500 [=================>............] - ETA: 50s - loss: 1.8151 - regression_loss: 1.4990 - classification_loss: 0.3161 301/500 [=================>............] - ETA: 50s - loss: 1.8151 - regression_loss: 1.4990 - classification_loss: 0.3160 302/500 [=================>............] 
- ETA: 49s - loss: 1.8113 - regression_loss: 1.4959 - classification_loss: 0.3154 303/500 [=================>............] - ETA: 49s - loss: 1.8119 - regression_loss: 1.4965 - classification_loss: 0.3154 304/500 [=================>............] - ETA: 49s - loss: 1.8123 - regression_loss: 1.4963 - classification_loss: 0.3160 305/500 [=================>............] - ETA: 49s - loss: 1.8124 - regression_loss: 1.4965 - classification_loss: 0.3159 306/500 [=================>............] - ETA: 48s - loss: 1.8114 - regression_loss: 1.4955 - classification_loss: 0.3159 307/500 [=================>............] - ETA: 48s - loss: 1.8086 - regression_loss: 1.4933 - classification_loss: 0.3153 308/500 [=================>............] - ETA: 48s - loss: 1.8092 - regression_loss: 1.4939 - classification_loss: 0.3153 309/500 [=================>............] - ETA: 48s - loss: 1.8102 - regression_loss: 1.4948 - classification_loss: 0.3154 310/500 [=================>............] - ETA: 47s - loss: 1.8091 - regression_loss: 1.4940 - classification_loss: 0.3151 311/500 [=================>............] - ETA: 47s - loss: 1.8094 - regression_loss: 1.4941 - classification_loss: 0.3153 312/500 [=================>............] - ETA: 47s - loss: 1.8076 - regression_loss: 1.4927 - classification_loss: 0.3149 313/500 [=================>............] - ETA: 47s - loss: 1.8089 - regression_loss: 1.4938 - classification_loss: 0.3152 314/500 [=================>............] - ETA: 46s - loss: 1.8103 - regression_loss: 1.4947 - classification_loss: 0.3156 315/500 [=================>............] - ETA: 46s - loss: 1.8110 - regression_loss: 1.4948 - classification_loss: 0.3163 316/500 [=================>............] - ETA: 46s - loss: 1.8119 - regression_loss: 1.4957 - classification_loss: 0.3163 317/500 [==================>...........] - ETA: 46s - loss: 1.8123 - regression_loss: 1.4959 - classification_loss: 0.3164 318/500 [==================>...........] 
- ETA: 45s - loss: 1.8132 - regression_loss: 1.4966 - classification_loss: 0.3166 319/500 [==================>...........] - ETA: 45s - loss: 1.8149 - regression_loss: 1.4978 - classification_loss: 0.3171 320/500 [==================>...........] - ETA: 45s - loss: 1.8144 - regression_loss: 1.4974 - classification_loss: 0.3170 321/500 [==================>...........] - ETA: 45s - loss: 1.8151 - regression_loss: 1.4978 - classification_loss: 0.3173 322/500 [==================>...........] - ETA: 44s - loss: 1.8146 - regression_loss: 1.4974 - classification_loss: 0.3172 323/500 [==================>...........] - ETA: 44s - loss: 1.8144 - regression_loss: 1.4975 - classification_loss: 0.3170 324/500 [==================>...........] - ETA: 44s - loss: 1.8158 - regression_loss: 1.4986 - classification_loss: 0.3173 325/500 [==================>...........] - ETA: 44s - loss: 1.8163 - regression_loss: 1.4993 - classification_loss: 0.3170 326/500 [==================>...........] - ETA: 43s - loss: 1.8171 - regression_loss: 1.5000 - classification_loss: 0.3172 327/500 [==================>...........] - ETA: 43s - loss: 1.8167 - regression_loss: 1.4998 - classification_loss: 0.3169 328/500 [==================>...........] - ETA: 43s - loss: 1.8149 - regression_loss: 1.4980 - classification_loss: 0.3169 329/500 [==================>...........] - ETA: 43s - loss: 1.8166 - regression_loss: 1.4997 - classification_loss: 0.3169 330/500 [==================>...........] - ETA: 42s - loss: 1.8150 - regression_loss: 1.4983 - classification_loss: 0.3167 331/500 [==================>...........] - ETA: 42s - loss: 1.8164 - regression_loss: 1.4993 - classification_loss: 0.3171 332/500 [==================>...........] - ETA: 42s - loss: 1.8172 - regression_loss: 1.5001 - classification_loss: 0.3171 333/500 [==================>...........] - ETA: 42s - loss: 1.8178 - regression_loss: 1.5005 - classification_loss: 0.3172 334/500 [===================>..........] 
- ETA: 41s - loss: 1.8196 - regression_loss: 1.5020 - classification_loss: 0.3176 335/500 [===================>..........] - ETA: 41s - loss: 1.8206 - regression_loss: 1.5028 - classification_loss: 0.3178 336/500 [===================>..........] - ETA: 41s - loss: 1.8181 - regression_loss: 1.5009 - classification_loss: 0.3172 337/500 [===================>..........] - ETA: 41s - loss: 1.8180 - regression_loss: 1.5010 - classification_loss: 0.3170 338/500 [===================>..........] - ETA: 40s - loss: 1.8179 - regression_loss: 1.5009 - classification_loss: 0.3170 339/500 [===================>..........] - ETA: 40s - loss: 1.8176 - regression_loss: 1.5007 - classification_loss: 0.3169 340/500 [===================>..........] - ETA: 40s - loss: 1.8170 - regression_loss: 1.5003 - classification_loss: 0.3167 341/500 [===================>..........] - ETA: 40s - loss: 1.8160 - regression_loss: 1.4996 - classification_loss: 0.3165 342/500 [===================>..........] - ETA: 39s - loss: 1.8173 - regression_loss: 1.5007 - classification_loss: 0.3166 343/500 [===================>..........] - ETA: 39s - loss: 1.8182 - regression_loss: 1.5015 - classification_loss: 0.3167 344/500 [===================>..........] - ETA: 39s - loss: 1.8180 - regression_loss: 1.5011 - classification_loss: 0.3169 345/500 [===================>..........] - ETA: 38s - loss: 1.8187 - regression_loss: 1.5017 - classification_loss: 0.3170 346/500 [===================>..........] - ETA: 38s - loss: 1.8186 - regression_loss: 1.5015 - classification_loss: 0.3171 347/500 [===================>..........] - ETA: 38s - loss: 1.8189 - regression_loss: 1.5019 - classification_loss: 0.3169 348/500 [===================>..........] - ETA: 38s - loss: 1.8194 - regression_loss: 1.5021 - classification_loss: 0.3173 349/500 [===================>..........] - ETA: 37s - loss: 1.8190 - regression_loss: 1.5018 - classification_loss: 0.3172 350/500 [====================>.........] 
- ETA: 37s - loss: 1.8172 - regression_loss: 1.5003 - classification_loss: 0.3170 351/500 [====================>.........] - ETA: 37s - loss: 1.8186 - regression_loss: 1.5014 - classification_loss: 0.3172 352/500 [====================>.........] - ETA: 37s - loss: 1.8187 - regression_loss: 1.5016 - classification_loss: 0.3171 353/500 [====================>.........] - ETA: 36s - loss: 1.8189 - regression_loss: 1.5019 - classification_loss: 0.3170 354/500 [====================>.........] - ETA: 36s - loss: 1.8192 - regression_loss: 1.5023 - classification_loss: 0.3169 355/500 [====================>.........] - ETA: 36s - loss: 1.8189 - regression_loss: 1.5019 - classification_loss: 0.3170 356/500 [====================>.........] - ETA: 36s - loss: 1.8179 - regression_loss: 1.5011 - classification_loss: 0.3168 357/500 [====================>.........] - ETA: 35s - loss: 1.8184 - regression_loss: 1.5015 - classification_loss: 0.3169 358/500 [====================>.........] - ETA: 35s - loss: 1.8195 - regression_loss: 1.5023 - classification_loss: 0.3172 359/500 [====================>.........] - ETA: 35s - loss: 1.8193 - regression_loss: 1.5022 - classification_loss: 0.3171 360/500 [====================>.........] - ETA: 35s - loss: 1.8178 - regression_loss: 1.5007 - classification_loss: 0.3171 361/500 [====================>.........] - ETA: 34s - loss: 1.8173 - regression_loss: 1.5004 - classification_loss: 0.3169 362/500 [====================>.........] - ETA: 34s - loss: 1.8178 - regression_loss: 1.5010 - classification_loss: 0.3168 363/500 [====================>.........] - ETA: 34s - loss: 1.8165 - regression_loss: 1.4997 - classification_loss: 0.3168 364/500 [====================>.........] - ETA: 34s - loss: 1.8183 - regression_loss: 1.5009 - classification_loss: 0.3174 365/500 [====================>.........] - ETA: 33s - loss: 1.8168 - regression_loss: 1.5001 - classification_loss: 0.3168 366/500 [====================>.........] 
- ETA: 33s - loss: 1.8167 - regression_loss: 1.4999 - classification_loss: 0.3168 367/500 [=====================>........] - ETA: 33s - loss: 1.8175 - regression_loss: 1.5006 - classification_loss: 0.3169 368/500 [=====================>........] - ETA: 33s - loss: 1.8191 - regression_loss: 1.5020 - classification_loss: 0.3172 369/500 [=====================>........] - ETA: 32s - loss: 1.8188 - regression_loss: 1.5019 - classification_loss: 0.3170 370/500 [=====================>........] - ETA: 32s - loss: 1.8175 - regression_loss: 1.5007 - classification_loss: 0.3168 371/500 [=====================>........] - ETA: 32s - loss: 1.8169 - regression_loss: 1.5003 - classification_loss: 0.3166 372/500 [=====================>........] - ETA: 32s - loss: 1.8176 - regression_loss: 1.5007 - classification_loss: 0.3170 373/500 [=====================>........] - ETA: 31s - loss: 1.8187 - regression_loss: 1.5016 - classification_loss: 0.3171 374/500 [=====================>........] - ETA: 31s - loss: 1.8200 - regression_loss: 1.5028 - classification_loss: 0.3172 375/500 [=====================>........] - ETA: 31s - loss: 1.8193 - regression_loss: 1.5022 - classification_loss: 0.3171 376/500 [=====================>........] - ETA: 31s - loss: 1.8202 - regression_loss: 1.5030 - classification_loss: 0.3172 377/500 [=====================>........] - ETA: 30s - loss: 1.8203 - regression_loss: 1.5029 - classification_loss: 0.3174 378/500 [=====================>........] - ETA: 30s - loss: 1.8210 - regression_loss: 1.5034 - classification_loss: 0.3176 379/500 [=====================>........] - ETA: 30s - loss: 1.8253 - regression_loss: 1.5058 - classification_loss: 0.3194 380/500 [=====================>........] - ETA: 30s - loss: 1.8271 - regression_loss: 1.5070 - classification_loss: 0.3201 381/500 [=====================>........] - ETA: 29s - loss: 1.8259 - regression_loss: 1.5060 - classification_loss: 0.3198 382/500 [=====================>........] 
- ETA: 29s - loss: 1.8254 - regression_loss: 1.5055 - classification_loss: 0.3199 383/500 [=====================>........] - ETA: 29s - loss: 1.8247 - regression_loss: 1.5050 - classification_loss: 0.3197 384/500 [======================>.......] - ETA: 29s - loss: 1.8265 - regression_loss: 1.5061 - classification_loss: 0.3204 385/500 [======================>.......] - ETA: 28s - loss: 1.8245 - regression_loss: 1.5044 - classification_loss: 0.3201 386/500 [======================>.......] - ETA: 28s - loss: 1.8247 - regression_loss: 1.5044 - classification_loss: 0.3203 387/500 [======================>.......] - ETA: 28s - loss: 1.8251 - regression_loss: 1.5045 - classification_loss: 0.3206 388/500 [======================>.......] - ETA: 28s - loss: 1.8256 - regression_loss: 1.5050 - classification_loss: 0.3206 389/500 [======================>.......] - ETA: 27s - loss: 1.8267 - regression_loss: 1.5056 - classification_loss: 0.3211 390/500 [======================>.......] - ETA: 27s - loss: 1.8267 - regression_loss: 1.5055 - classification_loss: 0.3211 391/500 [======================>.......] - ETA: 27s - loss: 1.8263 - regression_loss: 1.5053 - classification_loss: 0.3210 392/500 [======================>.......] - ETA: 27s - loss: 1.8267 - regression_loss: 1.5057 - classification_loss: 0.3209 393/500 [======================>.......] - ETA: 26s - loss: 1.8268 - regression_loss: 1.5059 - classification_loss: 0.3209 394/500 [======================>.......] - ETA: 26s - loss: 1.8295 - regression_loss: 1.5080 - classification_loss: 0.3215 395/500 [======================>.......] - ETA: 26s - loss: 1.8304 - regression_loss: 1.5087 - classification_loss: 0.3217 396/500 [======================>.......] - ETA: 26s - loss: 1.8286 - regression_loss: 1.5074 - classification_loss: 0.3212 397/500 [======================>.......] - ETA: 25s - loss: 1.8289 - regression_loss: 1.5075 - classification_loss: 0.3213 398/500 [======================>.......] 
- ETA: 25s - loss: 1.8280 - regression_loss: 1.5068 - classification_loss: 0.3212 399/500 [======================>.......] - ETA: 25s - loss: 1.8284 - regression_loss: 1.5073 - classification_loss: 0.3211 400/500 [=======================>......] - ETA: 25s - loss: 1.8266 - regression_loss: 1.5058 - classification_loss: 0.3208 401/500 [=======================>......] - ETA: 24s - loss: 1.8270 - regression_loss: 1.5062 - classification_loss: 0.3207 402/500 [=======================>......] - ETA: 24s - loss: 1.8288 - regression_loss: 1.5076 - classification_loss: 0.3212 403/500 [=======================>......] - ETA: 24s - loss: 1.8271 - regression_loss: 1.5061 - classification_loss: 0.3209 404/500 [=======================>......] - ETA: 24s - loss: 1.8267 - regression_loss: 1.5058 - classification_loss: 0.3208 405/500 [=======================>......] - ETA: 23s - loss: 1.8265 - regression_loss: 1.5057 - classification_loss: 0.3208 406/500 [=======================>......] - ETA: 23s - loss: 1.8290 - regression_loss: 1.5079 - classification_loss: 0.3211 407/500 [=======================>......] - ETA: 23s - loss: 1.8289 - regression_loss: 1.5079 - classification_loss: 0.3210 408/500 [=======================>......] - ETA: 23s - loss: 1.8291 - regression_loss: 1.5081 - classification_loss: 0.3210 409/500 [=======================>......] - ETA: 22s - loss: 1.8292 - regression_loss: 1.5083 - classification_loss: 0.3209 410/500 [=======================>......] - ETA: 22s - loss: 1.8292 - regression_loss: 1.5083 - classification_loss: 0.3208 411/500 [=======================>......] - ETA: 22s - loss: 1.8303 - regression_loss: 1.5091 - classification_loss: 0.3212 412/500 [=======================>......] - ETA: 22s - loss: 1.8308 - regression_loss: 1.5097 - classification_loss: 0.3211 413/500 [=======================>......] - ETA: 21s - loss: 1.8310 - regression_loss: 1.5101 - classification_loss: 0.3210 414/500 [=======================>......] 
- ETA: 21s - loss: 1.8317 - regression_loss: 1.5106 - classification_loss: 0.3210 415/500 [=======================>......] - ETA: 21s - loss: 1.8295 - regression_loss: 1.5088 - classification_loss: 0.3207 416/500 [=======================>......] - ETA: 21s - loss: 1.8285 - regression_loss: 1.5082 - classification_loss: 0.3204 417/500 [========================>.....] - ETA: 20s - loss: 1.8293 - regression_loss: 1.5087 - classification_loss: 0.3206 418/500 [========================>.....] - ETA: 20s - loss: 1.8287 - regression_loss: 1.5084 - classification_loss: 0.3203 419/500 [========================>.....] - ETA: 20s - loss: 1.8263 - regression_loss: 1.5062 - classification_loss: 0.3201 420/500 [========================>.....] - ETA: 20s - loss: 1.8246 - regression_loss: 1.5050 - classification_loss: 0.3196 421/500 [========================>.....] - ETA: 19s - loss: 1.8256 - regression_loss: 1.5055 - classification_loss: 0.3202 422/500 [========================>.....] - ETA: 19s - loss: 1.8263 - regression_loss: 1.5062 - classification_loss: 0.3201 423/500 [========================>.....] - ETA: 19s - loss: 1.8263 - regression_loss: 1.5061 - classification_loss: 0.3202 424/500 [========================>.....] - ETA: 19s - loss: 1.8276 - regression_loss: 1.5073 - classification_loss: 0.3203 425/500 [========================>.....] - ETA: 18s - loss: 1.8281 - regression_loss: 1.5077 - classification_loss: 0.3204 426/500 [========================>.....] - ETA: 18s - loss: 1.8280 - regression_loss: 1.5078 - classification_loss: 0.3202 427/500 [========================>.....] - ETA: 18s - loss: 1.8263 - regression_loss: 1.5064 - classification_loss: 0.3200 428/500 [========================>.....] - ETA: 18s - loss: 1.8238 - regression_loss: 1.5044 - classification_loss: 0.3194 429/500 [========================>.....] - ETA: 17s - loss: 1.8233 - regression_loss: 1.5040 - classification_loss: 0.3193 430/500 [========================>.....] 
[... per-batch progress output for epoch 58 elided ...]
500/500 [==============================] - 126s 251ms/step - loss: 1.8161 - regression_loss: 1.4983 - classification_loss: 0.3178
1172 instances of class plum with average precision: 0.6020
mAP: 0.6020
Epoch 00058: saving model to ./training/snapshots/resnet50_pascal_58.h5
Epoch 59/150
[... per-batch progress output for epoch 59 elided ...]
- ETA: 58s - loss: 1.7715 - regression_loss: 1.4643 - classification_loss: 0.3072 266/500 [==============>...............] - ETA: 58s - loss: 1.7708 - regression_loss: 1.4640 - classification_loss: 0.3068 267/500 [===============>..............] - ETA: 58s - loss: 1.7685 - regression_loss: 1.4621 - classification_loss: 0.3064 268/500 [===============>..............] - ETA: 58s - loss: 1.7693 - regression_loss: 1.4627 - classification_loss: 0.3066 269/500 [===============>..............] - ETA: 57s - loss: 1.7696 - regression_loss: 1.4626 - classification_loss: 0.3070 270/500 [===============>..............] - ETA: 57s - loss: 1.7669 - regression_loss: 1.4603 - classification_loss: 0.3065 271/500 [===============>..............] - ETA: 57s - loss: 1.7670 - regression_loss: 1.4608 - classification_loss: 0.3062 272/500 [===============>..............] - ETA: 57s - loss: 1.7673 - regression_loss: 1.4613 - classification_loss: 0.3060 273/500 [===============>..............] - ETA: 56s - loss: 1.7677 - regression_loss: 1.4617 - classification_loss: 0.3060 274/500 [===============>..............] - ETA: 56s - loss: 1.7674 - regression_loss: 1.4615 - classification_loss: 0.3059 275/500 [===============>..............] - ETA: 56s - loss: 1.7669 - regression_loss: 1.4612 - classification_loss: 0.3057 276/500 [===============>..............] - ETA: 56s - loss: 1.7676 - regression_loss: 1.4619 - classification_loss: 0.3057 277/500 [===============>..............] - ETA: 55s - loss: 1.7676 - regression_loss: 1.4619 - classification_loss: 0.3058 278/500 [===============>..............] - ETA: 55s - loss: 1.7683 - regression_loss: 1.4624 - classification_loss: 0.3059 279/500 [===============>..............] - ETA: 55s - loss: 1.7680 - regression_loss: 1.4621 - classification_loss: 0.3058 280/500 [===============>..............] - ETA: 55s - loss: 1.7672 - regression_loss: 1.4605 - classification_loss: 0.3066 281/500 [===============>..............] 
- ETA: 54s - loss: 1.7693 - regression_loss: 1.4621 - classification_loss: 0.3072 282/500 [===============>..............] - ETA: 54s - loss: 1.7686 - regression_loss: 1.4616 - classification_loss: 0.3070 283/500 [===============>..............] - ETA: 54s - loss: 1.7689 - regression_loss: 1.4623 - classification_loss: 0.3067 284/500 [================>.............] - ETA: 54s - loss: 1.7663 - regression_loss: 1.4602 - classification_loss: 0.3061 285/500 [================>.............] - ETA: 53s - loss: 1.7642 - regression_loss: 1.4585 - classification_loss: 0.3058 286/500 [================>.............] - ETA: 53s - loss: 1.7643 - regression_loss: 1.4587 - classification_loss: 0.3057 287/500 [================>.............] - ETA: 53s - loss: 1.7661 - regression_loss: 1.4601 - classification_loss: 0.3059 288/500 [================>.............] - ETA: 53s - loss: 1.7683 - regression_loss: 1.4618 - classification_loss: 0.3065 289/500 [================>.............] - ETA: 52s - loss: 1.7674 - regression_loss: 1.4611 - classification_loss: 0.3063 290/500 [================>.............] - ETA: 52s - loss: 1.7683 - regression_loss: 1.4621 - classification_loss: 0.3062 291/500 [================>.............] - ETA: 52s - loss: 1.7696 - regression_loss: 1.4630 - classification_loss: 0.3065 292/500 [================>.............] - ETA: 52s - loss: 1.7678 - regression_loss: 1.4618 - classification_loss: 0.3059 293/500 [================>.............] - ETA: 51s - loss: 1.7716 - regression_loss: 1.4642 - classification_loss: 0.3074 294/500 [================>.............] - ETA: 51s - loss: 1.7724 - regression_loss: 1.4650 - classification_loss: 0.3074 295/500 [================>.............] - ETA: 51s - loss: 1.7740 - regression_loss: 1.4663 - classification_loss: 0.3077 296/500 [================>.............] - ETA: 51s - loss: 1.7744 - regression_loss: 1.4666 - classification_loss: 0.3078 297/500 [================>.............] 
- ETA: 50s - loss: 1.7720 - regression_loss: 1.4645 - classification_loss: 0.3075 298/500 [================>.............] - ETA: 50s - loss: 1.7708 - regression_loss: 1.4636 - classification_loss: 0.3072 299/500 [================>.............] - ETA: 50s - loss: 1.7713 - regression_loss: 1.4639 - classification_loss: 0.3074 300/500 [=================>............] - ETA: 50s - loss: 1.7728 - regression_loss: 1.4655 - classification_loss: 0.3073 301/500 [=================>............] - ETA: 49s - loss: 1.7693 - regression_loss: 1.4625 - classification_loss: 0.3068 302/500 [=================>............] - ETA: 49s - loss: 1.7693 - regression_loss: 1.4625 - classification_loss: 0.3069 303/500 [=================>............] - ETA: 49s - loss: 1.7690 - regression_loss: 1.4622 - classification_loss: 0.3068 304/500 [=================>............] - ETA: 49s - loss: 1.7705 - regression_loss: 1.4635 - classification_loss: 0.3070 305/500 [=================>............] - ETA: 48s - loss: 1.7719 - regression_loss: 1.4648 - classification_loss: 0.3071 306/500 [=================>............] - ETA: 48s - loss: 1.7746 - regression_loss: 1.4671 - classification_loss: 0.3075 307/500 [=================>............] - ETA: 48s - loss: 1.7756 - regression_loss: 1.4677 - classification_loss: 0.3078 308/500 [=================>............] - ETA: 48s - loss: 1.7767 - regression_loss: 1.4687 - classification_loss: 0.3080 309/500 [=================>............] - ETA: 47s - loss: 1.7787 - regression_loss: 1.4703 - classification_loss: 0.3083 310/500 [=================>............] - ETA: 47s - loss: 1.7790 - regression_loss: 1.4705 - classification_loss: 0.3085 311/500 [=================>............] - ETA: 47s - loss: 1.7774 - regression_loss: 1.4691 - classification_loss: 0.3083 312/500 [=================>............] - ETA: 47s - loss: 1.7747 - regression_loss: 1.4671 - classification_loss: 0.3076 313/500 [=================>............] 
- ETA: 46s - loss: 1.7765 - regression_loss: 1.4684 - classification_loss: 0.3081 314/500 [=================>............] - ETA: 46s - loss: 1.7755 - regression_loss: 1.4679 - classification_loss: 0.3077 315/500 [=================>............] - ETA: 46s - loss: 1.7754 - regression_loss: 1.4677 - classification_loss: 0.3076 316/500 [=================>............] - ETA: 46s - loss: 1.7781 - regression_loss: 1.4697 - classification_loss: 0.3084 317/500 [==================>...........] - ETA: 45s - loss: 1.7819 - regression_loss: 1.4730 - classification_loss: 0.3089 318/500 [==================>...........] - ETA: 45s - loss: 1.7820 - regression_loss: 1.4730 - classification_loss: 0.3089 319/500 [==================>...........] - ETA: 45s - loss: 1.7826 - regression_loss: 1.4736 - classification_loss: 0.3090 320/500 [==================>...........] - ETA: 45s - loss: 1.7839 - regression_loss: 1.4746 - classification_loss: 0.3093 321/500 [==================>...........] - ETA: 44s - loss: 1.7846 - regression_loss: 1.4752 - classification_loss: 0.3095 322/500 [==================>...........] - ETA: 44s - loss: 1.7854 - regression_loss: 1.4759 - classification_loss: 0.3095 323/500 [==================>...........] - ETA: 44s - loss: 1.7860 - regression_loss: 1.4765 - classification_loss: 0.3095 324/500 [==================>...........] - ETA: 44s - loss: 1.7854 - regression_loss: 1.4761 - classification_loss: 0.3093 325/500 [==================>...........] - ETA: 43s - loss: 1.7854 - regression_loss: 1.4763 - classification_loss: 0.3091 326/500 [==================>...........] - ETA: 43s - loss: 1.7866 - regression_loss: 1.4771 - classification_loss: 0.3095 327/500 [==================>...........] - ETA: 43s - loss: 1.7861 - regression_loss: 1.4765 - classification_loss: 0.3096 328/500 [==================>...........] - ETA: 43s - loss: 1.7865 - regression_loss: 1.4769 - classification_loss: 0.3096 329/500 [==================>...........] 
- ETA: 42s - loss: 1.7869 - regression_loss: 1.4773 - classification_loss: 0.3096 330/500 [==================>...........] - ETA: 42s - loss: 1.7893 - regression_loss: 1.4793 - classification_loss: 0.3100 331/500 [==================>...........] - ETA: 42s - loss: 1.7892 - regression_loss: 1.4793 - classification_loss: 0.3099 332/500 [==================>...........] - ETA: 42s - loss: 1.7883 - regression_loss: 1.4785 - classification_loss: 0.3098 333/500 [==================>...........] - ETA: 41s - loss: 1.7885 - regression_loss: 1.4786 - classification_loss: 0.3100 334/500 [===================>..........] - ETA: 41s - loss: 1.7892 - regression_loss: 1.4788 - classification_loss: 0.3103 335/500 [===================>..........] - ETA: 41s - loss: 1.7902 - regression_loss: 1.4795 - classification_loss: 0.3107 336/500 [===================>..........] - ETA: 41s - loss: 1.7908 - regression_loss: 1.4799 - classification_loss: 0.3109 337/500 [===================>..........] - ETA: 40s - loss: 1.7912 - regression_loss: 1.4804 - classification_loss: 0.3108 338/500 [===================>..........] - ETA: 40s - loss: 1.7902 - regression_loss: 1.4798 - classification_loss: 0.3104 339/500 [===================>..........] - ETA: 40s - loss: 1.7901 - regression_loss: 1.4797 - classification_loss: 0.3104 340/500 [===================>..........] - ETA: 40s - loss: 1.7888 - regression_loss: 1.4786 - classification_loss: 0.3102 341/500 [===================>..........] - ETA: 39s - loss: 1.7884 - regression_loss: 1.4783 - classification_loss: 0.3101 342/500 [===================>..........] - ETA: 39s - loss: 1.7897 - regression_loss: 1.4794 - classification_loss: 0.3102 343/500 [===================>..........] - ETA: 39s - loss: 1.7902 - regression_loss: 1.4796 - classification_loss: 0.3106 344/500 [===================>..........] - ETA: 39s - loss: 1.7896 - regression_loss: 1.4793 - classification_loss: 0.3103 345/500 [===================>..........] 
- ETA: 38s - loss: 1.7908 - regression_loss: 1.4800 - classification_loss: 0.3108 346/500 [===================>..........] - ETA: 38s - loss: 1.7888 - regression_loss: 1.4781 - classification_loss: 0.3107 347/500 [===================>..........] - ETA: 38s - loss: 1.7905 - regression_loss: 1.4795 - classification_loss: 0.3110 348/500 [===================>..........] - ETA: 38s - loss: 1.7900 - regression_loss: 1.4792 - classification_loss: 0.3108 349/500 [===================>..........] - ETA: 37s - loss: 1.7886 - regression_loss: 1.4783 - classification_loss: 0.3104 350/500 [====================>.........] - ETA: 37s - loss: 1.7884 - regression_loss: 1.4781 - classification_loss: 0.3103 351/500 [====================>.........] - ETA: 37s - loss: 1.7894 - regression_loss: 1.4788 - classification_loss: 0.3106 352/500 [====================>.........] - ETA: 37s - loss: 1.7889 - regression_loss: 1.4784 - classification_loss: 0.3105 353/500 [====================>.........] - ETA: 36s - loss: 1.7899 - regression_loss: 1.4793 - classification_loss: 0.3107 354/500 [====================>.........] - ETA: 36s - loss: 1.7906 - regression_loss: 1.4798 - classification_loss: 0.3107 355/500 [====================>.........] - ETA: 36s - loss: 1.7923 - regression_loss: 1.4813 - classification_loss: 0.3110 356/500 [====================>.........] - ETA: 36s - loss: 1.7913 - regression_loss: 1.4806 - classification_loss: 0.3107 357/500 [====================>.........] - ETA: 35s - loss: 1.7938 - regression_loss: 1.4823 - classification_loss: 0.3115 358/500 [====================>.........] - ETA: 35s - loss: 1.7944 - regression_loss: 1.4829 - classification_loss: 0.3115 359/500 [====================>.........] - ETA: 35s - loss: 1.7943 - regression_loss: 1.4830 - classification_loss: 0.3113 360/500 [====================>.........] - ETA: 35s - loss: 1.7925 - regression_loss: 1.4814 - classification_loss: 0.3111 361/500 [====================>.........] 
- ETA: 34s - loss: 1.7912 - regression_loss: 1.4804 - classification_loss: 0.3108 362/500 [====================>.........] - ETA: 34s - loss: 1.7919 - regression_loss: 1.4811 - classification_loss: 0.3108 363/500 [====================>.........] - ETA: 34s - loss: 1.7910 - regression_loss: 1.4804 - classification_loss: 0.3106 364/500 [====================>.........] - ETA: 34s - loss: 1.7917 - regression_loss: 1.4807 - classification_loss: 0.3110 365/500 [====================>.........] - ETA: 33s - loss: 1.7922 - regression_loss: 1.4812 - classification_loss: 0.3110 366/500 [====================>.........] - ETA: 33s - loss: 1.7941 - regression_loss: 1.4827 - classification_loss: 0.3113 367/500 [=====================>........] - ETA: 33s - loss: 1.7937 - regression_loss: 1.4825 - classification_loss: 0.3112 368/500 [=====================>........] - ETA: 33s - loss: 1.7949 - regression_loss: 1.4835 - classification_loss: 0.3114 369/500 [=====================>........] - ETA: 32s - loss: 1.7977 - regression_loss: 1.4861 - classification_loss: 0.3116 370/500 [=====================>........] - ETA: 32s - loss: 1.7983 - regression_loss: 1.4866 - classification_loss: 0.3117 371/500 [=====================>........] - ETA: 32s - loss: 1.7984 - regression_loss: 1.4867 - classification_loss: 0.3117 372/500 [=====================>........] - ETA: 32s - loss: 1.7984 - regression_loss: 1.4869 - classification_loss: 0.3116 373/500 [=====================>........] - ETA: 31s - loss: 1.7980 - regression_loss: 1.4866 - classification_loss: 0.3113 374/500 [=====================>........] - ETA: 31s - loss: 1.7967 - regression_loss: 1.4855 - classification_loss: 0.3112 375/500 [=====================>........] - ETA: 31s - loss: 1.7970 - regression_loss: 1.4858 - classification_loss: 0.3112 376/500 [=====================>........] - ETA: 31s - loss: 1.7983 - regression_loss: 1.4867 - classification_loss: 0.3116 377/500 [=====================>........] 
- ETA: 30s - loss: 1.7988 - regression_loss: 1.4872 - classification_loss: 0.3116 378/500 [=====================>........] - ETA: 30s - loss: 1.7989 - regression_loss: 1.4873 - classification_loss: 0.3117 379/500 [=====================>........] - ETA: 30s - loss: 1.7989 - regression_loss: 1.4872 - classification_loss: 0.3117 380/500 [=====================>........] - ETA: 30s - loss: 1.7987 - regression_loss: 1.4870 - classification_loss: 0.3117 381/500 [=====================>........] - ETA: 29s - loss: 1.7963 - regression_loss: 1.4849 - classification_loss: 0.3114 382/500 [=====================>........] - ETA: 29s - loss: 1.7960 - regression_loss: 1.4843 - classification_loss: 0.3117 383/500 [=====================>........] - ETA: 29s - loss: 1.7967 - regression_loss: 1.4847 - classification_loss: 0.3120 384/500 [======================>.......] - ETA: 29s - loss: 1.7972 - regression_loss: 1.4851 - classification_loss: 0.3120 385/500 [======================>.......] - ETA: 28s - loss: 1.7981 - regression_loss: 1.4853 - classification_loss: 0.3128 386/500 [======================>.......] - ETA: 28s - loss: 1.7972 - regression_loss: 1.4846 - classification_loss: 0.3126 387/500 [======================>.......] - ETA: 28s - loss: 1.7978 - regression_loss: 1.4848 - classification_loss: 0.3130 388/500 [======================>.......] - ETA: 28s - loss: 1.7990 - regression_loss: 1.4855 - classification_loss: 0.3135 389/500 [======================>.......] - ETA: 27s - loss: 1.7996 - regression_loss: 1.4858 - classification_loss: 0.3138 390/500 [======================>.......] - ETA: 27s - loss: 1.7998 - regression_loss: 1.4860 - classification_loss: 0.3138 391/500 [======================>.......] - ETA: 27s - loss: 1.7993 - regression_loss: 1.4854 - classification_loss: 0.3139 392/500 [======================>.......] - ETA: 27s - loss: 1.7966 - regression_loss: 1.4831 - classification_loss: 0.3134 393/500 [======================>.......] 
- ETA: 26s - loss: 1.7959 - regression_loss: 1.4825 - classification_loss: 0.3134 394/500 [======================>.......] - ETA: 26s - loss: 1.7961 - regression_loss: 1.4829 - classification_loss: 0.3132 395/500 [======================>.......] - ETA: 26s - loss: 1.7973 - regression_loss: 1.4837 - classification_loss: 0.3136 396/500 [======================>.......] - ETA: 26s - loss: 1.7992 - regression_loss: 1.4853 - classification_loss: 0.3139 397/500 [======================>.......] - ETA: 25s - loss: 1.7976 - regression_loss: 1.4840 - classification_loss: 0.3137 398/500 [======================>.......] - ETA: 25s - loss: 1.7995 - regression_loss: 1.4856 - classification_loss: 0.3139 399/500 [======================>.......] - ETA: 25s - loss: 1.7971 - regression_loss: 1.4837 - classification_loss: 0.3134 400/500 [=======================>......] - ETA: 25s - loss: 1.7974 - regression_loss: 1.4837 - classification_loss: 0.3136 401/500 [=======================>......] - ETA: 24s - loss: 1.7948 - regression_loss: 1.4816 - classification_loss: 0.3132 402/500 [=======================>......] - ETA: 24s - loss: 1.7964 - regression_loss: 1.4827 - classification_loss: 0.3138 403/500 [=======================>......] - ETA: 24s - loss: 1.7966 - regression_loss: 1.4831 - classification_loss: 0.3136 404/500 [=======================>......] - ETA: 24s - loss: 1.7967 - regression_loss: 1.4832 - classification_loss: 0.3135 405/500 [=======================>......] - ETA: 23s - loss: 1.7976 - regression_loss: 1.4838 - classification_loss: 0.3138 406/500 [=======================>......] - ETA: 23s - loss: 1.7974 - regression_loss: 1.4838 - classification_loss: 0.3136 407/500 [=======================>......] - ETA: 23s - loss: 1.7977 - regression_loss: 1.4837 - classification_loss: 0.3140 408/500 [=======================>......] - ETA: 23s - loss: 1.7963 - regression_loss: 1.4827 - classification_loss: 0.3136 409/500 [=======================>......] 
- ETA: 22s - loss: 1.7971 - regression_loss: 1.4833 - classification_loss: 0.3137 410/500 [=======================>......] - ETA: 22s - loss: 1.7972 - regression_loss: 1.4834 - classification_loss: 0.3139 411/500 [=======================>......] - ETA: 22s - loss: 1.7966 - regression_loss: 1.4828 - classification_loss: 0.3138 412/500 [=======================>......] - ETA: 22s - loss: 1.7959 - regression_loss: 1.4823 - classification_loss: 0.3136 413/500 [=======================>......] - ETA: 21s - loss: 1.7961 - regression_loss: 1.4825 - classification_loss: 0.3136 414/500 [=======================>......] - ETA: 21s - loss: 1.7942 - regression_loss: 1.4811 - classification_loss: 0.3131 415/500 [=======================>......] - ETA: 21s - loss: 1.7929 - regression_loss: 1.4801 - classification_loss: 0.3128 416/500 [=======================>......] - ETA: 21s - loss: 1.7917 - regression_loss: 1.4794 - classification_loss: 0.3123 417/500 [========================>.....] - ETA: 20s - loss: 1.7921 - regression_loss: 1.4798 - classification_loss: 0.3123 418/500 [========================>.....] - ETA: 20s - loss: 1.7928 - regression_loss: 1.4805 - classification_loss: 0.3123 419/500 [========================>.....] - ETA: 20s - loss: 1.7929 - regression_loss: 1.4806 - classification_loss: 0.3123 420/500 [========================>.....] - ETA: 20s - loss: 1.7934 - regression_loss: 1.4810 - classification_loss: 0.3125 421/500 [========================>.....] - ETA: 19s - loss: 1.7919 - regression_loss: 1.4798 - classification_loss: 0.3121 422/500 [========================>.....] - ETA: 19s - loss: 1.7924 - regression_loss: 1.4799 - classification_loss: 0.3126 423/500 [========================>.....] - ETA: 19s - loss: 1.7916 - regression_loss: 1.4792 - classification_loss: 0.3124 424/500 [========================>.....] - ETA: 19s - loss: 1.7918 - regression_loss: 1.4794 - classification_loss: 0.3124 425/500 [========================>.....] 
- ETA: 18s - loss: 1.7913 - regression_loss: 1.4789 - classification_loss: 0.3124 426/500 [========================>.....] - ETA: 18s - loss: 1.7918 - regression_loss: 1.4791 - classification_loss: 0.3127 427/500 [========================>.....] - ETA: 18s - loss: 1.7924 - regression_loss: 1.4796 - classification_loss: 0.3128 428/500 [========================>.....] - ETA: 18s - loss: 1.7928 - regression_loss: 1.4800 - classification_loss: 0.3128 429/500 [========================>.....] - ETA: 17s - loss: 1.7930 - regression_loss: 1.4802 - classification_loss: 0.3128 430/500 [========================>.....] - ETA: 17s - loss: 1.7931 - regression_loss: 1.4804 - classification_loss: 0.3127 431/500 [========================>.....] - ETA: 17s - loss: 1.7936 - regression_loss: 1.4807 - classification_loss: 0.3129 432/500 [========================>.....] - ETA: 17s - loss: 1.7938 - regression_loss: 1.4810 - classification_loss: 0.3129 433/500 [========================>.....] - ETA: 16s - loss: 1.7938 - regression_loss: 1.4809 - classification_loss: 0.3129 434/500 [=========================>....] - ETA: 16s - loss: 1.7937 - regression_loss: 1.4809 - classification_loss: 0.3128 435/500 [=========================>....] - ETA: 16s - loss: 1.7933 - regression_loss: 1.4804 - classification_loss: 0.3130 436/500 [=========================>....] - ETA: 16s - loss: 1.7939 - regression_loss: 1.4809 - classification_loss: 0.3130 437/500 [=========================>....] - ETA: 15s - loss: 1.7930 - regression_loss: 1.4802 - classification_loss: 0.3128 438/500 [=========================>....] - ETA: 15s - loss: 1.7938 - regression_loss: 1.4809 - classification_loss: 0.3129 439/500 [=========================>....] - ETA: 15s - loss: 1.7942 - regression_loss: 1.4814 - classification_loss: 0.3129 440/500 [=========================>....] - ETA: 15s - loss: 1.7961 - regression_loss: 1.4825 - classification_loss: 0.3136 441/500 [=========================>....] 
- ETA: 14s - loss: 1.7944 - regression_loss: 1.4811 - classification_loss: 0.3133 442/500 [=========================>....] - ETA: 14s - loss: 1.7946 - regression_loss: 1.4812 - classification_loss: 0.3134 443/500 [=========================>....] - ETA: 14s - loss: 1.7953 - regression_loss: 1.4817 - classification_loss: 0.3136 444/500 [=========================>....] - ETA: 14s - loss: 1.7964 - regression_loss: 1.4827 - classification_loss: 0.3137 445/500 [=========================>....] - ETA: 13s - loss: 1.7963 - regression_loss: 1.4827 - classification_loss: 0.3137 446/500 [=========================>....] - ETA: 13s - loss: 1.7964 - regression_loss: 1.4827 - classification_loss: 0.3138 447/500 [=========================>....] - ETA: 13s - loss: 1.7960 - regression_loss: 1.4824 - classification_loss: 0.3136 448/500 [=========================>....] - ETA: 13s - loss: 1.7937 - regression_loss: 1.4805 - classification_loss: 0.3133 449/500 [=========================>....] - ETA: 12s - loss: 1.7955 - regression_loss: 1.4817 - classification_loss: 0.3138 450/500 [==========================>...] - ETA: 12s - loss: 1.7961 - regression_loss: 1.4821 - classification_loss: 0.3140 451/500 [==========================>...] - ETA: 12s - loss: 1.7959 - regression_loss: 1.4820 - classification_loss: 0.3140 452/500 [==========================>...] - ETA: 12s - loss: 1.7956 - regression_loss: 1.4816 - classification_loss: 0.3140 453/500 [==========================>...] - ETA: 11s - loss: 1.7980 - regression_loss: 1.4833 - classification_loss: 0.3146 454/500 [==========================>...] - ETA: 11s - loss: 1.7991 - regression_loss: 1.4842 - classification_loss: 0.3149 455/500 [==========================>...] - ETA: 11s - loss: 1.7986 - regression_loss: 1.4838 - classification_loss: 0.3148 456/500 [==========================>...] - ETA: 11s - loss: 1.7982 - regression_loss: 1.4835 - classification_loss: 0.3147 457/500 [==========================>...] 
- ETA: 10s - loss: 1.7968 - regression_loss: 1.4823 - classification_loss: 0.3146 458/500 [==========================>...] - ETA: 10s - loss: 1.7946 - regression_loss: 1.4805 - classification_loss: 0.3141 459/500 [==========================>...] - ETA: 10s - loss: 1.7916 - regression_loss: 1.4781 - classification_loss: 0.3135 460/500 [==========================>...] - ETA: 10s - loss: 1.7929 - regression_loss: 1.4790 - classification_loss: 0.3139 461/500 [==========================>...] - ETA: 9s - loss: 1.7926 - regression_loss: 1.4789 - classification_loss: 0.3137  462/500 [==========================>...] - ETA: 9s - loss: 1.7928 - regression_loss: 1.4791 - classification_loss: 0.3137 463/500 [==========================>...] - ETA: 9s - loss: 1.7926 - regression_loss: 1.4787 - classification_loss: 0.3139 464/500 [==========================>...] - ETA: 9s - loss: 1.7923 - regression_loss: 1.4786 - classification_loss: 0.3138 465/500 [==========================>...] - ETA: 8s - loss: 1.7925 - regression_loss: 1.4788 - classification_loss: 0.3137 466/500 [==========================>...] - ETA: 8s - loss: 1.7925 - regression_loss: 1.4789 - classification_loss: 0.3136 467/500 [===========================>..] - ETA: 8s - loss: 1.7923 - regression_loss: 1.4787 - classification_loss: 0.3135 468/500 [===========================>..] - ETA: 8s - loss: 1.7919 - regression_loss: 1.4785 - classification_loss: 0.3134 469/500 [===========================>..] - ETA: 7s - loss: 1.7914 - regression_loss: 1.4781 - classification_loss: 0.3133 470/500 [===========================>..] - ETA: 7s - loss: 1.7928 - regression_loss: 1.4790 - classification_loss: 0.3139 471/500 [===========================>..] - ETA: 7s - loss: 1.7935 - regression_loss: 1.4795 - classification_loss: 0.3140 472/500 [===========================>..] - ETA: 7s - loss: 1.7920 - regression_loss: 1.4784 - classification_loss: 0.3136 473/500 [===========================>..] 
- ETA: 6s - loss: 1.7912 - regression_loss: 1.4777 - classification_loss: 0.3135
[per-batch progress-bar redraws elided]
500/500 [==============================] - 125s 251ms/step - loss: 1.7872 - regression_loss: 1.4747 - classification_loss: 0.3125
1172 instances of class plum with average precision: 0.6142
mAP: 0.6142
Epoch 00059: saving model to ./training/snapshots/resnet50_pascal_59.h5
Epoch 60/150
[per-batch progress-bar redraws elided]
308/500 [=================>............] 
- ETA: 51s - loss: 1.8386 - regression_loss: 1.5185 - classification_loss: 0.3201 293/500 [================>.............] - ETA: 51s - loss: 1.8392 - regression_loss: 1.5191 - classification_loss: 0.3201 294/500 [================>.............] - ETA: 51s - loss: 1.8383 - regression_loss: 1.5186 - classification_loss: 0.3197 295/500 [================>.............] - ETA: 51s - loss: 1.8381 - regression_loss: 1.5184 - classification_loss: 0.3197 296/500 [================>.............] - ETA: 50s - loss: 1.8385 - regression_loss: 1.5189 - classification_loss: 0.3196 297/500 [================>.............] - ETA: 50s - loss: 1.8390 - regression_loss: 1.5194 - classification_loss: 0.3196 298/500 [================>.............] - ETA: 50s - loss: 1.8409 - regression_loss: 1.5199 - classification_loss: 0.3210 299/500 [================>.............] - ETA: 50s - loss: 1.8419 - regression_loss: 1.5206 - classification_loss: 0.3214 300/500 [=================>............] - ETA: 49s - loss: 1.8416 - regression_loss: 1.5204 - classification_loss: 0.3212 301/500 [=================>............] - ETA: 49s - loss: 1.8416 - regression_loss: 1.5206 - classification_loss: 0.3211 302/500 [=================>............] - ETA: 49s - loss: 1.8401 - regression_loss: 1.5195 - classification_loss: 0.3206 303/500 [=================>............] - ETA: 49s - loss: 1.8406 - regression_loss: 1.5199 - classification_loss: 0.3207 304/500 [=================>............] - ETA: 48s - loss: 1.8428 - regression_loss: 1.5216 - classification_loss: 0.3213 305/500 [=================>............] - ETA: 48s - loss: 1.8418 - regression_loss: 1.5209 - classification_loss: 0.3210 306/500 [=================>............] - ETA: 48s - loss: 1.8419 - regression_loss: 1.5209 - classification_loss: 0.3209 307/500 [=================>............] - ETA: 48s - loss: 1.8412 - regression_loss: 1.5202 - classification_loss: 0.3210 308/500 [=================>............] 
- ETA: 47s - loss: 1.8414 - regression_loss: 1.5205 - classification_loss: 0.3210 309/500 [=================>............] - ETA: 47s - loss: 1.8395 - regression_loss: 1.5188 - classification_loss: 0.3208 310/500 [=================>............] - ETA: 47s - loss: 1.8362 - regression_loss: 1.5160 - classification_loss: 0.3202 311/500 [=================>............] - ETA: 47s - loss: 1.8380 - regression_loss: 1.5177 - classification_loss: 0.3203 312/500 [=================>............] - ETA: 46s - loss: 1.8377 - regression_loss: 1.5174 - classification_loss: 0.3203 313/500 [=================>............] - ETA: 46s - loss: 1.8374 - regression_loss: 1.5171 - classification_loss: 0.3203 314/500 [=================>............] - ETA: 46s - loss: 1.8367 - regression_loss: 1.5165 - classification_loss: 0.3202 315/500 [=================>............] - ETA: 46s - loss: 1.8367 - regression_loss: 1.5166 - classification_loss: 0.3201 316/500 [=================>............] - ETA: 45s - loss: 1.8341 - regression_loss: 1.5145 - classification_loss: 0.3196 317/500 [==================>...........] - ETA: 45s - loss: 1.8336 - regression_loss: 1.5142 - classification_loss: 0.3194 318/500 [==================>...........] - ETA: 45s - loss: 1.8335 - regression_loss: 1.5135 - classification_loss: 0.3200 319/500 [==================>...........] - ETA: 45s - loss: 1.8325 - regression_loss: 1.5123 - classification_loss: 0.3202 320/500 [==================>...........] - ETA: 44s - loss: 1.8330 - regression_loss: 1.5128 - classification_loss: 0.3201 321/500 [==================>...........] - ETA: 44s - loss: 1.8349 - regression_loss: 1.5140 - classification_loss: 0.3209 322/500 [==================>...........] - ETA: 44s - loss: 1.8359 - regression_loss: 1.5150 - classification_loss: 0.3209 323/500 [==================>...........] - ETA: 44s - loss: 1.8349 - regression_loss: 1.5142 - classification_loss: 0.3207 324/500 [==================>...........] 
- ETA: 43s - loss: 1.8349 - regression_loss: 1.5142 - classification_loss: 0.3207 325/500 [==================>...........] - ETA: 43s - loss: 1.8337 - regression_loss: 1.5132 - classification_loss: 0.3205 326/500 [==================>...........] - ETA: 43s - loss: 1.8346 - regression_loss: 1.5139 - classification_loss: 0.3206 327/500 [==================>...........] - ETA: 43s - loss: 1.8349 - regression_loss: 1.5144 - classification_loss: 0.3205 328/500 [==================>...........] - ETA: 42s - loss: 1.8328 - regression_loss: 1.5127 - classification_loss: 0.3201 329/500 [==================>...........] - ETA: 42s - loss: 1.8335 - regression_loss: 1.5133 - classification_loss: 0.3202 330/500 [==================>...........] - ETA: 42s - loss: 1.8337 - regression_loss: 1.5136 - classification_loss: 0.3202 331/500 [==================>...........] - ETA: 42s - loss: 1.8337 - regression_loss: 1.5136 - classification_loss: 0.3201 332/500 [==================>...........] - ETA: 41s - loss: 1.8345 - regression_loss: 1.5140 - classification_loss: 0.3205 333/500 [==================>...........] - ETA: 41s - loss: 1.8335 - regression_loss: 1.5132 - classification_loss: 0.3203 334/500 [===================>..........] - ETA: 41s - loss: 1.8323 - regression_loss: 1.5123 - classification_loss: 0.3200 335/500 [===================>..........] - ETA: 41s - loss: 1.8334 - regression_loss: 1.5131 - classification_loss: 0.3203 336/500 [===================>..........] - ETA: 40s - loss: 1.8336 - regression_loss: 1.5132 - classification_loss: 0.3204 337/500 [===================>..........] - ETA: 40s - loss: 1.8346 - regression_loss: 1.5138 - classification_loss: 0.3208 338/500 [===================>..........] - ETA: 40s - loss: 1.8378 - regression_loss: 1.5163 - classification_loss: 0.3215 339/500 [===================>..........] - ETA: 40s - loss: 1.8389 - regression_loss: 1.5171 - classification_loss: 0.3218 340/500 [===================>..........] 
- ETA: 39s - loss: 1.8387 - regression_loss: 1.5169 - classification_loss: 0.3219 341/500 [===================>..........] - ETA: 39s - loss: 1.8367 - regression_loss: 1.5151 - classification_loss: 0.3216 342/500 [===================>..........] - ETA: 39s - loss: 1.8378 - regression_loss: 1.5160 - classification_loss: 0.3218 343/500 [===================>..........] - ETA: 39s - loss: 1.8368 - regression_loss: 1.5152 - classification_loss: 0.3216 344/500 [===================>..........] - ETA: 38s - loss: 1.8355 - regression_loss: 1.5143 - classification_loss: 0.3211 345/500 [===================>..........] - ETA: 38s - loss: 1.8364 - regression_loss: 1.5151 - classification_loss: 0.3213 346/500 [===================>..........] - ETA: 38s - loss: 1.8368 - regression_loss: 1.5154 - classification_loss: 0.3213 347/500 [===================>..........] - ETA: 38s - loss: 1.8338 - regression_loss: 1.5128 - classification_loss: 0.3210 348/500 [===================>..........] - ETA: 37s - loss: 1.8357 - regression_loss: 1.5143 - classification_loss: 0.3214 349/500 [===================>..........] - ETA: 37s - loss: 1.8361 - regression_loss: 1.5146 - classification_loss: 0.3215 350/500 [====================>.........] - ETA: 37s - loss: 1.8359 - regression_loss: 1.5146 - classification_loss: 0.3214 351/500 [====================>.........] - ETA: 37s - loss: 1.8370 - regression_loss: 1.5154 - classification_loss: 0.3216 352/500 [====================>.........] - ETA: 36s - loss: 1.8375 - regression_loss: 1.5158 - classification_loss: 0.3217 353/500 [====================>.........] - ETA: 36s - loss: 1.8377 - regression_loss: 1.5162 - classification_loss: 0.3216 354/500 [====================>.........] - ETA: 36s - loss: 1.8386 - regression_loss: 1.5171 - classification_loss: 0.3216 355/500 [====================>.........] - ETA: 36s - loss: 1.8398 - regression_loss: 1.5181 - classification_loss: 0.3217 356/500 [====================>.........] 
- ETA: 35s - loss: 1.8393 - regression_loss: 1.5178 - classification_loss: 0.3215 357/500 [====================>.........] - ETA: 35s - loss: 1.8409 - regression_loss: 1.5186 - classification_loss: 0.3222 358/500 [====================>.........] - ETA: 35s - loss: 1.8444 - regression_loss: 1.5212 - classification_loss: 0.3231 359/500 [====================>.........] - ETA: 35s - loss: 1.8442 - regression_loss: 1.5211 - classification_loss: 0.3231 360/500 [====================>.........] - ETA: 34s - loss: 1.8447 - regression_loss: 1.5215 - classification_loss: 0.3232 361/500 [====================>.........] - ETA: 34s - loss: 1.8440 - regression_loss: 1.5209 - classification_loss: 0.3231 362/500 [====================>.........] - ETA: 34s - loss: 1.8447 - regression_loss: 1.5213 - classification_loss: 0.3233 363/500 [====================>.........] - ETA: 34s - loss: 1.8461 - regression_loss: 1.5224 - classification_loss: 0.3237 364/500 [====================>.........] - ETA: 33s - loss: 1.8439 - regression_loss: 1.5206 - classification_loss: 0.3233 365/500 [====================>.........] - ETA: 33s - loss: 1.8420 - regression_loss: 1.5191 - classification_loss: 0.3229 366/500 [====================>.........] - ETA: 33s - loss: 1.8427 - regression_loss: 1.5196 - classification_loss: 0.3231 367/500 [=====================>........] - ETA: 33s - loss: 1.8391 - regression_loss: 1.5168 - classification_loss: 0.3223 368/500 [=====================>........] - ETA: 32s - loss: 1.8395 - regression_loss: 1.5171 - classification_loss: 0.3224 369/500 [=====================>........] - ETA: 32s - loss: 1.8387 - regression_loss: 1.5165 - classification_loss: 0.3222 370/500 [=====================>........] - ETA: 32s - loss: 1.8385 - regression_loss: 1.5163 - classification_loss: 0.3222 371/500 [=====================>........] - ETA: 32s - loss: 1.8371 - regression_loss: 1.5153 - classification_loss: 0.3217 372/500 [=====================>........] 
- ETA: 31s - loss: 1.8366 - regression_loss: 1.5148 - classification_loss: 0.3218 373/500 [=====================>........] - ETA: 31s - loss: 1.8382 - regression_loss: 1.5161 - classification_loss: 0.3221 374/500 [=====================>........] - ETA: 31s - loss: 1.8384 - regression_loss: 1.5164 - classification_loss: 0.3221 375/500 [=====================>........] - ETA: 31s - loss: 1.8363 - regression_loss: 1.5144 - classification_loss: 0.3219 376/500 [=====================>........] - ETA: 30s - loss: 1.8384 - regression_loss: 1.5158 - classification_loss: 0.3226 377/500 [=====================>........] - ETA: 30s - loss: 1.8392 - regression_loss: 1.5164 - classification_loss: 0.3228 378/500 [=====================>........] - ETA: 30s - loss: 1.8382 - regression_loss: 1.5157 - classification_loss: 0.3225 379/500 [=====================>........] - ETA: 30s - loss: 1.8375 - regression_loss: 1.5142 - classification_loss: 0.3234 380/500 [=====================>........] - ETA: 29s - loss: 1.8393 - regression_loss: 1.5156 - classification_loss: 0.3237 381/500 [=====================>........] - ETA: 29s - loss: 1.8393 - regression_loss: 1.5157 - classification_loss: 0.3236 382/500 [=====================>........] - ETA: 29s - loss: 1.8401 - regression_loss: 1.5166 - classification_loss: 0.3235 383/500 [=====================>........] - ETA: 29s - loss: 1.8402 - regression_loss: 1.5168 - classification_loss: 0.3234 384/500 [======================>.......] - ETA: 29s - loss: 1.8397 - regression_loss: 1.5165 - classification_loss: 0.3232 385/500 [======================>.......] - ETA: 28s - loss: 1.8404 - regression_loss: 1.5173 - classification_loss: 0.3231 386/500 [======================>.......] - ETA: 28s - loss: 1.8413 - regression_loss: 1.5181 - classification_loss: 0.3232 387/500 [======================>.......] - ETA: 28s - loss: 1.8390 - regression_loss: 1.5165 - classification_loss: 0.3225 388/500 [======================>.......] 
- ETA: 28s - loss: 1.8384 - regression_loss: 1.5162 - classification_loss: 0.3222 389/500 [======================>.......] - ETA: 27s - loss: 1.8380 - regression_loss: 1.5159 - classification_loss: 0.3221 390/500 [======================>.......] - ETA: 27s - loss: 1.8367 - regression_loss: 1.5148 - classification_loss: 0.3219 391/500 [======================>.......] - ETA: 27s - loss: 1.8374 - regression_loss: 1.5154 - classification_loss: 0.3219 392/500 [======================>.......] - ETA: 27s - loss: 1.8374 - regression_loss: 1.5156 - classification_loss: 0.3219 393/500 [======================>.......] - ETA: 26s - loss: 1.8362 - regression_loss: 1.5146 - classification_loss: 0.3217 394/500 [======================>.......] - ETA: 26s - loss: 1.8375 - regression_loss: 1.5156 - classification_loss: 0.3219 395/500 [======================>.......] - ETA: 26s - loss: 1.8383 - regression_loss: 1.5164 - classification_loss: 0.3219 396/500 [======================>.......] - ETA: 26s - loss: 1.8363 - regression_loss: 1.5148 - classification_loss: 0.3215 397/500 [======================>.......] - ETA: 25s - loss: 1.8359 - regression_loss: 1.5145 - classification_loss: 0.3214 398/500 [======================>.......] - ETA: 25s - loss: 1.8354 - regression_loss: 1.5142 - classification_loss: 0.3212 399/500 [======================>.......] - ETA: 25s - loss: 1.8356 - regression_loss: 1.5144 - classification_loss: 0.3213 400/500 [=======================>......] - ETA: 25s - loss: 1.8344 - regression_loss: 1.5135 - classification_loss: 0.3209 401/500 [=======================>......] - ETA: 24s - loss: 1.8335 - regression_loss: 1.5128 - classification_loss: 0.3206 402/500 [=======================>......] - ETA: 24s - loss: 1.8342 - regression_loss: 1.5135 - classification_loss: 0.3207 403/500 [=======================>......] - ETA: 24s - loss: 1.8343 - regression_loss: 1.5137 - classification_loss: 0.3206 404/500 [=======================>......] 
- ETA: 24s - loss: 1.8352 - regression_loss: 1.5145 - classification_loss: 0.3207 405/500 [=======================>......] - ETA: 23s - loss: 1.8344 - regression_loss: 1.5140 - classification_loss: 0.3204 406/500 [=======================>......] - ETA: 23s - loss: 1.8327 - regression_loss: 1.5128 - classification_loss: 0.3199 407/500 [=======================>......] - ETA: 23s - loss: 1.8318 - regression_loss: 1.5122 - classification_loss: 0.3197 408/500 [=======================>......] - ETA: 23s - loss: 1.8323 - regression_loss: 1.5125 - classification_loss: 0.3198 409/500 [=======================>......] - ETA: 22s - loss: 1.8303 - regression_loss: 1.5108 - classification_loss: 0.3195 410/500 [=======================>......] - ETA: 22s - loss: 1.8302 - regression_loss: 1.5106 - classification_loss: 0.3196 411/500 [=======================>......] - ETA: 22s - loss: 1.8314 - regression_loss: 1.5116 - classification_loss: 0.3199 412/500 [=======================>......] - ETA: 22s - loss: 1.8310 - regression_loss: 1.5111 - classification_loss: 0.3200 413/500 [=======================>......] - ETA: 21s - loss: 1.8313 - regression_loss: 1.5114 - classification_loss: 0.3199 414/500 [=======================>......] - ETA: 21s - loss: 1.8306 - regression_loss: 1.5108 - classification_loss: 0.3198 415/500 [=======================>......] - ETA: 21s - loss: 1.8302 - regression_loss: 1.5106 - classification_loss: 0.3196 416/500 [=======================>......] - ETA: 21s - loss: 1.8304 - regression_loss: 1.5107 - classification_loss: 0.3197 417/500 [========================>.....] - ETA: 20s - loss: 1.8298 - regression_loss: 1.5103 - classification_loss: 0.3195 418/500 [========================>.....] - ETA: 20s - loss: 1.8285 - regression_loss: 1.5090 - classification_loss: 0.3194 419/500 [========================>.....] - ETA: 20s - loss: 1.8285 - regression_loss: 1.5091 - classification_loss: 0.3194 420/500 [========================>.....] 
- ETA: 19s - loss: 1.8277 - regression_loss: 1.5085 - classification_loss: 0.3192 421/500 [========================>.....] - ETA: 19s - loss: 1.8279 - regression_loss: 1.5087 - classification_loss: 0.3192 422/500 [========================>.....] - ETA: 19s - loss: 1.8281 - regression_loss: 1.5090 - classification_loss: 0.3191 423/500 [========================>.....] - ETA: 19s - loss: 1.8279 - regression_loss: 1.5090 - classification_loss: 0.3189 424/500 [========================>.....] - ETA: 18s - loss: 1.8278 - regression_loss: 1.5090 - classification_loss: 0.3188 425/500 [========================>.....] - ETA: 18s - loss: 1.8258 - regression_loss: 1.5072 - classification_loss: 0.3186 426/500 [========================>.....] - ETA: 18s - loss: 1.8254 - regression_loss: 1.5070 - classification_loss: 0.3184 427/500 [========================>.....] - ETA: 18s - loss: 1.8251 - regression_loss: 1.5068 - classification_loss: 0.3183 428/500 [========================>.....] - ETA: 17s - loss: 1.8249 - regression_loss: 1.5067 - classification_loss: 0.3182 429/500 [========================>.....] - ETA: 17s - loss: 1.8247 - regression_loss: 1.5061 - classification_loss: 0.3186 430/500 [========================>.....] - ETA: 17s - loss: 1.8260 - regression_loss: 1.5069 - classification_loss: 0.3191 431/500 [========================>.....] - ETA: 17s - loss: 1.8245 - regression_loss: 1.5058 - classification_loss: 0.3187 432/500 [========================>.....] - ETA: 16s - loss: 1.8244 - regression_loss: 1.5057 - classification_loss: 0.3187 433/500 [========================>.....] - ETA: 16s - loss: 1.8237 - regression_loss: 1.5054 - classification_loss: 0.3183 434/500 [=========================>....] - ETA: 16s - loss: 1.8245 - regression_loss: 1.5061 - classification_loss: 0.3184 435/500 [=========================>....] - ETA: 16s - loss: 1.8223 - regression_loss: 1.5043 - classification_loss: 0.3180 436/500 [=========================>....] 
- ETA: 15s - loss: 1.8227 - regression_loss: 1.5046 - classification_loss: 0.3182 437/500 [=========================>....] - ETA: 15s - loss: 1.8230 - regression_loss: 1.5048 - classification_loss: 0.3182 438/500 [=========================>....] - ETA: 15s - loss: 1.8238 - regression_loss: 1.5054 - classification_loss: 0.3184 439/500 [=========================>....] - ETA: 15s - loss: 1.8253 - regression_loss: 1.5067 - classification_loss: 0.3186 440/500 [=========================>....] - ETA: 14s - loss: 1.8262 - regression_loss: 1.5076 - classification_loss: 0.3186 441/500 [=========================>....] - ETA: 14s - loss: 1.8255 - regression_loss: 1.5069 - classification_loss: 0.3186 442/500 [=========================>....] - ETA: 14s - loss: 1.8255 - regression_loss: 1.5070 - classification_loss: 0.3185 443/500 [=========================>....] - ETA: 14s - loss: 1.8257 - regression_loss: 1.5071 - classification_loss: 0.3186 444/500 [=========================>....] - ETA: 13s - loss: 1.8262 - regression_loss: 1.5077 - classification_loss: 0.3185 445/500 [=========================>....] - ETA: 13s - loss: 1.8264 - regression_loss: 1.5080 - classification_loss: 0.3184 446/500 [=========================>....] - ETA: 13s - loss: 1.8274 - regression_loss: 1.5086 - classification_loss: 0.3188 447/500 [=========================>....] - ETA: 13s - loss: 1.8292 - regression_loss: 1.5101 - classification_loss: 0.3191 448/500 [=========================>....] - ETA: 12s - loss: 1.8297 - regression_loss: 1.5106 - classification_loss: 0.3192 449/500 [=========================>....] - ETA: 12s - loss: 1.8282 - regression_loss: 1.5093 - classification_loss: 0.3189 450/500 [==========================>...] - ETA: 12s - loss: 1.8281 - regression_loss: 1.5090 - classification_loss: 0.3191 451/500 [==========================>...] - ETA: 12s - loss: 1.8296 - regression_loss: 1.5103 - classification_loss: 0.3193 452/500 [==========================>...] 
- ETA: 11s - loss: 1.8293 - regression_loss: 1.5101 - classification_loss: 0.3193 453/500 [==========================>...] - ETA: 11s - loss: 1.8300 - regression_loss: 1.5107 - classification_loss: 0.3193 454/500 [==========================>...] - ETA: 11s - loss: 1.8303 - regression_loss: 1.5110 - classification_loss: 0.3193 455/500 [==========================>...] - ETA: 11s - loss: 1.8302 - regression_loss: 1.5110 - classification_loss: 0.3192 456/500 [==========================>...] - ETA: 10s - loss: 1.8300 - regression_loss: 1.5108 - classification_loss: 0.3191 457/500 [==========================>...] - ETA: 10s - loss: 1.8299 - regression_loss: 1.5108 - classification_loss: 0.3190 458/500 [==========================>...] - ETA: 10s - loss: 1.8305 - regression_loss: 1.5111 - classification_loss: 0.3193 459/500 [==========================>...] - ETA: 10s - loss: 1.8305 - regression_loss: 1.5112 - classification_loss: 0.3194 460/500 [==========================>...] - ETA: 9s - loss: 1.8284 - regression_loss: 1.5093 - classification_loss: 0.3191  461/500 [==========================>...] - ETA: 9s - loss: 1.8308 - regression_loss: 1.5114 - classification_loss: 0.3194 462/500 [==========================>...] - ETA: 9s - loss: 1.8302 - regression_loss: 1.5107 - classification_loss: 0.3195 463/500 [==========================>...] - ETA: 9s - loss: 1.8291 - regression_loss: 1.5088 - classification_loss: 0.3202 464/500 [==========================>...] - ETA: 8s - loss: 1.8297 - regression_loss: 1.5091 - classification_loss: 0.3206 465/500 [==========================>...] - ETA: 8s - loss: 1.8304 - regression_loss: 1.5097 - classification_loss: 0.3206 466/500 [==========================>...] - ETA: 8s - loss: 1.8304 - regression_loss: 1.5098 - classification_loss: 0.3206 467/500 [===========================>..] - ETA: 8s - loss: 1.8286 - regression_loss: 1.5083 - classification_loss: 0.3202 468/500 [===========================>..] 
- ETA: 7s - loss: 1.8289 - regression_loss: 1.5089 - classification_loss: 0.3200 469/500 [===========================>..] - ETA: 7s - loss: 1.8289 - regression_loss: 1.5089 - classification_loss: 0.3200 470/500 [===========================>..] - ETA: 7s - loss: 1.8277 - regression_loss: 1.5079 - classification_loss: 0.3199 471/500 [===========================>..] - ETA: 7s - loss: 1.8284 - regression_loss: 1.5084 - classification_loss: 0.3200 472/500 [===========================>..] - ETA: 6s - loss: 1.8282 - regression_loss: 1.5083 - classification_loss: 0.3198 473/500 [===========================>..] - ETA: 6s - loss: 1.8290 - regression_loss: 1.5092 - classification_loss: 0.3197 474/500 [===========================>..] - ETA: 6s - loss: 1.8285 - regression_loss: 1.5090 - classification_loss: 0.3196 475/500 [===========================>..] - ETA: 6s - loss: 1.8286 - regression_loss: 1.5091 - classification_loss: 0.3195 476/500 [===========================>..] - ETA: 5s - loss: 1.8284 - regression_loss: 1.5090 - classification_loss: 0.3194 477/500 [===========================>..] - ETA: 5s - loss: 1.8287 - regression_loss: 1.5092 - classification_loss: 0.3195 478/500 [===========================>..] - ETA: 5s - loss: 1.8271 - regression_loss: 1.5079 - classification_loss: 0.3192 479/500 [===========================>..] - ETA: 5s - loss: 1.8268 - regression_loss: 1.5077 - classification_loss: 0.3191 480/500 [===========================>..] - ETA: 4s - loss: 1.8268 - regression_loss: 1.5077 - classification_loss: 0.3191 481/500 [===========================>..] - ETA: 4s - loss: 1.8263 - regression_loss: 1.5074 - classification_loss: 0.3189 482/500 [===========================>..] - ETA: 4s - loss: 1.8263 - regression_loss: 1.5075 - classification_loss: 0.3188 483/500 [===========================>..] - ETA: 4s - loss: 1.8257 - regression_loss: 1.5070 - classification_loss: 0.3187 484/500 [============================>.] 
- ETA: 3s - loss: 1.8237 - regression_loss: 1.5055 - classification_loss: 0.3183 485/500 [============================>.] - ETA: 3s - loss: 1.8237 - regression_loss: 1.5056 - classification_loss: 0.3181 486/500 [============================>.] - ETA: 3s - loss: 1.8236 - regression_loss: 1.5056 - classification_loss: 0.3180 487/500 [============================>.] - ETA: 3s - loss: 1.8230 - regression_loss: 1.5051 - classification_loss: 0.3178 488/500 [============================>.] - ETA: 2s - loss: 1.8227 - regression_loss: 1.5048 - classification_loss: 0.3179 489/500 [============================>.] - ETA: 2s - loss: 1.8225 - regression_loss: 1.5046 - classification_loss: 0.3179 490/500 [============================>.] - ETA: 2s - loss: 1.8228 - regression_loss: 1.5049 - classification_loss: 0.3179 491/500 [============================>.] - ETA: 2s - loss: 1.8229 - regression_loss: 1.5050 - classification_loss: 0.3178 492/500 [============================>.] - ETA: 1s - loss: 1.8230 - regression_loss: 1.5052 - classification_loss: 0.3178 493/500 [============================>.] - ETA: 1s - loss: 1.8221 - regression_loss: 1.5046 - classification_loss: 0.3175 494/500 [============================>.] - ETA: 1s - loss: 1.8220 - regression_loss: 1.5046 - classification_loss: 0.3174 495/500 [============================>.] - ETA: 1s - loss: 1.8220 - regression_loss: 1.5046 - classification_loss: 0.3174 496/500 [============================>.] - ETA: 0s - loss: 1.8215 - regression_loss: 1.5043 - classification_loss: 0.3172 497/500 [============================>.] - ETA: 0s - loss: 1.8212 - regression_loss: 1.5037 - classification_loss: 0.3175 498/500 [============================>.] - ETA: 0s - loss: 1.8195 - regression_loss: 1.5023 - classification_loss: 0.3172 499/500 [============================>.] 
- ETA: 0s - loss: 1.8206 - regression_loss: 1.5030 - classification_loss: 0.3176 500/500 [==============================] - 125s 250ms/step - loss: 1.8195 - regression_loss: 1.5022 - classification_loss: 0.3173 1172 instances of class plum with average precision: 0.6187 mAP: 0.6187 Epoch 00060: saving model to ./training/snapshots/resnet50_pascal_60.h5 Epoch 61/150 1/500 [..............................] - ETA: 1:58 - loss: 1.7943 - regression_loss: 1.5511 - classification_loss: 0.2431 2/500 [..............................] - ETA: 2:00 - loss: 1.9553 - regression_loss: 1.6533 - classification_loss: 0.3020 3/500 [..............................] - ETA: 2:02 - loss: 1.7535 - regression_loss: 1.4962 - classification_loss: 0.2572 4/500 [..............................] - ETA: 2:01 - loss: 1.8630 - regression_loss: 1.5201 - classification_loss: 0.3429 5/500 [..............................] - ETA: 2:02 - loss: 1.8186 - regression_loss: 1.4922 - classification_loss: 0.3264 6/500 [..............................] - ETA: 2:01 - loss: 1.7585 - regression_loss: 1.4596 - classification_loss: 0.2989 7/500 [..............................] - ETA: 2:01 - loss: 1.8083 - regression_loss: 1.4947 - classification_loss: 0.3136 8/500 [..............................] - ETA: 2:02 - loss: 1.7048 - regression_loss: 1.4120 - classification_loss: 0.2928 9/500 [..............................] - ETA: 2:02 - loss: 1.7830 - regression_loss: 1.4813 - classification_loss: 0.3016 10/500 [..............................] - ETA: 2:01 - loss: 1.8594 - regression_loss: 1.5303 - classification_loss: 0.3290 11/500 [..............................] - ETA: 2:01 - loss: 1.8239 - regression_loss: 1.5060 - classification_loss: 0.3179 12/500 [..............................] - ETA: 2:01 - loss: 1.8249 - regression_loss: 1.5024 - classification_loss: 0.3225 13/500 [..............................] - ETA: 2:01 - loss: 1.8313 - regression_loss: 1.5114 - classification_loss: 0.3199 14/500 [..............................] 
- ETA: 2:01 - loss: 1.8482 - regression_loss: 1.5322 - classification_loss: 0.3160
 15/500 [..............................] - ETA: 2:01 - loss: 1.8519 - regression_loss: 1.5377 - classification_loss: 0.3142
 50/500 [==>...........................] - ETA: 1:53 - loss: 1.8818 - regression_loss: 1.5592 - classification_loss: 0.3226
100/500 [=====>........................] - ETA: 1:39 - loss: 1.7975 - regression_loss: 1.4850 - classification_loss: 0.3125
150/500 [========>.....................] - ETA: 1:27 - loss: 1.7624 - regression_loss: 1.4572 - classification_loss: 0.3052
200/500 [===========>..................] - ETA: 1:15 - loss: 1.7770 - regression_loss: 1.4668 - classification_loss: 0.3101
250/500 [==============>...............] - ETA: 1:02 - loss: 1.7692 - regression_loss: 1.4636 - classification_loss: 0.3057
300/500 [=================>............] - ETA: 50s - loss: 1.7876 - regression_loss: 1.4788 - classification_loss: 0.3088
349/500 [===================>..........] - ETA: 37s - loss: 1.7838 - regression_loss: 1.4726 - classification_loss: 0.3111
350/500 [====================>.........]
- ETA: 37s - loss: 1.7835 - regression_loss: 1.4724 - classification_loss: 0.3110 351/500 [====================>.........] - ETA: 37s - loss: 1.7850 - regression_loss: 1.4740 - classification_loss: 0.3110 352/500 [====================>.........] - ETA: 37s - loss: 1.7858 - regression_loss: 1.4749 - classification_loss: 0.3109 353/500 [====================>.........] - ETA: 36s - loss: 1.7855 - regression_loss: 1.4747 - classification_loss: 0.3107 354/500 [====================>.........] - ETA: 36s - loss: 1.7876 - regression_loss: 1.4763 - classification_loss: 0.3113 355/500 [====================>.........] - ETA: 36s - loss: 1.7846 - regression_loss: 1.4738 - classification_loss: 0.3108 356/500 [====================>.........] - ETA: 36s - loss: 1.7849 - regression_loss: 1.4740 - classification_loss: 0.3108 357/500 [====================>.........] - ETA: 35s - loss: 1.7851 - regression_loss: 1.4742 - classification_loss: 0.3109 358/500 [====================>.........] - ETA: 35s - loss: 1.7863 - regression_loss: 1.4750 - classification_loss: 0.3112 359/500 [====================>.........] - ETA: 35s - loss: 1.7875 - regression_loss: 1.4759 - classification_loss: 0.3116 360/500 [====================>.........] - ETA: 35s - loss: 1.7874 - regression_loss: 1.4757 - classification_loss: 0.3116 361/500 [====================>.........] - ETA: 34s - loss: 1.7873 - regression_loss: 1.4756 - classification_loss: 0.3117 362/500 [====================>.........] - ETA: 34s - loss: 1.7857 - regression_loss: 1.4746 - classification_loss: 0.3111 363/500 [====================>.........] - ETA: 34s - loss: 1.7864 - regression_loss: 1.4752 - classification_loss: 0.3112 364/500 [====================>.........] - ETA: 34s - loss: 1.7867 - regression_loss: 1.4755 - classification_loss: 0.3112 365/500 [====================>.........] - ETA: 33s - loss: 1.7870 - regression_loss: 1.4758 - classification_loss: 0.3111 366/500 [====================>.........] 
- ETA: 33s - loss: 1.7847 - regression_loss: 1.4738 - classification_loss: 0.3109 367/500 [=====================>........] - ETA: 33s - loss: 1.7835 - regression_loss: 1.4730 - classification_loss: 0.3106 368/500 [=====================>........] - ETA: 33s - loss: 1.7828 - regression_loss: 1.4724 - classification_loss: 0.3104 369/500 [=====================>........] - ETA: 32s - loss: 1.7829 - regression_loss: 1.4725 - classification_loss: 0.3104 370/500 [=====================>........] - ETA: 32s - loss: 1.7835 - regression_loss: 1.4732 - classification_loss: 0.3103 371/500 [=====================>........] - ETA: 32s - loss: 1.7838 - regression_loss: 1.4736 - classification_loss: 0.3103 372/500 [=====================>........] - ETA: 32s - loss: 1.7839 - regression_loss: 1.4737 - classification_loss: 0.3102 373/500 [=====================>........] - ETA: 31s - loss: 1.7843 - regression_loss: 1.4742 - classification_loss: 0.3101 374/500 [=====================>........] - ETA: 31s - loss: 1.7851 - regression_loss: 1.4749 - classification_loss: 0.3102 375/500 [=====================>........] - ETA: 31s - loss: 1.7860 - regression_loss: 1.4755 - classification_loss: 0.3105 376/500 [=====================>........] - ETA: 31s - loss: 1.7860 - regression_loss: 1.4756 - classification_loss: 0.3104 377/500 [=====================>........] - ETA: 30s - loss: 1.7872 - regression_loss: 1.4764 - classification_loss: 0.3108 378/500 [=====================>........] - ETA: 30s - loss: 1.7871 - regression_loss: 1.4764 - classification_loss: 0.3107 379/500 [=====================>........] - ETA: 30s - loss: 1.7862 - regression_loss: 1.4758 - classification_loss: 0.3104 380/500 [=====================>........] - ETA: 30s - loss: 1.7866 - regression_loss: 1.4762 - classification_loss: 0.3104 381/500 [=====================>........] - ETA: 29s - loss: 1.7847 - regression_loss: 1.4746 - classification_loss: 0.3101 382/500 [=====================>........] 
- ETA: 29s - loss: 1.7856 - regression_loss: 1.4754 - classification_loss: 0.3102 383/500 [=====================>........] - ETA: 29s - loss: 1.7872 - regression_loss: 1.4766 - classification_loss: 0.3107 384/500 [======================>.......] - ETA: 29s - loss: 1.7884 - regression_loss: 1.4775 - classification_loss: 0.3109 385/500 [======================>.......] - ETA: 28s - loss: 1.7889 - regression_loss: 1.4783 - classification_loss: 0.3106 386/500 [======================>.......] - ETA: 28s - loss: 1.7890 - regression_loss: 1.4783 - classification_loss: 0.3107 387/500 [======================>.......] - ETA: 28s - loss: 1.7876 - regression_loss: 1.4769 - classification_loss: 0.3107 388/500 [======================>.......] - ETA: 28s - loss: 1.7873 - regression_loss: 1.4769 - classification_loss: 0.3104 389/500 [======================>.......] - ETA: 27s - loss: 1.7879 - regression_loss: 1.4773 - classification_loss: 0.3107 390/500 [======================>.......] - ETA: 27s - loss: 1.7879 - regression_loss: 1.4768 - classification_loss: 0.3111 391/500 [======================>.......] - ETA: 27s - loss: 1.7900 - regression_loss: 1.4789 - classification_loss: 0.3112 392/500 [======================>.......] - ETA: 27s - loss: 1.7907 - regression_loss: 1.4795 - classification_loss: 0.3112 393/500 [======================>.......] - ETA: 26s - loss: 1.7906 - regression_loss: 1.4792 - classification_loss: 0.3115 394/500 [======================>.......] - ETA: 26s - loss: 1.7897 - regression_loss: 1.4789 - classification_loss: 0.3108 395/500 [======================>.......] - ETA: 26s - loss: 1.7914 - regression_loss: 1.4802 - classification_loss: 0.3112 396/500 [======================>.......] - ETA: 26s - loss: 1.7904 - regression_loss: 1.4794 - classification_loss: 0.3109 397/500 [======================>.......] - ETA: 25s - loss: 1.7873 - regression_loss: 1.4770 - classification_loss: 0.3103 398/500 [======================>.......] 
- ETA: 25s - loss: 1.7865 - regression_loss: 1.4763 - classification_loss: 0.3103 399/500 [======================>.......] - ETA: 25s - loss: 1.7867 - regression_loss: 1.4763 - classification_loss: 0.3104 400/500 [=======================>......] - ETA: 25s - loss: 1.7886 - regression_loss: 1.4778 - classification_loss: 0.3108 401/500 [=======================>......] - ETA: 24s - loss: 1.7892 - regression_loss: 1.4781 - classification_loss: 0.3111 402/500 [=======================>......] - ETA: 24s - loss: 1.7891 - regression_loss: 1.4780 - classification_loss: 0.3111 403/500 [=======================>......] - ETA: 24s - loss: 1.7889 - regression_loss: 1.4778 - classification_loss: 0.3111 404/500 [=======================>......] - ETA: 24s - loss: 1.7895 - regression_loss: 1.4783 - classification_loss: 0.3112 405/500 [=======================>......] - ETA: 23s - loss: 1.7885 - regression_loss: 1.4777 - classification_loss: 0.3109 406/500 [=======================>......] - ETA: 23s - loss: 1.7892 - regression_loss: 1.4783 - classification_loss: 0.3108 407/500 [=======================>......] - ETA: 23s - loss: 1.7884 - regression_loss: 1.4778 - classification_loss: 0.3106 408/500 [=======================>......] - ETA: 23s - loss: 1.7898 - regression_loss: 1.4790 - classification_loss: 0.3108 409/500 [=======================>......] - ETA: 22s - loss: 1.7899 - regression_loss: 1.4788 - classification_loss: 0.3111 410/500 [=======================>......] - ETA: 22s - loss: 1.7888 - regression_loss: 1.4779 - classification_loss: 0.3109 411/500 [=======================>......] - ETA: 22s - loss: 1.7887 - regression_loss: 1.4779 - classification_loss: 0.3108 412/500 [=======================>......] - ETA: 22s - loss: 1.7889 - regression_loss: 1.4780 - classification_loss: 0.3109 413/500 [=======================>......] - ETA: 21s - loss: 1.7879 - regression_loss: 1.4767 - classification_loss: 0.3112 414/500 [=======================>......] 
- ETA: 21s - loss: 1.7860 - regression_loss: 1.4752 - classification_loss: 0.3108 415/500 [=======================>......] - ETA: 21s - loss: 1.7865 - regression_loss: 1.4757 - classification_loss: 0.3108 416/500 [=======================>......] - ETA: 21s - loss: 1.7859 - regression_loss: 1.4753 - classification_loss: 0.3106 417/500 [========================>.....] - ETA: 20s - loss: 1.7870 - regression_loss: 1.4761 - classification_loss: 0.3110 418/500 [========================>.....] - ETA: 20s - loss: 1.7864 - regression_loss: 1.4756 - classification_loss: 0.3107 419/500 [========================>.....] - ETA: 20s - loss: 1.7887 - regression_loss: 1.4777 - classification_loss: 0.3111 420/500 [========================>.....] - ETA: 20s - loss: 1.7899 - regression_loss: 1.4787 - classification_loss: 0.3112 421/500 [========================>.....] - ETA: 19s - loss: 1.7902 - regression_loss: 1.4789 - classification_loss: 0.3113 422/500 [========================>.....] - ETA: 19s - loss: 1.7880 - regression_loss: 1.4772 - classification_loss: 0.3108 423/500 [========================>.....] - ETA: 19s - loss: 1.7883 - regression_loss: 1.4771 - classification_loss: 0.3112 424/500 [========================>.....] - ETA: 19s - loss: 1.7888 - regression_loss: 1.4773 - classification_loss: 0.3115 425/500 [========================>.....] - ETA: 18s - loss: 1.7891 - regression_loss: 1.4771 - classification_loss: 0.3120 426/500 [========================>.....] - ETA: 18s - loss: 1.7889 - regression_loss: 1.4770 - classification_loss: 0.3119 427/500 [========================>.....] - ETA: 18s - loss: 1.7879 - regression_loss: 1.4759 - classification_loss: 0.3120 428/500 [========================>.....] - ETA: 18s - loss: 1.7876 - regression_loss: 1.4756 - classification_loss: 0.3119 429/500 [========================>.....] - ETA: 17s - loss: 1.7893 - regression_loss: 1.4768 - classification_loss: 0.3124 430/500 [========================>.....] 
- ETA: 17s - loss: 1.7878 - regression_loss: 1.4757 - classification_loss: 0.3121 431/500 [========================>.....] - ETA: 17s - loss: 1.7902 - regression_loss: 1.4775 - classification_loss: 0.3127 432/500 [========================>.....] - ETA: 17s - loss: 1.7910 - regression_loss: 1.4780 - classification_loss: 0.3129 433/500 [========================>.....] - ETA: 16s - loss: 1.7913 - regression_loss: 1.4784 - classification_loss: 0.3129 434/500 [=========================>....] - ETA: 16s - loss: 1.7911 - regression_loss: 1.4783 - classification_loss: 0.3129 435/500 [=========================>....] - ETA: 16s - loss: 1.7921 - regression_loss: 1.4789 - classification_loss: 0.3132 436/500 [=========================>....] - ETA: 16s - loss: 1.7919 - regression_loss: 1.4789 - classification_loss: 0.3130 437/500 [=========================>....] - ETA: 15s - loss: 1.7925 - regression_loss: 1.4794 - classification_loss: 0.3131 438/500 [=========================>....] - ETA: 15s - loss: 1.7929 - regression_loss: 1.4799 - classification_loss: 0.3130 439/500 [=========================>....] - ETA: 15s - loss: 1.7926 - regression_loss: 1.4797 - classification_loss: 0.3129 440/500 [=========================>....] - ETA: 15s - loss: 1.7906 - regression_loss: 1.4782 - classification_loss: 0.3124 441/500 [=========================>....] - ETA: 14s - loss: 1.7911 - regression_loss: 1.4787 - classification_loss: 0.3124 442/500 [=========================>....] - ETA: 14s - loss: 1.7902 - regression_loss: 1.4780 - classification_loss: 0.3122 443/500 [=========================>....] - ETA: 14s - loss: 1.7906 - regression_loss: 1.4781 - classification_loss: 0.3125 444/500 [=========================>....] - ETA: 14s - loss: 1.7899 - regression_loss: 1.4777 - classification_loss: 0.3123 445/500 [=========================>....] - ETA: 13s - loss: 1.7888 - regression_loss: 1.4763 - classification_loss: 0.3125 446/500 [=========================>....] 
- ETA: 13s - loss: 1.7893 - regression_loss: 1.4768 - classification_loss: 0.3125 447/500 [=========================>....] - ETA: 13s - loss: 1.7902 - regression_loss: 1.4776 - classification_loss: 0.3126 448/500 [=========================>....] - ETA: 13s - loss: 1.7904 - regression_loss: 1.4777 - classification_loss: 0.3127 449/500 [=========================>....] - ETA: 12s - loss: 1.7903 - regression_loss: 1.4777 - classification_loss: 0.3126 450/500 [==========================>...] - ETA: 12s - loss: 1.7908 - regression_loss: 1.4781 - classification_loss: 0.3127 451/500 [==========================>...] - ETA: 12s - loss: 1.7901 - regression_loss: 1.4777 - classification_loss: 0.3125 452/500 [==========================>...] - ETA: 12s - loss: 1.7893 - regression_loss: 1.4768 - classification_loss: 0.3125 453/500 [==========================>...] - ETA: 11s - loss: 1.7891 - regression_loss: 1.4766 - classification_loss: 0.3125 454/500 [==========================>...] - ETA: 11s - loss: 1.7896 - regression_loss: 1.4772 - classification_loss: 0.3124 455/500 [==========================>...] - ETA: 11s - loss: 1.7875 - regression_loss: 1.4755 - classification_loss: 0.3120 456/500 [==========================>...] - ETA: 11s - loss: 1.7872 - regression_loss: 1.4752 - classification_loss: 0.3120 457/500 [==========================>...] - ETA: 10s - loss: 1.7862 - regression_loss: 1.4744 - classification_loss: 0.3118 458/500 [==========================>...] - ETA: 10s - loss: 1.7853 - regression_loss: 1.4738 - classification_loss: 0.3115 459/500 [==========================>...] - ETA: 10s - loss: 1.7839 - regression_loss: 1.4727 - classification_loss: 0.3112 460/500 [==========================>...] - ETA: 10s - loss: 1.7843 - regression_loss: 1.4731 - classification_loss: 0.3112 461/500 [==========================>...] - ETA: 9s - loss: 1.7856 - regression_loss: 1.4740 - classification_loss: 0.3115  462/500 [==========================>...] 
- ETA: 9s - loss: 1.7864 - regression_loss: 1.4747 - classification_loss: 0.3117 463/500 [==========================>...] - ETA: 9s - loss: 1.7865 - regression_loss: 1.4749 - classification_loss: 0.3117 464/500 [==========================>...] - ETA: 9s - loss: 1.7854 - regression_loss: 1.4741 - classification_loss: 0.3114 465/500 [==========================>...] - ETA: 8s - loss: 1.7854 - regression_loss: 1.4741 - classification_loss: 0.3113 466/500 [==========================>...] - ETA: 8s - loss: 1.7849 - regression_loss: 1.4739 - classification_loss: 0.3111 467/500 [===========================>..] - ETA: 8s - loss: 1.7845 - regression_loss: 1.4735 - classification_loss: 0.3109 468/500 [===========================>..] - ETA: 8s - loss: 1.7844 - regression_loss: 1.4737 - classification_loss: 0.3108 469/500 [===========================>..] - ETA: 7s - loss: 1.7849 - regression_loss: 1.4741 - classification_loss: 0.3109 470/500 [===========================>..] - ETA: 7s - loss: 1.7850 - regression_loss: 1.4741 - classification_loss: 0.3109 471/500 [===========================>..] - ETA: 7s - loss: 1.7866 - regression_loss: 1.4752 - classification_loss: 0.3114 472/500 [===========================>..] - ETA: 7s - loss: 1.7880 - regression_loss: 1.4762 - classification_loss: 0.3118 473/500 [===========================>..] - ETA: 6s - loss: 1.7879 - regression_loss: 1.4762 - classification_loss: 0.3117 474/500 [===========================>..] - ETA: 6s - loss: 1.7881 - regression_loss: 1.4762 - classification_loss: 0.3119 475/500 [===========================>..] - ETA: 6s - loss: 1.7889 - regression_loss: 1.4770 - classification_loss: 0.3119 476/500 [===========================>..] - ETA: 6s - loss: 1.7887 - regression_loss: 1.4769 - classification_loss: 0.3117 477/500 [===========================>..] - ETA: 5s - loss: 1.7885 - regression_loss: 1.4768 - classification_loss: 0.3117 478/500 [===========================>..] 
- ETA: 5s - loss: 1.7896 - regression_loss: 1.4778 - classification_loss: 0.3118 479/500 [===========================>..] - ETA: 5s - loss: 1.7874 - regression_loss: 1.4761 - classification_loss: 0.3113 480/500 [===========================>..] - ETA: 5s - loss: 1.7878 - regression_loss: 1.4763 - classification_loss: 0.3114 481/500 [===========================>..] - ETA: 4s - loss: 1.7878 - regression_loss: 1.4764 - classification_loss: 0.3114 482/500 [===========================>..] - ETA: 4s - loss: 1.7875 - regression_loss: 1.4762 - classification_loss: 0.3113 483/500 [===========================>..] - ETA: 4s - loss: 1.7888 - regression_loss: 1.4773 - classification_loss: 0.3115 484/500 [============================>.] - ETA: 4s - loss: 1.7885 - regression_loss: 1.4771 - classification_loss: 0.3113 485/500 [============================>.] - ETA: 3s - loss: 1.7871 - regression_loss: 1.4760 - classification_loss: 0.3110 486/500 [============================>.] - ETA: 3s - loss: 1.7870 - regression_loss: 1.4761 - classification_loss: 0.3109 487/500 [============================>.] - ETA: 3s - loss: 1.7875 - regression_loss: 1.4764 - classification_loss: 0.3110 488/500 [============================>.] - ETA: 3s - loss: 1.7886 - regression_loss: 1.4774 - classification_loss: 0.3112 489/500 [============================>.] - ETA: 2s - loss: 1.7889 - regression_loss: 1.4777 - classification_loss: 0.3111 490/500 [============================>.] - ETA: 2s - loss: 1.7891 - regression_loss: 1.4781 - classification_loss: 0.3111 491/500 [============================>.] - ETA: 2s - loss: 1.7873 - regression_loss: 1.4765 - classification_loss: 0.3107 492/500 [============================>.] - ETA: 2s - loss: 1.7877 - regression_loss: 1.4769 - classification_loss: 0.3108 493/500 [============================>.] - ETA: 1s - loss: 1.7874 - regression_loss: 1.4770 - classification_loss: 0.3105 494/500 [============================>.] 
- ETA: 1s - loss: 1.7879 - regression_loss: 1.4774 - classification_loss: 0.3106 495/500 [============================>.] - ETA: 1s - loss: 1.7887 - regression_loss: 1.4781 - classification_loss: 0.3106 496/500 [============================>.] - ETA: 1s - loss: 1.7893 - regression_loss: 1.4786 - classification_loss: 0.3107 497/500 [============================>.] - ETA: 0s - loss: 1.7893 - regression_loss: 1.4786 - classification_loss: 0.3106 498/500 [============================>.] - ETA: 0s - loss: 1.7912 - regression_loss: 1.4804 - classification_loss: 0.3108 499/500 [============================>.] - ETA: 0s - loss: 1.7918 - regression_loss: 1.4809 - classification_loss: 0.3109 500/500 [==============================] - 125s 250ms/step - loss: 1.7923 - regression_loss: 1.4813 - classification_loss: 0.3110 1172 instances of class plum with average precision: 0.6172 mAP: 0.6172 Epoch 00061: saving model to ./training/snapshots/resnet50_pascal_61.h5 Epoch 62/150 1/500 [..............................] - ETA: 2:01 - loss: 2.3915 - regression_loss: 1.9512 - classification_loss: 0.4403 2/500 [..............................] - ETA: 2:06 - loss: 2.1783 - regression_loss: 1.8011 - classification_loss: 0.3772 3/500 [..............................] - ETA: 2:05 - loss: 2.0948 - regression_loss: 1.7502 - classification_loss: 0.3446 4/500 [..............................] - ETA: 2:04 - loss: 2.0603 - regression_loss: 1.7246 - classification_loss: 0.3357 5/500 [..............................] - ETA: 2:03 - loss: 2.0507 - regression_loss: 1.7114 - classification_loss: 0.3393 6/500 [..............................] - ETA: 2:04 - loss: 2.0144 - regression_loss: 1.6807 - classification_loss: 0.3337 7/500 [..............................] - ETA: 2:04 - loss: 2.0007 - regression_loss: 1.6635 - classification_loss: 0.3372 8/500 [..............................] - ETA: 2:03 - loss: 1.8843 - regression_loss: 1.5505 - classification_loss: 0.3338 9/500 [..............................] 
- ETA: 2:03 - loss: 1.8665 - regression_loss: 1.5380 - classification_loss: 0.3285 10/500 [..............................] - ETA: 2:03 - loss: 1.8044 - regression_loss: 1.4914 - classification_loss: 0.3131 11/500 [..............................] - ETA: 2:02 - loss: 1.7692 - regression_loss: 1.4523 - classification_loss: 0.3170 12/500 [..............................] - ETA: 2:02 - loss: 1.7592 - regression_loss: 1.4488 - classification_loss: 0.3104 13/500 [..............................] - ETA: 2:01 - loss: 1.7404 - regression_loss: 1.4352 - classification_loss: 0.3052 14/500 [..............................] - ETA: 2:01 - loss: 1.7440 - regression_loss: 1.4351 - classification_loss: 0.3089 15/500 [..............................] - ETA: 2:00 - loss: 1.7316 - regression_loss: 1.4260 - classification_loss: 0.3055 16/500 [..............................] - ETA: 2:00 - loss: 1.7366 - regression_loss: 1.4339 - classification_loss: 0.3027 17/500 [>.............................] - ETA: 2:00 - loss: 1.7619 - regression_loss: 1.4546 - classification_loss: 0.3073 18/500 [>.............................] - ETA: 2:00 - loss: 1.7640 - regression_loss: 1.4530 - classification_loss: 0.3110 19/500 [>.............................] - ETA: 2:00 - loss: 1.7995 - regression_loss: 1.4649 - classification_loss: 0.3346 20/500 [>.............................] - ETA: 1:59 - loss: 1.7951 - regression_loss: 1.4581 - classification_loss: 0.3370 21/500 [>.............................] - ETA: 1:59 - loss: 1.8051 - regression_loss: 1.4687 - classification_loss: 0.3364 22/500 [>.............................] - ETA: 1:59 - loss: 1.7988 - regression_loss: 1.4635 - classification_loss: 0.3353 23/500 [>.............................] - ETA: 1:59 - loss: 1.8024 - regression_loss: 1.4673 - classification_loss: 0.3352 24/500 [>.............................] - ETA: 1:59 - loss: 1.8180 - regression_loss: 1.4778 - classification_loss: 0.3401 25/500 [>.............................] 
- ETA: 1:58 - loss: 1.7946 - regression_loss: 1.4592 - classification_loss: 0.3354 26/500 [>.............................] - ETA: 1:58 - loss: 1.8192 - regression_loss: 1.4798 - classification_loss: 0.3394 27/500 [>.............................] - ETA: 1:58 - loss: 1.8274 - regression_loss: 1.4878 - classification_loss: 0.3396 28/500 [>.............................] - ETA: 1:58 - loss: 1.8226 - regression_loss: 1.4856 - classification_loss: 0.3370 29/500 [>.............................] - ETA: 1:57 - loss: 1.8271 - regression_loss: 1.4900 - classification_loss: 0.3372 30/500 [>.............................] - ETA: 1:57 - loss: 1.8020 - regression_loss: 1.4712 - classification_loss: 0.3308 31/500 [>.............................] - ETA: 1:57 - loss: 1.7970 - regression_loss: 1.4661 - classification_loss: 0.3309 32/500 [>.............................] - ETA: 1:57 - loss: 1.7970 - regression_loss: 1.4659 - classification_loss: 0.3311 33/500 [>.............................] - ETA: 1:56 - loss: 1.8149 - regression_loss: 1.4826 - classification_loss: 0.3323 34/500 [=>............................] - ETA: 1:56 - loss: 1.8348 - regression_loss: 1.4979 - classification_loss: 0.3368 35/500 [=>............................] - ETA: 1:56 - loss: 1.8445 - regression_loss: 1.5076 - classification_loss: 0.3369 36/500 [=>............................] - ETA: 1:56 - loss: 1.8396 - regression_loss: 1.5040 - classification_loss: 0.3355 37/500 [=>............................] - ETA: 1:56 - loss: 1.8394 - regression_loss: 1.5049 - classification_loss: 0.3345 38/500 [=>............................] - ETA: 1:55 - loss: 1.8452 - regression_loss: 1.5106 - classification_loss: 0.3347 39/500 [=>............................] - ETA: 1:55 - loss: 1.8348 - regression_loss: 1.5023 - classification_loss: 0.3325 40/500 [=>............................] - ETA: 1:55 - loss: 1.8243 - regression_loss: 1.4938 - classification_loss: 0.3305 41/500 [=>............................] 
- ETA: 1:55 - loss: 1.8062 - regression_loss: 1.4820 - classification_loss: 0.3242 42/500 [=>............................] - ETA: 1:54 - loss: 1.8158 - regression_loss: 1.4898 - classification_loss: 0.3260 43/500 [=>............................] - ETA: 1:54 - loss: 1.8248 - regression_loss: 1.4975 - classification_loss: 0.3274 44/500 [=>............................] - ETA: 1:54 - loss: 1.8221 - regression_loss: 1.4952 - classification_loss: 0.3268 45/500 [=>............................] - ETA: 1:54 - loss: 1.8219 - regression_loss: 1.4964 - classification_loss: 0.3255 46/500 [=>............................] - ETA: 1:54 - loss: 1.8208 - regression_loss: 1.4962 - classification_loss: 0.3246 47/500 [=>............................] - ETA: 1:53 - loss: 1.8169 - regression_loss: 1.4938 - classification_loss: 0.3231 48/500 [=>............................] - ETA: 1:53 - loss: 1.8087 - regression_loss: 1.4876 - classification_loss: 0.3211 49/500 [=>............................] - ETA: 1:53 - loss: 1.8105 - regression_loss: 1.4887 - classification_loss: 0.3219 50/500 [==>...........................] - ETA: 1:53 - loss: 1.8109 - regression_loss: 1.4885 - classification_loss: 0.3223 51/500 [==>...........................] - ETA: 1:52 - loss: 1.8123 - regression_loss: 1.4920 - classification_loss: 0.3203 52/500 [==>...........................] - ETA: 1:52 - loss: 1.8192 - regression_loss: 1.4915 - classification_loss: 0.3278 53/500 [==>...........................] - ETA: 1:52 - loss: 1.8061 - regression_loss: 1.4761 - classification_loss: 0.3299 54/500 [==>...........................] - ETA: 1:52 - loss: 1.8025 - regression_loss: 1.4728 - classification_loss: 0.3298 55/500 [==>...........................] - ETA: 1:51 - loss: 1.7926 - regression_loss: 1.4656 - classification_loss: 0.3270 56/500 [==>...........................] - ETA: 1:51 - loss: 1.7784 - regression_loss: 1.4540 - classification_loss: 0.3244 57/500 [==>...........................] 
- ETA: 1:51 - loss: 1.7725 - regression_loss: 1.4501 - classification_loss: 0.3224  58/500 [==>...........................] - ETA: 1:51 - loss: 1.7753 - regression_loss: 1.4529 - classification_loss: 0.3224 100/500 [=====>........................] - ETA: 1:40 - loss: 1.7333 - regression_loss: 1.4254 - classification_loss: 0.3079 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7535 - regression_loss: 1.4485 - classification_loss: 0.3050 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7423 - regression_loss: 1.4405 - classification_loss: 0.3018 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7495 - regression_loss: 1.4453 - classification_loss: 0.3042 300/500 [=================>............] - ETA: 50s - loss: 1.7522 - regression_loss: 1.4470 - classification_loss: 0.3052 350/500 [====================>.........] - ETA: 37s - loss: 1.7471 - regression_loss: 1.4453 - classification_loss: 0.3017 392/500 [======================>.......] - ETA: 27s - loss: 1.7616 - regression_loss: 1.4563 - classification_loss: 0.3054 393/500 [======================>.......] 
- ETA: 26s - loss: 1.7619 - regression_loss: 1.4565 - classification_loss: 0.3054 394/500 [======================>.......] - ETA: 26s - loss: 1.7616 - regression_loss: 1.4562 - classification_loss: 0.3053 395/500 [======================>.......] - ETA: 26s - loss: 1.7614 - regression_loss: 1.4562 - classification_loss: 0.3052 396/500 [======================>.......] - ETA: 26s - loss: 1.7603 - regression_loss: 1.4554 - classification_loss: 0.3050 397/500 [======================>.......] - ETA: 25s - loss: 1.7602 - regression_loss: 1.4553 - classification_loss: 0.3049 398/500 [======================>.......] - ETA: 25s - loss: 1.7592 - regression_loss: 1.4548 - classification_loss: 0.3045 399/500 [======================>.......] - ETA: 25s - loss: 1.7608 - regression_loss: 1.4558 - classification_loss: 0.3050 400/500 [=======================>......] - ETA: 25s - loss: 1.7591 - regression_loss: 1.4545 - classification_loss: 0.3045 401/500 [=======================>......] - ETA: 24s - loss: 1.7598 - regression_loss: 1.4551 - classification_loss: 0.3047 402/500 [=======================>......] - ETA: 24s - loss: 1.7603 - regression_loss: 1.4555 - classification_loss: 0.3049 403/500 [=======================>......] - ETA: 24s - loss: 1.7579 - regression_loss: 1.4533 - classification_loss: 0.3046 404/500 [=======================>......] - ETA: 24s - loss: 1.7588 - regression_loss: 1.4541 - classification_loss: 0.3048 405/500 [=======================>......] - ETA: 23s - loss: 1.7571 - regression_loss: 1.4526 - classification_loss: 0.3045 406/500 [=======================>......] - ETA: 23s - loss: 1.7557 - regression_loss: 1.4516 - classification_loss: 0.3041 407/500 [=======================>......] - ETA: 23s - loss: 1.7551 - regression_loss: 1.4513 - classification_loss: 0.3038 408/500 [=======================>......] - ETA: 23s - loss: 1.7548 - regression_loss: 1.4510 - classification_loss: 0.3037 409/500 [=======================>......] 
- ETA: 22s - loss: 1.7537 - regression_loss: 1.4502 - classification_loss: 0.3035 410/500 [=======================>......] - ETA: 22s - loss: 1.7550 - regression_loss: 1.4513 - classification_loss: 0.3037 411/500 [=======================>......] - ETA: 22s - loss: 1.7562 - regression_loss: 1.4523 - classification_loss: 0.3039 412/500 [=======================>......] - ETA: 22s - loss: 1.7546 - regression_loss: 1.4512 - classification_loss: 0.3035 413/500 [=======================>......] - ETA: 21s - loss: 1.7549 - regression_loss: 1.4514 - classification_loss: 0.3035 414/500 [=======================>......] - ETA: 21s - loss: 1.7560 - regression_loss: 1.4522 - classification_loss: 0.3038 415/500 [=======================>......] - ETA: 21s - loss: 1.7575 - regression_loss: 1.4531 - classification_loss: 0.3043 416/500 [=======================>......] - ETA: 21s - loss: 1.7586 - regression_loss: 1.4540 - classification_loss: 0.3046 417/500 [========================>.....] - ETA: 20s - loss: 1.7580 - regression_loss: 1.4536 - classification_loss: 0.3044 418/500 [========================>.....] - ETA: 20s - loss: 1.7573 - regression_loss: 1.4530 - classification_loss: 0.3043 419/500 [========================>.....] - ETA: 20s - loss: 1.7580 - regression_loss: 1.4536 - classification_loss: 0.3044 420/500 [========================>.....] - ETA: 20s - loss: 1.7565 - regression_loss: 1.4524 - classification_loss: 0.3041 421/500 [========================>.....] - ETA: 19s - loss: 1.7553 - regression_loss: 1.4514 - classification_loss: 0.3039 422/500 [========================>.....] - ETA: 19s - loss: 1.7555 - regression_loss: 1.4516 - classification_loss: 0.3039 423/500 [========================>.....] - ETA: 19s - loss: 1.7549 - regression_loss: 1.4510 - classification_loss: 0.3039 424/500 [========================>.....] - ETA: 19s - loss: 1.7551 - regression_loss: 1.4512 - classification_loss: 0.3038 425/500 [========================>.....] 
- ETA: 18s - loss: 1.7563 - regression_loss: 1.4524 - classification_loss: 0.3039 426/500 [========================>.....] - ETA: 18s - loss: 1.7569 - regression_loss: 1.4530 - classification_loss: 0.3039 427/500 [========================>.....] - ETA: 18s - loss: 1.7585 - regression_loss: 1.4542 - classification_loss: 0.3043 428/500 [========================>.....] - ETA: 18s - loss: 1.7567 - regression_loss: 1.4528 - classification_loss: 0.3039 429/500 [========================>.....] - ETA: 17s - loss: 1.7571 - regression_loss: 1.4530 - classification_loss: 0.3040 430/500 [========================>.....] - ETA: 17s - loss: 1.7578 - regression_loss: 1.4537 - classification_loss: 0.3041 431/500 [========================>.....] - ETA: 17s - loss: 1.7582 - regression_loss: 1.4542 - classification_loss: 0.3041 432/500 [========================>.....] - ETA: 17s - loss: 1.7571 - regression_loss: 1.4530 - classification_loss: 0.3041 433/500 [========================>.....] - ETA: 16s - loss: 1.7571 - regression_loss: 1.4529 - classification_loss: 0.3042 434/500 [=========================>....] - ETA: 16s - loss: 1.7590 - regression_loss: 1.4544 - classification_loss: 0.3046 435/500 [=========================>....] - ETA: 16s - loss: 1.7576 - regression_loss: 1.4534 - classification_loss: 0.3043 436/500 [=========================>....] - ETA: 16s - loss: 1.7590 - regression_loss: 1.4546 - classification_loss: 0.3044 437/500 [=========================>....] - ETA: 15s - loss: 1.7592 - regression_loss: 1.4542 - classification_loss: 0.3050 438/500 [=========================>....] - ETA: 15s - loss: 1.7587 - regression_loss: 1.4537 - classification_loss: 0.3049 439/500 [=========================>....] - ETA: 15s - loss: 1.7595 - regression_loss: 1.4546 - classification_loss: 0.3050 440/500 [=========================>....] - ETA: 15s - loss: 1.7594 - regression_loss: 1.4546 - classification_loss: 0.3049 441/500 [=========================>....] 
- ETA: 14s - loss: 1.7595 - regression_loss: 1.4545 - classification_loss: 0.3050 442/500 [=========================>....] - ETA: 14s - loss: 1.7589 - regression_loss: 1.4540 - classification_loss: 0.3049 443/500 [=========================>....] - ETA: 14s - loss: 1.7589 - regression_loss: 1.4540 - classification_loss: 0.3049 444/500 [=========================>....] - ETA: 14s - loss: 1.7590 - regression_loss: 1.4539 - classification_loss: 0.3050 445/500 [=========================>....] - ETA: 13s - loss: 1.7589 - regression_loss: 1.4538 - classification_loss: 0.3051 446/500 [=========================>....] - ETA: 13s - loss: 1.7602 - regression_loss: 1.4549 - classification_loss: 0.3053 447/500 [=========================>....] - ETA: 13s - loss: 1.7605 - regression_loss: 1.4550 - classification_loss: 0.3055 448/500 [=========================>....] - ETA: 13s - loss: 1.7591 - regression_loss: 1.4539 - classification_loss: 0.3052 449/500 [=========================>....] - ETA: 12s - loss: 1.7601 - regression_loss: 1.4549 - classification_loss: 0.3052 450/500 [==========================>...] - ETA: 12s - loss: 1.7586 - regression_loss: 1.4538 - classification_loss: 0.3049 451/500 [==========================>...] - ETA: 12s - loss: 1.7583 - regression_loss: 1.4536 - classification_loss: 0.3047 452/500 [==========================>...] - ETA: 12s - loss: 1.7586 - regression_loss: 1.4536 - classification_loss: 0.3050 453/500 [==========================>...] - ETA: 11s - loss: 1.7604 - regression_loss: 1.4547 - classification_loss: 0.3057 454/500 [==========================>...] - ETA: 11s - loss: 1.7607 - regression_loss: 1.4550 - classification_loss: 0.3058 455/500 [==========================>...] - ETA: 11s - loss: 1.7593 - regression_loss: 1.4537 - classification_loss: 0.3056 456/500 [==========================>...] - ETA: 11s - loss: 1.7598 - regression_loss: 1.4542 - classification_loss: 0.3056 457/500 [==========================>...] 
- ETA: 10s - loss: 1.7603 - regression_loss: 1.4545 - classification_loss: 0.3057 458/500 [==========================>...] - ETA: 10s - loss: 1.7615 - regression_loss: 1.4553 - classification_loss: 0.3063 459/500 [==========================>...] - ETA: 10s - loss: 1.7620 - regression_loss: 1.4557 - classification_loss: 0.3063 460/500 [==========================>...] - ETA: 10s - loss: 1.7613 - regression_loss: 1.4551 - classification_loss: 0.3062 461/500 [==========================>...] - ETA: 9s - loss: 1.7611 - regression_loss: 1.4549 - classification_loss: 0.3062  462/500 [==========================>...] - ETA: 9s - loss: 1.7610 - regression_loss: 1.4549 - classification_loss: 0.3061 463/500 [==========================>...] - ETA: 9s - loss: 1.7623 - regression_loss: 1.4559 - classification_loss: 0.3064 464/500 [==========================>...] - ETA: 9s - loss: 1.7624 - regression_loss: 1.4561 - classification_loss: 0.3064 465/500 [==========================>...] - ETA: 8s - loss: 1.7617 - regression_loss: 1.4556 - classification_loss: 0.3061 466/500 [==========================>...] - ETA: 8s - loss: 1.7614 - regression_loss: 1.4554 - classification_loss: 0.3060 467/500 [===========================>..] - ETA: 8s - loss: 1.7602 - regression_loss: 1.4545 - classification_loss: 0.3057 468/500 [===========================>..] - ETA: 8s - loss: 1.7607 - regression_loss: 1.4549 - classification_loss: 0.3058 469/500 [===========================>..] - ETA: 7s - loss: 1.7615 - regression_loss: 1.4557 - classification_loss: 0.3058 470/500 [===========================>..] - ETA: 7s - loss: 1.7615 - regression_loss: 1.4558 - classification_loss: 0.3058 471/500 [===========================>..] - ETA: 7s - loss: 1.7620 - regression_loss: 1.4562 - classification_loss: 0.3058 472/500 [===========================>..] - ETA: 7s - loss: 1.7623 - regression_loss: 1.4562 - classification_loss: 0.3061 473/500 [===========================>..] 
- ETA: 6s - loss: 1.7628 - regression_loss: 1.4565 - classification_loss: 0.3063 474/500 [===========================>..] - ETA: 6s - loss: 1.7627 - regression_loss: 1.4565 - classification_loss: 0.3061 475/500 [===========================>..] - ETA: 6s - loss: 1.7625 - regression_loss: 1.4565 - classification_loss: 0.3060 476/500 [===========================>..] - ETA: 6s - loss: 1.7632 - regression_loss: 1.4571 - classification_loss: 0.3061 477/500 [===========================>..] - ETA: 5s - loss: 1.7620 - regression_loss: 1.4562 - classification_loss: 0.3058 478/500 [===========================>..] - ETA: 5s - loss: 1.7638 - regression_loss: 1.4578 - classification_loss: 0.3060 479/500 [===========================>..] - ETA: 5s - loss: 1.7639 - regression_loss: 1.4580 - classification_loss: 0.3059 480/500 [===========================>..] - ETA: 5s - loss: 1.7640 - regression_loss: 1.4582 - classification_loss: 0.3058 481/500 [===========================>..] - ETA: 4s - loss: 1.7646 - regression_loss: 1.4586 - classification_loss: 0.3060 482/500 [===========================>..] - ETA: 4s - loss: 1.7653 - regression_loss: 1.4591 - classification_loss: 0.3062 483/500 [===========================>..] - ETA: 4s - loss: 1.7655 - regression_loss: 1.4592 - classification_loss: 0.3064 484/500 [============================>.] - ETA: 4s - loss: 1.7641 - regression_loss: 1.4581 - classification_loss: 0.3060 485/500 [============================>.] - ETA: 3s - loss: 1.7645 - regression_loss: 1.4586 - classification_loss: 0.3059 486/500 [============================>.] - ETA: 3s - loss: 1.7643 - regression_loss: 1.4585 - classification_loss: 0.3058 487/500 [============================>.] - ETA: 3s - loss: 1.7650 - regression_loss: 1.4593 - classification_loss: 0.3057 488/500 [============================>.] - ETA: 3s - loss: 1.7634 - regression_loss: 1.4579 - classification_loss: 0.3055 489/500 [============================>.] 
500/500 [==============================] - 125s 251ms/step - loss: 1.7633 - regression_loss: 1.4581 - classification_loss: 0.3052
1172 instances of class plum with average precision: 0.6066
mAP: 0.6066
Epoch 00062: saving model to ./training/snapshots/resnet50_pascal_62.h5
Epoch 63/150
[Epoch 63/150: intermediate progress frames for batches 1-164 elided; loss fluctuated between ~1.71 and ~1.88 over the first 50 batches, then settled near 1.78 (regression_loss ~1.47, classification_loss ~0.306) by batch 164.]
- ETA: 1:23 - loss: 1.7720 - regression_loss: 1.4665 - classification_loss: 0.3055 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7737 - regression_loss: 1.4682 - classification_loss: 0.3055 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7740 - regression_loss: 1.4684 - classification_loss: 0.3056 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7733 - regression_loss: 1.4682 - classification_loss: 0.3051 168/500 [=========>....................] - ETA: 1:23 - loss: 1.7728 - regression_loss: 1.4678 - classification_loss: 0.3050 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7742 - regression_loss: 1.4687 - classification_loss: 0.3055 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7697 - regression_loss: 1.4648 - classification_loss: 0.3048 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7668 - regression_loss: 1.4625 - classification_loss: 0.3042 172/500 [=========>....................] - ETA: 1:22 - loss: 1.7711 - regression_loss: 1.4661 - classification_loss: 0.3050 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7720 - regression_loss: 1.4670 - classification_loss: 0.3050 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7736 - regression_loss: 1.4683 - classification_loss: 0.3053 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7723 - regression_loss: 1.4670 - classification_loss: 0.3053 176/500 [=========>....................] - ETA: 1:21 - loss: 1.7727 - regression_loss: 1.4675 - classification_loss: 0.3052 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7717 - regression_loss: 1.4668 - classification_loss: 0.3050 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7713 - regression_loss: 1.4660 - classification_loss: 0.3053 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7712 - regression_loss: 1.4663 - classification_loss: 0.3049 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.7762 - regression_loss: 1.4702 - classification_loss: 0.3060 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7733 - regression_loss: 1.4676 - classification_loss: 0.3057 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7757 - regression_loss: 1.4681 - classification_loss: 0.3076 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7754 - regression_loss: 1.4679 - classification_loss: 0.3075 184/500 [==========>...................] - ETA: 1:19 - loss: 1.7752 - regression_loss: 1.4680 - classification_loss: 0.3073 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7768 - regression_loss: 1.4695 - classification_loss: 0.3074 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7789 - regression_loss: 1.4713 - classification_loss: 0.3076 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7734 - regression_loss: 1.4667 - classification_loss: 0.3067 188/500 [==========>...................] - ETA: 1:18 - loss: 1.7739 - regression_loss: 1.4672 - classification_loss: 0.3067 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7749 - regression_loss: 1.4684 - classification_loss: 0.3065 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7780 - regression_loss: 1.4709 - classification_loss: 0.3071 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7772 - regression_loss: 1.4708 - classification_loss: 0.3065 192/500 [==========>...................] - ETA: 1:17 - loss: 1.7776 - regression_loss: 1.4708 - classification_loss: 0.3069 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7786 - regression_loss: 1.4716 - classification_loss: 0.3070 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7784 - regression_loss: 1.4714 - classification_loss: 0.3070 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7789 - regression_loss: 1.4717 - classification_loss: 0.3072 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.7773 - regression_loss: 1.4696 - classification_loss: 0.3077 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7802 - regression_loss: 1.4719 - classification_loss: 0.3083 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7824 - regression_loss: 1.4740 - classification_loss: 0.3085 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7892 - regression_loss: 1.4786 - classification_loss: 0.3106 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7888 - regression_loss: 1.4784 - classification_loss: 0.3104 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7902 - regression_loss: 1.4797 - classification_loss: 0.3105 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7890 - regression_loss: 1.4787 - classification_loss: 0.3103 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7871 - regression_loss: 1.4771 - classification_loss: 0.3100 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7874 - regression_loss: 1.4776 - classification_loss: 0.3097 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7878 - regression_loss: 1.4777 - classification_loss: 0.3101 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7875 - regression_loss: 1.4777 - classification_loss: 0.3099 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7875 - regression_loss: 1.4779 - classification_loss: 0.3096 208/500 [===========>..................] - ETA: 1:13 - loss: 1.7880 - regression_loss: 1.4783 - classification_loss: 0.3097 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7885 - regression_loss: 1.4784 - classification_loss: 0.3101 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7896 - regression_loss: 1.4796 - classification_loss: 0.3100 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7922 - regression_loss: 1.4815 - classification_loss: 0.3107 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.7915 - regression_loss: 1.4809 - classification_loss: 0.3106 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7912 - regression_loss: 1.4806 - classification_loss: 0.3106 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7884 - regression_loss: 1.4785 - classification_loss: 0.3099 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7896 - regression_loss: 1.4793 - classification_loss: 0.3103 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7900 - regression_loss: 1.4796 - classification_loss: 0.3104 217/500 [============>.................] - ETA: 1:10 - loss: 1.7872 - regression_loss: 1.4778 - classification_loss: 0.3094 218/500 [============>.................] - ETA: 1:10 - loss: 1.7833 - regression_loss: 1.4747 - classification_loss: 0.3086 219/500 [============>.................] - ETA: 1:10 - loss: 1.7845 - regression_loss: 1.4759 - classification_loss: 0.3086 220/500 [============>.................] - ETA: 1:10 - loss: 1.7847 - regression_loss: 1.4761 - classification_loss: 0.3086 221/500 [============>.................] - ETA: 1:09 - loss: 1.7840 - regression_loss: 1.4757 - classification_loss: 0.3083 222/500 [============>.................] - ETA: 1:09 - loss: 1.7860 - regression_loss: 1.4771 - classification_loss: 0.3089 223/500 [============>.................] - ETA: 1:09 - loss: 1.7838 - regression_loss: 1.4755 - classification_loss: 0.3083 224/500 [============>.................] - ETA: 1:09 - loss: 1.7846 - regression_loss: 1.4763 - classification_loss: 0.3083 225/500 [============>.................] - ETA: 1:08 - loss: 1.7835 - regression_loss: 1.4756 - classification_loss: 0.3079 226/500 [============>.................] - ETA: 1:08 - loss: 1.7820 - regression_loss: 1.4743 - classification_loss: 0.3077 227/500 [============>.................] - ETA: 1:08 - loss: 1.7826 - regression_loss: 1.4748 - classification_loss: 0.3078 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.7853 - regression_loss: 1.4771 - classification_loss: 0.3082 229/500 [============>.................] - ETA: 1:07 - loss: 1.7857 - regression_loss: 1.4773 - classification_loss: 0.3083 230/500 [============>.................] - ETA: 1:07 - loss: 1.7861 - regression_loss: 1.4779 - classification_loss: 0.3082 231/500 [============>.................] - ETA: 1:07 - loss: 1.7869 - regression_loss: 1.4788 - classification_loss: 0.3081 232/500 [============>.................] - ETA: 1:07 - loss: 1.7851 - regression_loss: 1.4775 - classification_loss: 0.3076 233/500 [============>.................] - ETA: 1:06 - loss: 1.7802 - regression_loss: 1.4737 - classification_loss: 0.3065 234/500 [=============>................] - ETA: 1:06 - loss: 1.7792 - regression_loss: 1.4730 - classification_loss: 0.3063 235/500 [=============>................] - ETA: 1:06 - loss: 1.7798 - regression_loss: 1.4735 - classification_loss: 0.3063 236/500 [=============>................] - ETA: 1:06 - loss: 1.7800 - regression_loss: 1.4734 - classification_loss: 0.3066 237/500 [=============>................] - ETA: 1:05 - loss: 1.7824 - regression_loss: 1.4752 - classification_loss: 0.3072 238/500 [=============>................] - ETA: 1:05 - loss: 1.7818 - regression_loss: 1.4750 - classification_loss: 0.3068 239/500 [=============>................] - ETA: 1:05 - loss: 1.7813 - regression_loss: 1.4747 - classification_loss: 0.3065 240/500 [=============>................] - ETA: 1:05 - loss: 1.7827 - regression_loss: 1.4758 - classification_loss: 0.3069 241/500 [=============>................] - ETA: 1:04 - loss: 1.7797 - regression_loss: 1.4736 - classification_loss: 0.3061 242/500 [=============>................] - ETA: 1:04 - loss: 1.7803 - regression_loss: 1.4740 - classification_loss: 0.3063 243/500 [=============>................] - ETA: 1:04 - loss: 1.7822 - regression_loss: 1.4757 - classification_loss: 0.3064 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.7835 - regression_loss: 1.4770 - classification_loss: 0.3065 245/500 [=============>................] - ETA: 1:03 - loss: 1.7806 - regression_loss: 1.4746 - classification_loss: 0.3060 246/500 [=============>................] - ETA: 1:03 - loss: 1.7835 - regression_loss: 1.4764 - classification_loss: 0.3072 247/500 [=============>................] - ETA: 1:03 - loss: 1.7822 - regression_loss: 1.4752 - classification_loss: 0.3070 248/500 [=============>................] - ETA: 1:03 - loss: 1.7823 - regression_loss: 1.4751 - classification_loss: 0.3071 249/500 [=============>................] - ETA: 1:02 - loss: 1.7849 - regression_loss: 1.4774 - classification_loss: 0.3075 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7856 - regression_loss: 1.4777 - classification_loss: 0.3079 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7866 - regression_loss: 1.4787 - classification_loss: 0.3079 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7831 - regression_loss: 1.4753 - classification_loss: 0.3078 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7838 - regression_loss: 1.4759 - classification_loss: 0.3079 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7832 - regression_loss: 1.4755 - classification_loss: 0.3077 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7826 - regression_loss: 1.4751 - classification_loss: 0.3075 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7801 - regression_loss: 1.4731 - classification_loss: 0.3070 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7816 - regression_loss: 1.4741 - classification_loss: 0.3076 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7778 - regression_loss: 1.4704 - classification_loss: 0.3073 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7770 - regression_loss: 1.4696 - classification_loss: 0.3074 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.7761 - regression_loss: 1.4691 - classification_loss: 0.3070 261/500 [==============>...............] - ETA: 59s - loss: 1.7769 - regression_loss: 1.4696 - classification_loss: 0.3073  262/500 [==============>...............] - ETA: 59s - loss: 1.7752 - regression_loss: 1.4684 - classification_loss: 0.3068 263/500 [==============>...............] - ETA: 59s - loss: 1.7731 - regression_loss: 1.4669 - classification_loss: 0.3061 264/500 [==============>...............] - ETA: 59s - loss: 1.7708 - regression_loss: 1.4652 - classification_loss: 0.3057 265/500 [==============>...............] - ETA: 58s - loss: 1.7741 - regression_loss: 1.4682 - classification_loss: 0.3059 266/500 [==============>...............] - ETA: 58s - loss: 1.7754 - regression_loss: 1.4692 - classification_loss: 0.3063 267/500 [===============>..............] - ETA: 58s - loss: 1.7740 - regression_loss: 1.4683 - classification_loss: 0.3058 268/500 [===============>..............] - ETA: 58s - loss: 1.7716 - regression_loss: 1.4664 - classification_loss: 0.3052 269/500 [===============>..............] - ETA: 57s - loss: 1.7720 - regression_loss: 1.4668 - classification_loss: 0.3052 270/500 [===============>..............] - ETA: 57s - loss: 1.7690 - regression_loss: 1.4645 - classification_loss: 0.3046 271/500 [===============>..............] - ETA: 57s - loss: 1.7669 - regression_loss: 1.4628 - classification_loss: 0.3041 272/500 [===============>..............] - ETA: 57s - loss: 1.7650 - regression_loss: 1.4614 - classification_loss: 0.3037 273/500 [===============>..............] - ETA: 56s - loss: 1.7638 - regression_loss: 1.4606 - classification_loss: 0.3032 274/500 [===============>..............] - ETA: 56s - loss: 1.7662 - regression_loss: 1.4624 - classification_loss: 0.3039 275/500 [===============>..............] - ETA: 56s - loss: 1.7628 - regression_loss: 1.4596 - classification_loss: 0.3032 276/500 [===============>..............] 
- ETA: 56s - loss: 1.7644 - regression_loss: 1.4614 - classification_loss: 0.3030 277/500 [===============>..............] - ETA: 55s - loss: 1.7646 - regression_loss: 1.4617 - classification_loss: 0.3029 278/500 [===============>..............] - ETA: 55s - loss: 1.7625 - regression_loss: 1.4600 - classification_loss: 0.3025 279/500 [===============>..............] - ETA: 55s - loss: 1.7629 - regression_loss: 1.4605 - classification_loss: 0.3024 280/500 [===============>..............] - ETA: 55s - loss: 1.7648 - regression_loss: 1.4626 - classification_loss: 0.3022 281/500 [===============>..............] - ETA: 54s - loss: 1.7656 - regression_loss: 1.4631 - classification_loss: 0.3025 282/500 [===============>..............] - ETA: 54s - loss: 1.7671 - regression_loss: 1.4637 - classification_loss: 0.3035 283/500 [===============>..............] - ETA: 54s - loss: 1.7646 - regression_loss: 1.4617 - classification_loss: 0.3029 284/500 [================>.............] - ETA: 54s - loss: 1.7658 - regression_loss: 1.4630 - classification_loss: 0.3029 285/500 [================>.............] - ETA: 53s - loss: 1.7669 - regression_loss: 1.4639 - classification_loss: 0.3030 286/500 [================>.............] - ETA: 53s - loss: 1.7665 - regression_loss: 1.4637 - classification_loss: 0.3028 287/500 [================>.............] - ETA: 53s - loss: 1.7646 - regression_loss: 1.4621 - classification_loss: 0.3025 288/500 [================>.............] - ETA: 53s - loss: 1.7644 - regression_loss: 1.4619 - classification_loss: 0.3025 289/500 [================>.............] - ETA: 52s - loss: 1.7644 - regression_loss: 1.4616 - classification_loss: 0.3029 290/500 [================>.............] - ETA: 52s - loss: 1.7650 - regression_loss: 1.4618 - classification_loss: 0.3031 291/500 [================>.............] - ETA: 52s - loss: 1.7647 - regression_loss: 1.4616 - classification_loss: 0.3030 292/500 [================>.............] 
- ETA: 52s - loss: 1.7645 - regression_loss: 1.4615 - classification_loss: 0.3029 293/500 [================>.............] - ETA: 51s - loss: 1.7642 - regression_loss: 1.4615 - classification_loss: 0.3028 294/500 [================>.............] - ETA: 51s - loss: 1.7654 - regression_loss: 1.4624 - classification_loss: 0.3030 295/500 [================>.............] - ETA: 51s - loss: 1.7648 - regression_loss: 1.4620 - classification_loss: 0.3028 296/500 [================>.............] - ETA: 51s - loss: 1.7633 - regression_loss: 1.4608 - classification_loss: 0.3025 297/500 [================>.............] - ETA: 50s - loss: 1.7640 - regression_loss: 1.4613 - classification_loss: 0.3026 298/500 [================>.............] - ETA: 50s - loss: 1.7643 - regression_loss: 1.4615 - classification_loss: 0.3028 299/500 [================>.............] - ETA: 50s - loss: 1.7643 - regression_loss: 1.4615 - classification_loss: 0.3028 300/500 [=================>............] - ETA: 50s - loss: 1.7648 - regression_loss: 1.4617 - classification_loss: 0.3030 301/500 [=================>............] - ETA: 49s - loss: 1.7647 - regression_loss: 1.4618 - classification_loss: 0.3029 302/500 [=================>............] - ETA: 49s - loss: 1.7679 - regression_loss: 1.4642 - classification_loss: 0.3036 303/500 [=================>............] - ETA: 49s - loss: 1.7678 - regression_loss: 1.4643 - classification_loss: 0.3035 304/500 [=================>............] - ETA: 49s - loss: 1.7695 - regression_loss: 1.4652 - classification_loss: 0.3042 305/500 [=================>............] - ETA: 48s - loss: 1.7697 - regression_loss: 1.4654 - classification_loss: 0.3043 306/500 [=================>............] - ETA: 48s - loss: 1.7674 - regression_loss: 1.4631 - classification_loss: 0.3043 307/500 [=================>............] - ETA: 48s - loss: 1.7686 - regression_loss: 1.4641 - classification_loss: 0.3046 308/500 [=================>............] 
- ETA: 48s - loss: 1.7679 - regression_loss: 1.4637 - classification_loss: 0.3042 309/500 [=================>............] - ETA: 47s - loss: 1.7646 - regression_loss: 1.4608 - classification_loss: 0.3038 310/500 [=================>............] - ETA: 47s - loss: 1.7629 - regression_loss: 1.4594 - classification_loss: 0.3035 311/500 [=================>............] - ETA: 47s - loss: 1.7625 - regression_loss: 1.4591 - classification_loss: 0.3034 312/500 [=================>............] - ETA: 47s - loss: 1.7635 - regression_loss: 1.4601 - classification_loss: 0.3035 313/500 [=================>............] - ETA: 46s - loss: 1.7655 - regression_loss: 1.4616 - classification_loss: 0.3039 314/500 [=================>............] - ETA: 46s - loss: 1.7634 - regression_loss: 1.4598 - classification_loss: 0.3036 315/500 [=================>............] - ETA: 46s - loss: 1.7626 - regression_loss: 1.4591 - classification_loss: 0.3035 316/500 [=================>............] - ETA: 46s - loss: 1.7617 - regression_loss: 1.4583 - classification_loss: 0.3033 317/500 [==================>...........] - ETA: 45s - loss: 1.7628 - regression_loss: 1.4595 - classification_loss: 0.3033 318/500 [==================>...........] - ETA: 45s - loss: 1.7621 - regression_loss: 1.4589 - classification_loss: 0.3032 319/500 [==================>...........] - ETA: 45s - loss: 1.7588 - regression_loss: 1.4561 - classification_loss: 0.3028 320/500 [==================>...........] - ETA: 45s - loss: 1.7585 - regression_loss: 1.4556 - classification_loss: 0.3029 321/500 [==================>...........] - ETA: 44s - loss: 1.7590 - regression_loss: 1.4561 - classification_loss: 0.3029 322/500 [==================>...........] - ETA: 44s - loss: 1.7585 - regression_loss: 1.4558 - classification_loss: 0.3028 323/500 [==================>...........] - ETA: 44s - loss: 1.7592 - regression_loss: 1.4562 - classification_loss: 0.3031 324/500 [==================>...........] 
- ETA: 44s - loss: 1.7603 - regression_loss: 1.4570 - classification_loss: 0.3034 325/500 [==================>...........] - ETA: 43s - loss: 1.7590 - regression_loss: 1.4559 - classification_loss: 0.3031 326/500 [==================>...........] - ETA: 43s - loss: 1.7553 - regression_loss: 1.4529 - classification_loss: 0.3024 327/500 [==================>...........] - ETA: 43s - loss: 1.7537 - regression_loss: 1.4515 - classification_loss: 0.3022 328/500 [==================>...........] - ETA: 43s - loss: 1.7553 - regression_loss: 1.4526 - classification_loss: 0.3027 329/500 [==================>...........] - ETA: 42s - loss: 1.7568 - regression_loss: 1.4537 - classification_loss: 0.3031 330/500 [==================>...........] - ETA: 42s - loss: 1.7578 - regression_loss: 1.4545 - classification_loss: 0.3033 331/500 [==================>...........] - ETA: 42s - loss: 1.7552 - regression_loss: 1.4524 - classification_loss: 0.3028 332/500 [==================>...........] - ETA: 42s - loss: 1.7551 - regression_loss: 1.4526 - classification_loss: 0.3025 333/500 [==================>...........] - ETA: 41s - loss: 1.7532 - regression_loss: 1.4511 - classification_loss: 0.3021 334/500 [===================>..........] - ETA: 41s - loss: 1.7506 - regression_loss: 1.4491 - classification_loss: 0.3016 335/500 [===================>..........] - ETA: 41s - loss: 1.7521 - regression_loss: 1.4504 - classification_loss: 0.3017 336/500 [===================>..........] - ETA: 41s - loss: 1.7525 - regression_loss: 1.4506 - classification_loss: 0.3019 337/500 [===================>..........] - ETA: 40s - loss: 1.7520 - regression_loss: 1.4501 - classification_loss: 0.3019 338/500 [===================>..........] - ETA: 40s - loss: 1.7518 - regression_loss: 1.4500 - classification_loss: 0.3018 339/500 [===================>..........] - ETA: 40s - loss: 1.7513 - regression_loss: 1.4495 - classification_loss: 0.3018 340/500 [===================>..........] 
- ETA: 40s - loss: 1.7521 - regression_loss: 1.4503 - classification_loss: 0.3018 341/500 [===================>..........] - ETA: 39s - loss: 1.7513 - regression_loss: 1.4498 - classification_loss: 0.3015 342/500 [===================>..........] - ETA: 39s - loss: 1.7538 - regression_loss: 1.4519 - classification_loss: 0.3019 343/500 [===================>..........] - ETA: 39s - loss: 1.7547 - regression_loss: 1.4527 - classification_loss: 0.3020 344/500 [===================>..........] - ETA: 39s - loss: 1.7542 - regression_loss: 1.4524 - classification_loss: 0.3018 345/500 [===================>..........] - ETA: 38s - loss: 1.7544 - regression_loss: 1.4525 - classification_loss: 0.3018 346/500 [===================>..........] - ETA: 38s - loss: 1.7546 - regression_loss: 1.4529 - classification_loss: 0.3017 347/500 [===================>..........] - ETA: 38s - loss: 1.7564 - regression_loss: 1.4543 - classification_loss: 0.3020 348/500 [===================>..........] - ETA: 38s - loss: 1.7569 - regression_loss: 1.4547 - classification_loss: 0.3022 349/500 [===================>..........] - ETA: 37s - loss: 1.7575 - regression_loss: 1.4552 - classification_loss: 0.3023 350/500 [====================>.........] - ETA: 37s - loss: 1.7565 - regression_loss: 1.4544 - classification_loss: 0.3021 351/500 [====================>.........] - ETA: 37s - loss: 1.7581 - regression_loss: 1.4554 - classification_loss: 0.3027 352/500 [====================>.........] - ETA: 37s - loss: 1.7577 - regression_loss: 1.4549 - classification_loss: 0.3028 353/500 [====================>.........] - ETA: 36s - loss: 1.7585 - regression_loss: 1.4553 - classification_loss: 0.3032 354/500 [====================>.........] - ETA: 36s - loss: 1.7579 - regression_loss: 1.4550 - classification_loss: 0.3029 355/500 [====================>.........] - ETA: 36s - loss: 1.7568 - regression_loss: 1.4540 - classification_loss: 0.3028 356/500 [====================>.........] 
- ETA: 36s - loss: 1.7564 - regression_loss: 1.4538 - classification_loss: 0.3026 357/500 [====================>.........] - ETA: 35s - loss: 1.7549 - regression_loss: 1.4526 - classification_loss: 0.3023 358/500 [====================>.........] - ETA: 35s - loss: 1.7545 - regression_loss: 1.4524 - classification_loss: 0.3021 359/500 [====================>.........] - ETA: 35s - loss: 1.7548 - regression_loss: 1.4526 - classification_loss: 0.3021 360/500 [====================>.........] - ETA: 35s - loss: 1.7552 - regression_loss: 1.4531 - classification_loss: 0.3021 361/500 [====================>.........] - ETA: 34s - loss: 1.7537 - regression_loss: 1.4518 - classification_loss: 0.3020 362/500 [====================>.........] - ETA: 34s - loss: 1.7533 - regression_loss: 1.4514 - classification_loss: 0.3019 363/500 [====================>.........] - ETA: 34s - loss: 1.7507 - regression_loss: 1.4494 - classification_loss: 0.3013 364/500 [====================>.........] - ETA: 34s - loss: 1.7518 - regression_loss: 1.4503 - classification_loss: 0.3014 365/500 [====================>.........] - ETA: 33s - loss: 1.7524 - regression_loss: 1.4508 - classification_loss: 0.3016 366/500 [====================>.........] - ETA: 33s - loss: 1.7519 - regression_loss: 1.4506 - classification_loss: 0.3013 367/500 [=====================>........] - ETA: 33s - loss: 1.7509 - regression_loss: 1.4498 - classification_loss: 0.3011 368/500 [=====================>........] - ETA: 33s - loss: 1.7495 - regression_loss: 1.4487 - classification_loss: 0.3008 369/500 [=====================>........] - ETA: 32s - loss: 1.7492 - regression_loss: 1.4485 - classification_loss: 0.3008 370/500 [=====================>........] - ETA: 32s - loss: 1.7531 - regression_loss: 1.4517 - classification_loss: 0.3014 371/500 [=====================>........] - ETA: 32s - loss: 1.7547 - regression_loss: 1.4530 - classification_loss: 0.3017 372/500 [=====================>........] 
500/500 [==============================] - 125s 251ms/step - loss: 1.7521 - regression_loss: 1.4514 - classification_loss: 0.3007
1172 instances of class plum with average precision: 0.6095
mAP: 0.6095
Epoch 00063: saving model to ./training/snapshots/resnet50_pascal_63.h5
Epoch 64/150
206/500 [===========>..................]
- ETA: 1:21 - loss: 1.7890 - regression_loss: 1.4794 - classification_loss: 0.3096 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7879 - regression_loss: 1.4789 - classification_loss: 0.3090 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7878 - regression_loss: 1.4790 - classification_loss: 0.3089 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7863 - regression_loss: 1.4779 - classification_loss: 0.3083 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7872 - regression_loss: 1.4787 - classification_loss: 0.3085 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7846 - regression_loss: 1.4763 - classification_loss: 0.3084 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7876 - regression_loss: 1.4790 - classification_loss: 0.3087 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7886 - regression_loss: 1.4796 - classification_loss: 0.3089 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7899 - regression_loss: 1.4808 - classification_loss: 0.3091 183/500 [=========>....................] - ETA: 1:18 - loss: 1.7888 - regression_loss: 1.4802 - classification_loss: 0.3086 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7884 - regression_loss: 1.4797 - classification_loss: 0.3087 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7892 - regression_loss: 1.4805 - classification_loss: 0.3087 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7895 - regression_loss: 1.4810 - classification_loss: 0.3085 187/500 [==========>...................] - ETA: 1:17 - loss: 1.7871 - regression_loss: 1.4792 - classification_loss: 0.3078 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7886 - regression_loss: 1.4807 - classification_loss: 0.3079 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7901 - regression_loss: 1.4823 - classification_loss: 0.3078 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.7851 - regression_loss: 1.4780 - classification_loss: 0.3071 191/500 [==========>...................] - ETA: 1:16 - loss: 1.7852 - regression_loss: 1.4783 - classification_loss: 0.3068 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7829 - regression_loss: 1.4767 - classification_loss: 0.3062 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7826 - regression_loss: 1.4766 - classification_loss: 0.3060 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7839 - regression_loss: 1.4775 - classification_loss: 0.3064 195/500 [==========>...................] - ETA: 1:15 - loss: 1.7856 - regression_loss: 1.4793 - classification_loss: 0.3064 196/500 [==========>...................] - ETA: 1:15 - loss: 1.7850 - regression_loss: 1.4788 - classification_loss: 0.3062 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7850 - regression_loss: 1.4791 - classification_loss: 0.3059 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7878 - regression_loss: 1.4814 - classification_loss: 0.3064 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7882 - regression_loss: 1.4816 - classification_loss: 0.3066 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7879 - regression_loss: 1.4813 - classification_loss: 0.3066 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7875 - regression_loss: 1.4814 - classification_loss: 0.3061 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7909 - regression_loss: 1.4841 - classification_loss: 0.3068 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7920 - regression_loss: 1.4847 - classification_loss: 0.3073 204/500 [===========>..................] - ETA: 1:13 - loss: 1.7924 - regression_loss: 1.4848 - classification_loss: 0.3075 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7886 - regression_loss: 1.4814 - classification_loss: 0.3072 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.7878 - regression_loss: 1.4806 - classification_loss: 0.3072 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7869 - regression_loss: 1.4802 - classification_loss: 0.3067 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7869 - regression_loss: 1.4802 - classification_loss: 0.3067 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7875 - regression_loss: 1.4809 - classification_loss: 0.3066 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7880 - regression_loss: 1.4792 - classification_loss: 0.3087 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7884 - regression_loss: 1.4797 - classification_loss: 0.3087 212/500 [===========>..................] - ETA: 1:11 - loss: 1.7870 - regression_loss: 1.4789 - classification_loss: 0.3081 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7826 - regression_loss: 1.4754 - classification_loss: 0.3072 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7843 - regression_loss: 1.4767 - classification_loss: 0.3076 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7841 - regression_loss: 1.4767 - classification_loss: 0.3074 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7849 - regression_loss: 1.4775 - classification_loss: 0.3073 217/500 [============>.................] - ETA: 1:10 - loss: 1.7829 - regression_loss: 1.4762 - classification_loss: 0.3067 218/500 [============>.................] - ETA: 1:10 - loss: 1.7808 - regression_loss: 1.4746 - classification_loss: 0.3062 219/500 [============>.................] - ETA: 1:10 - loss: 1.7796 - regression_loss: 1.4738 - classification_loss: 0.3058 220/500 [============>.................] - ETA: 1:09 - loss: 1.7792 - regression_loss: 1.4732 - classification_loss: 0.3060 221/500 [============>.................] - ETA: 1:09 - loss: 1.7824 - regression_loss: 1.4757 - classification_loss: 0.3068 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.7824 - regression_loss: 1.4758 - classification_loss: 0.3066 223/500 [============>.................] - ETA: 1:09 - loss: 1.7834 - regression_loss: 1.4770 - classification_loss: 0.3064 224/500 [============>.................] - ETA: 1:08 - loss: 1.7801 - regression_loss: 1.4747 - classification_loss: 0.3054 225/500 [============>.................] - ETA: 1:08 - loss: 1.7780 - regression_loss: 1.4731 - classification_loss: 0.3049 226/500 [============>.................] - ETA: 1:08 - loss: 1.7765 - regression_loss: 1.4720 - classification_loss: 0.3044 227/500 [============>.................] - ETA: 1:08 - loss: 1.7784 - regression_loss: 1.4735 - classification_loss: 0.3049 228/500 [============>.................] - ETA: 1:07 - loss: 1.7792 - regression_loss: 1.4741 - classification_loss: 0.3050 229/500 [============>.................] - ETA: 1:07 - loss: 1.7807 - regression_loss: 1.4754 - classification_loss: 0.3053 230/500 [============>.................] - ETA: 1:07 - loss: 1.7758 - regression_loss: 1.4714 - classification_loss: 0.3044 231/500 [============>.................] - ETA: 1:07 - loss: 1.7758 - regression_loss: 1.4714 - classification_loss: 0.3044 232/500 [============>.................] - ETA: 1:06 - loss: 1.7719 - regression_loss: 1.4682 - classification_loss: 0.3037 233/500 [============>.................] - ETA: 1:06 - loss: 1.7729 - regression_loss: 1.4690 - classification_loss: 0.3039 234/500 [=============>................] - ETA: 1:06 - loss: 1.7736 - regression_loss: 1.4693 - classification_loss: 0.3043 235/500 [=============>................] - ETA: 1:06 - loss: 1.7738 - regression_loss: 1.4676 - classification_loss: 0.3061 236/500 [=============>................] - ETA: 1:06 - loss: 1.7719 - regression_loss: 1.4660 - classification_loss: 0.3058 237/500 [=============>................] - ETA: 1:05 - loss: 1.7693 - regression_loss: 1.4641 - classification_loss: 0.3052 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.7694 - regression_loss: 1.4643 - classification_loss: 0.3051 239/500 [=============>................] - ETA: 1:05 - loss: 1.7709 - regression_loss: 1.4656 - classification_loss: 0.3053 240/500 [=============>................] - ETA: 1:05 - loss: 1.7723 - regression_loss: 1.4668 - classification_loss: 0.3055 241/500 [=============>................] - ETA: 1:04 - loss: 1.7701 - regression_loss: 1.4649 - classification_loss: 0.3052 242/500 [=============>................] - ETA: 1:04 - loss: 1.7715 - regression_loss: 1.4658 - classification_loss: 0.3057 243/500 [=============>................] - ETA: 1:04 - loss: 1.7688 - regression_loss: 1.4637 - classification_loss: 0.3051 244/500 [=============>................] - ETA: 1:04 - loss: 1.7708 - regression_loss: 1.4654 - classification_loss: 0.3054 245/500 [=============>................] - ETA: 1:03 - loss: 1.7714 - regression_loss: 1.4664 - classification_loss: 0.3051 246/500 [=============>................] - ETA: 1:03 - loss: 1.7707 - regression_loss: 1.4656 - classification_loss: 0.3051 247/500 [=============>................] - ETA: 1:03 - loss: 1.7718 - regression_loss: 1.4662 - classification_loss: 0.3056 248/500 [=============>................] - ETA: 1:03 - loss: 1.7675 - regression_loss: 1.4629 - classification_loss: 0.3047 249/500 [=============>................] - ETA: 1:02 - loss: 1.7682 - regression_loss: 1.4633 - classification_loss: 0.3049 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7687 - regression_loss: 1.4637 - classification_loss: 0.3050 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7690 - regression_loss: 1.4640 - classification_loss: 0.3051 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7700 - regression_loss: 1.4648 - classification_loss: 0.3052 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7699 - regression_loss: 1.4648 - classification_loss: 0.3050 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.7691 - regression_loss: 1.4643 - classification_loss: 0.3048 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7667 - regression_loss: 1.4622 - classification_loss: 0.3045 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7617 - regression_loss: 1.4578 - classification_loss: 0.3040 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7623 - regression_loss: 1.4583 - classification_loss: 0.3040 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7691 - regression_loss: 1.4637 - classification_loss: 0.3054 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7716 - regression_loss: 1.4650 - classification_loss: 0.3066 260/500 [==============>...............] - ETA: 1:00 - loss: 1.7724 - regression_loss: 1.4658 - classification_loss: 0.3066 261/500 [==============>...............] - ETA: 59s - loss: 1.7706 - regression_loss: 1.4643 - classification_loss: 0.3063  262/500 [==============>...............] - ETA: 59s - loss: 1.7725 - regression_loss: 1.4662 - classification_loss: 0.3062 263/500 [==============>...............] - ETA: 59s - loss: 1.7722 - regression_loss: 1.4663 - classification_loss: 0.3059 264/500 [==============>...............] - ETA: 59s - loss: 1.7687 - regression_loss: 1.4634 - classification_loss: 0.3053 265/500 [==============>...............] - ETA: 58s - loss: 1.7707 - regression_loss: 1.4651 - classification_loss: 0.3056 266/500 [==============>...............] - ETA: 58s - loss: 1.7687 - regression_loss: 1.4635 - classification_loss: 0.3052 267/500 [===============>..............] - ETA: 58s - loss: 1.7653 - regression_loss: 1.4611 - classification_loss: 0.3042 268/500 [===============>..............] - ETA: 58s - loss: 1.7637 - regression_loss: 1.4601 - classification_loss: 0.3036 269/500 [===============>..............] - ETA: 57s - loss: 1.7640 - regression_loss: 1.4602 - classification_loss: 0.3038 270/500 [===============>..............] 
- ETA: 57s - loss: 1.7645 - regression_loss: 1.4605 - classification_loss: 0.3040 271/500 [===============>..............] - ETA: 57s - loss: 1.7674 - regression_loss: 1.4622 - classification_loss: 0.3052 272/500 [===============>..............] - ETA: 57s - loss: 1.7680 - regression_loss: 1.4627 - classification_loss: 0.3053 273/500 [===============>..............] - ETA: 56s - loss: 1.7677 - regression_loss: 1.4626 - classification_loss: 0.3051 274/500 [===============>..............] - ETA: 56s - loss: 1.7667 - regression_loss: 1.4610 - classification_loss: 0.3058 275/500 [===============>..............] - ETA: 56s - loss: 1.7665 - regression_loss: 1.4609 - classification_loss: 0.3056 276/500 [===============>..............] - ETA: 55s - loss: 1.7658 - regression_loss: 1.4603 - classification_loss: 0.3055 277/500 [===============>..............] - ETA: 55s - loss: 1.7655 - regression_loss: 1.4604 - classification_loss: 0.3051 278/500 [===============>..............] - ETA: 55s - loss: 1.7659 - regression_loss: 1.4606 - classification_loss: 0.3054 279/500 [===============>..............] - ETA: 55s - loss: 1.7649 - regression_loss: 1.4599 - classification_loss: 0.3051 280/500 [===============>..............] - ETA: 54s - loss: 1.7634 - regression_loss: 1.4587 - classification_loss: 0.3047 281/500 [===============>..............] - ETA: 54s - loss: 1.7622 - regression_loss: 1.4579 - classification_loss: 0.3043 282/500 [===============>..............] - ETA: 54s - loss: 1.7624 - regression_loss: 1.4581 - classification_loss: 0.3043 283/500 [===============>..............] - ETA: 54s - loss: 1.7629 - regression_loss: 1.4583 - classification_loss: 0.3046 284/500 [================>.............] - ETA: 53s - loss: 1.7613 - regression_loss: 1.4568 - classification_loss: 0.3045 285/500 [================>.............] - ETA: 53s - loss: 1.7574 - regression_loss: 1.4537 - classification_loss: 0.3037 286/500 [================>.............] 
- ETA: 53s - loss: 1.7554 - regression_loss: 1.4521 - classification_loss: 0.3033 287/500 [================>.............] - ETA: 53s - loss: 1.7569 - regression_loss: 1.4531 - classification_loss: 0.3038 288/500 [================>.............] - ETA: 52s - loss: 1.7542 - regression_loss: 1.4506 - classification_loss: 0.3036 289/500 [================>.............] - ETA: 52s - loss: 1.7559 - regression_loss: 1.4518 - classification_loss: 0.3042 290/500 [================>.............] - ETA: 52s - loss: 1.7568 - regression_loss: 1.4526 - classification_loss: 0.3042 291/500 [================>.............] - ETA: 51s - loss: 1.7579 - regression_loss: 1.4531 - classification_loss: 0.3048 292/500 [================>.............] - ETA: 51s - loss: 1.7571 - regression_loss: 1.4522 - classification_loss: 0.3048 293/500 [================>.............] - ETA: 51s - loss: 1.7559 - regression_loss: 1.4513 - classification_loss: 0.3046 294/500 [================>.............] - ETA: 51s - loss: 1.7533 - regression_loss: 1.4493 - classification_loss: 0.3041 295/500 [================>.............] - ETA: 50s - loss: 1.7555 - regression_loss: 1.4506 - classification_loss: 0.3048 296/500 [================>.............] - ETA: 50s - loss: 1.7551 - regression_loss: 1.4503 - classification_loss: 0.3049 297/500 [================>.............] - ETA: 50s - loss: 1.7538 - regression_loss: 1.4491 - classification_loss: 0.3046 298/500 [================>.............] - ETA: 50s - loss: 1.7538 - regression_loss: 1.4491 - classification_loss: 0.3048 299/500 [================>.............] - ETA: 49s - loss: 1.7539 - regression_loss: 1.4493 - classification_loss: 0.3047 300/500 [=================>............] - ETA: 49s - loss: 1.7550 - regression_loss: 1.4502 - classification_loss: 0.3048 301/500 [=================>............] - ETA: 49s - loss: 1.7528 - regression_loss: 1.4485 - classification_loss: 0.3043 302/500 [=================>............] 
- ETA: 49s - loss: 1.7521 - regression_loss: 1.4479 - classification_loss: 0.3041 303/500 [=================>............] - ETA: 48s - loss: 1.7493 - regression_loss: 1.4453 - classification_loss: 0.3040 304/500 [=================>............] - ETA: 48s - loss: 1.7510 - regression_loss: 1.4467 - classification_loss: 0.3043 305/500 [=================>............] - ETA: 48s - loss: 1.7516 - regression_loss: 1.4474 - classification_loss: 0.3042 306/500 [=================>............] - ETA: 48s - loss: 1.7508 - regression_loss: 1.4470 - classification_loss: 0.3038 307/500 [=================>............] - ETA: 48s - loss: 1.7523 - regression_loss: 1.4484 - classification_loss: 0.3039 308/500 [=================>............] - ETA: 47s - loss: 1.7537 - regression_loss: 1.4494 - classification_loss: 0.3043 309/500 [=================>............] - ETA: 47s - loss: 1.7549 - regression_loss: 1.4504 - classification_loss: 0.3045 310/500 [=================>............] - ETA: 47s - loss: 1.7543 - regression_loss: 1.4501 - classification_loss: 0.3042 311/500 [=================>............] - ETA: 47s - loss: 1.7551 - regression_loss: 1.4508 - classification_loss: 0.3043 312/500 [=================>............] - ETA: 46s - loss: 1.7577 - regression_loss: 1.4524 - classification_loss: 0.3052 313/500 [=================>............] - ETA: 46s - loss: 1.7574 - regression_loss: 1.4525 - classification_loss: 0.3049 314/500 [=================>............] - ETA: 46s - loss: 1.7567 - regression_loss: 1.4520 - classification_loss: 0.3047 315/500 [=================>............] - ETA: 46s - loss: 1.7555 - regression_loss: 1.4513 - classification_loss: 0.3043 316/500 [=================>............] - ETA: 45s - loss: 1.7576 - regression_loss: 1.4529 - classification_loss: 0.3047 317/500 [==================>...........] - ETA: 45s - loss: 1.7584 - regression_loss: 1.4535 - classification_loss: 0.3048 318/500 [==================>...........] 
- ETA: 45s - loss: 1.7571 - regression_loss: 1.4524 - classification_loss: 0.3047 319/500 [==================>...........] - ETA: 45s - loss: 1.7577 - regression_loss: 1.4529 - classification_loss: 0.3048 320/500 [==================>...........] - ETA: 44s - loss: 1.7578 - regression_loss: 1.4531 - classification_loss: 0.3047 321/500 [==================>...........] - ETA: 44s - loss: 1.7583 - regression_loss: 1.4535 - classification_loss: 0.3048 322/500 [==================>...........] - ETA: 44s - loss: 1.7582 - regression_loss: 1.4534 - classification_loss: 0.3048 323/500 [==================>...........] - ETA: 44s - loss: 1.7598 - regression_loss: 1.4548 - classification_loss: 0.3050 324/500 [==================>...........] - ETA: 43s - loss: 1.7609 - regression_loss: 1.4557 - classification_loss: 0.3052 325/500 [==================>...........] - ETA: 43s - loss: 1.7606 - regression_loss: 1.4556 - classification_loss: 0.3050 326/500 [==================>...........] - ETA: 43s - loss: 1.7620 - regression_loss: 1.4568 - classification_loss: 0.3052 327/500 [==================>...........] - ETA: 43s - loss: 1.7638 - regression_loss: 1.4581 - classification_loss: 0.3057 328/500 [==================>...........] - ETA: 42s - loss: 1.7621 - regression_loss: 1.4569 - classification_loss: 0.3052 329/500 [==================>...........] - ETA: 42s - loss: 1.7630 - regression_loss: 1.4577 - classification_loss: 0.3053 330/500 [==================>...........] - ETA: 42s - loss: 1.7624 - regression_loss: 1.4573 - classification_loss: 0.3051 331/500 [==================>...........] - ETA: 42s - loss: 1.7632 - regression_loss: 1.4581 - classification_loss: 0.3051 332/500 [==================>...........] - ETA: 41s - loss: 1.7647 - regression_loss: 1.4593 - classification_loss: 0.3054 333/500 [==================>...........] - ETA: 41s - loss: 1.7641 - regression_loss: 1.4590 - classification_loss: 0.3052 334/500 [===================>..........] 
- ETA: 41s - loss: 1.7654 - regression_loss: 1.4599 - classification_loss: 0.3055 335/500 [===================>..........] - ETA: 41s - loss: 1.7653 - regression_loss: 1.4598 - classification_loss: 0.3055 336/500 [===================>..........] - ETA: 40s - loss: 1.7664 - regression_loss: 1.4607 - classification_loss: 0.3057 337/500 [===================>..........] - ETA: 40s - loss: 1.7647 - regression_loss: 1.4593 - classification_loss: 0.3055 338/500 [===================>..........] - ETA: 40s - loss: 1.7650 - regression_loss: 1.4594 - classification_loss: 0.3055 339/500 [===================>..........] - ETA: 40s - loss: 1.7658 - regression_loss: 1.4605 - classification_loss: 0.3054 340/500 [===================>..........] - ETA: 39s - loss: 1.7661 - regression_loss: 1.4607 - classification_loss: 0.3053 341/500 [===================>..........] - ETA: 39s - loss: 1.7664 - regression_loss: 1.4613 - classification_loss: 0.3051 342/500 [===================>..........] - ETA: 39s - loss: 1.7656 - regression_loss: 1.4608 - classification_loss: 0.3049 343/500 [===================>..........] - ETA: 39s - loss: 1.7661 - regression_loss: 1.4610 - classification_loss: 0.3050 344/500 [===================>..........] - ETA: 38s - loss: 1.7687 - regression_loss: 1.4634 - classification_loss: 0.3053 345/500 [===================>..........] - ETA: 38s - loss: 1.7698 - regression_loss: 1.4644 - classification_loss: 0.3054 346/500 [===================>..........] - ETA: 38s - loss: 1.7703 - regression_loss: 1.4649 - classification_loss: 0.3054 347/500 [===================>..........] - ETA: 38s - loss: 1.7692 - regression_loss: 1.4641 - classification_loss: 0.3051 348/500 [===================>..........] - ETA: 37s - loss: 1.7678 - regression_loss: 1.4629 - classification_loss: 0.3048 349/500 [===================>..........] - ETA: 37s - loss: 1.7682 - regression_loss: 1.4633 - classification_loss: 0.3049 350/500 [====================>.........] 
- ETA: 37s - loss: 1.7666 - regression_loss: 1.4617 - classification_loss: 0.3048 351/500 [====================>.........] - ETA: 37s - loss: 1.7658 - regression_loss: 1.4609 - classification_loss: 0.3049 352/500 [====================>.........] - ETA: 36s - loss: 1.7722 - regression_loss: 1.4651 - classification_loss: 0.3071 353/500 [====================>.........] - ETA: 36s - loss: 1.7707 - regression_loss: 1.4639 - classification_loss: 0.3068 354/500 [====================>.........] - ETA: 36s - loss: 1.7716 - regression_loss: 1.4649 - classification_loss: 0.3068 355/500 [====================>.........] - ETA: 36s - loss: 1.7705 - regression_loss: 1.4644 - classification_loss: 0.3061 356/500 [====================>.........] - ETA: 35s - loss: 1.7710 - regression_loss: 1.4649 - classification_loss: 0.3061 357/500 [====================>.........] - ETA: 35s - loss: 1.7683 - regression_loss: 1.4626 - classification_loss: 0.3056 358/500 [====================>.........] - ETA: 35s - loss: 1.7663 - regression_loss: 1.4609 - classification_loss: 0.3054 359/500 [====================>.........] - ETA: 35s - loss: 1.7665 - regression_loss: 1.4610 - classification_loss: 0.3055 360/500 [====================>.........] - ETA: 34s - loss: 1.7666 - regression_loss: 1.4608 - classification_loss: 0.3058 361/500 [====================>.........] - ETA: 34s - loss: 1.7662 - regression_loss: 1.4606 - classification_loss: 0.3056 362/500 [====================>.........] - ETA: 34s - loss: 1.7670 - regression_loss: 1.4612 - classification_loss: 0.3059 363/500 [====================>.........] - ETA: 34s - loss: 1.7689 - regression_loss: 1.4626 - classification_loss: 0.3063 364/500 [====================>.........] - ETA: 33s - loss: 1.7688 - regression_loss: 1.4625 - classification_loss: 0.3063 365/500 [====================>.........] - ETA: 33s - loss: 1.7693 - regression_loss: 1.4630 - classification_loss: 0.3063 366/500 [====================>.........] 
- ETA: 33s - loss: 1.7680 - regression_loss: 1.4617 - classification_loss: 0.3063 367/500 [=====================>........] - ETA: 33s - loss: 1.7691 - regression_loss: 1.4625 - classification_loss: 0.3066 368/500 [=====================>........] - ETA: 32s - loss: 1.7688 - regression_loss: 1.4623 - classification_loss: 0.3064 369/500 [=====================>........] - ETA: 32s - loss: 1.7695 - regression_loss: 1.4631 - classification_loss: 0.3064 370/500 [=====================>........] - ETA: 32s - loss: 1.7693 - regression_loss: 1.4627 - classification_loss: 0.3066 371/500 [=====================>........] - ETA: 32s - loss: 1.7675 - regression_loss: 1.4609 - classification_loss: 0.3066 372/500 [=====================>........] - ETA: 31s - loss: 1.7685 - regression_loss: 1.4609 - classification_loss: 0.3077 373/500 [=====================>........] - ETA: 31s - loss: 1.7677 - regression_loss: 1.4600 - classification_loss: 0.3076 374/500 [=====================>........] - ETA: 31s - loss: 1.7674 - regression_loss: 1.4600 - classification_loss: 0.3075 375/500 [=====================>........] - ETA: 31s - loss: 1.7689 - regression_loss: 1.4611 - classification_loss: 0.3078 376/500 [=====================>........] - ETA: 30s - loss: 1.7688 - regression_loss: 1.4605 - classification_loss: 0.3084 377/500 [=====================>........] - ETA: 30s - loss: 1.7694 - regression_loss: 1.4610 - classification_loss: 0.3084 378/500 [=====================>........] - ETA: 30s - loss: 1.7699 - regression_loss: 1.4614 - classification_loss: 0.3085 379/500 [=====================>........] - ETA: 30s - loss: 1.7692 - regression_loss: 1.4608 - classification_loss: 0.3084 380/500 [=====================>........] - ETA: 29s - loss: 1.7693 - regression_loss: 1.4609 - classification_loss: 0.3084 381/500 [=====================>........] - ETA: 29s - loss: 1.7685 - regression_loss: 1.4601 - classification_loss: 0.3084 382/500 [=====================>........] 
- ETA: 29s - loss: 1.7662 - regression_loss: 1.4582 - classification_loss: 0.3080 383/500 [=====================>........] - ETA: 29s - loss: 1.7679 - regression_loss: 1.4598 - classification_loss: 0.3081 384/500 [======================>.......] - ETA: 28s - loss: 1.7685 - regression_loss: 1.4605 - classification_loss: 0.3080 385/500 [======================>.......] - ETA: 28s - loss: 1.7687 - regression_loss: 1.4598 - classification_loss: 0.3088 386/500 [======================>.......] - ETA: 28s - loss: 1.7699 - regression_loss: 1.4605 - classification_loss: 0.3095 387/500 [======================>.......] - ETA: 28s - loss: 1.7691 - regression_loss: 1.4598 - classification_loss: 0.3093 388/500 [======================>.......] - ETA: 27s - loss: 1.7693 - regression_loss: 1.4601 - classification_loss: 0.3092 389/500 [======================>.......] - ETA: 27s - loss: 1.7694 - regression_loss: 1.4600 - classification_loss: 0.3094 390/500 [======================>.......] - ETA: 27s - loss: 1.7705 - regression_loss: 1.4605 - classification_loss: 0.3100 391/500 [======================>.......] - ETA: 27s - loss: 1.7710 - regression_loss: 1.4611 - classification_loss: 0.3099 392/500 [======================>.......] - ETA: 26s - loss: 1.7709 - regression_loss: 1.4610 - classification_loss: 0.3098 393/500 [======================>.......] - ETA: 26s - loss: 1.7693 - regression_loss: 1.4599 - classification_loss: 0.3094 394/500 [======================>.......] - ETA: 26s - loss: 1.7700 - regression_loss: 1.4605 - classification_loss: 0.3095 395/500 [======================>.......] - ETA: 26s - loss: 1.7704 - regression_loss: 1.4608 - classification_loss: 0.3096 396/500 [======================>.......] - ETA: 25s - loss: 1.7712 - regression_loss: 1.4616 - classification_loss: 0.3096 397/500 [======================>.......] - ETA: 25s - loss: 1.7727 - regression_loss: 1.4627 - classification_loss: 0.3100 398/500 [======================>.......] 
- ETA: 25s - loss: 1.7726 - regression_loss: 1.4628 - classification_loss: 0.3098 399/500 [======================>.......] - ETA: 25s - loss: 1.7715 - regression_loss: 1.4619 - classification_loss: 0.3096 400/500 [=======================>......] - ETA: 24s - loss: 1.7728 - regression_loss: 1.4625 - classification_loss: 0.3103 401/500 [=======================>......] - ETA: 24s - loss: 1.7716 - regression_loss: 1.4612 - classification_loss: 0.3104 402/500 [=======================>......] - ETA: 24s - loss: 1.7694 - regression_loss: 1.4595 - classification_loss: 0.3099 403/500 [=======================>......] - ETA: 24s - loss: 1.7694 - regression_loss: 1.4596 - classification_loss: 0.3097 404/500 [=======================>......] - ETA: 23s - loss: 1.7698 - regression_loss: 1.4596 - classification_loss: 0.3102 405/500 [=======================>......] - ETA: 23s - loss: 1.7710 - regression_loss: 1.4605 - classification_loss: 0.3106 406/500 [=======================>......] - ETA: 23s - loss: 1.7701 - regression_loss: 1.4598 - classification_loss: 0.3103 407/500 [=======================>......] - ETA: 23s - loss: 1.7723 - regression_loss: 1.4614 - classification_loss: 0.3109 408/500 [=======================>......] - ETA: 22s - loss: 1.7732 - regression_loss: 1.4621 - classification_loss: 0.3111 409/500 [=======================>......] - ETA: 22s - loss: 1.7731 - regression_loss: 1.4621 - classification_loss: 0.3110 410/500 [=======================>......] - ETA: 22s - loss: 1.7727 - regression_loss: 1.4619 - classification_loss: 0.3108 411/500 [=======================>......] - ETA: 22s - loss: 1.7737 - regression_loss: 1.4629 - classification_loss: 0.3108 412/500 [=======================>......] - ETA: 21s - loss: 1.7738 - regression_loss: 1.4630 - classification_loss: 0.3107 413/500 [=======================>......] - ETA: 21s - loss: 1.7740 - regression_loss: 1.4633 - classification_loss: 0.3107 414/500 [=======================>......] 
[per-batch progress updates for batches 415-499 of epoch 64 elided]
500/500 [==============================] - 125s 249ms/step - loss: 1.7643 - regression_loss: 1.4565 - classification_loss: 0.3078
1172 instances of class plum with average precision: 0.6040
mAP: 0.6040
Epoch 00064: saving model to ./training/snapshots/resnet50_pascal_64.h5
Epoch 65/150
[per-batch progress updates for batches 1-248 of epoch 65 elided]
249/500 [=============>................]
- ETA: 1:02 - loss: 1.7763 - regression_loss: 1.4600 - classification_loss: 0.3162 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7736 - regression_loss: 1.4580 - classification_loss: 0.3156 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7758 - regression_loss: 1.4597 - classification_loss: 0.3161 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7734 - regression_loss: 1.4581 - classification_loss: 0.3153 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7731 - regression_loss: 1.4580 - classification_loss: 0.3151 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7709 - regression_loss: 1.4563 - classification_loss: 0.3146 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7719 - regression_loss: 1.4573 - classification_loss: 0.3146 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7732 - regression_loss: 1.4581 - classification_loss: 0.3150 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7733 - regression_loss: 1.4584 - classification_loss: 0.3149 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7754 - regression_loss: 1.4600 - classification_loss: 0.3154 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7723 - regression_loss: 1.4571 - classification_loss: 0.3152 260/500 [==============>...............] - ETA: 59s - loss: 1.7734 - regression_loss: 1.4579 - classification_loss: 0.3155  261/500 [==============>...............] - ETA: 59s - loss: 1.7748 - regression_loss: 1.4592 - classification_loss: 0.3156 262/500 [==============>...............] - ETA: 59s - loss: 1.7762 - regression_loss: 1.4604 - classification_loss: 0.3158 263/500 [==============>...............] - ETA: 59s - loss: 1.7781 - regression_loss: 1.4620 - classification_loss: 0.3161 264/500 [==============>...............] - ETA: 58s - loss: 1.7779 - regression_loss: 1.4626 - classification_loss: 0.3153 265/500 [==============>...............] 
- ETA: 58s - loss: 1.7782 - regression_loss: 1.4629 - classification_loss: 0.3153 266/500 [==============>...............] - ETA: 58s - loss: 1.7795 - regression_loss: 1.4637 - classification_loss: 0.3158 267/500 [===============>..............] - ETA: 58s - loss: 1.7791 - regression_loss: 1.4632 - classification_loss: 0.3158 268/500 [===============>..............] - ETA: 57s - loss: 1.7768 - regression_loss: 1.4613 - classification_loss: 0.3155 269/500 [===============>..............] - ETA: 57s - loss: 1.7746 - regression_loss: 1.4593 - classification_loss: 0.3153 270/500 [===============>..............] - ETA: 57s - loss: 1.7741 - regression_loss: 1.4591 - classification_loss: 0.3150 271/500 [===============>..............] - ETA: 57s - loss: 1.7742 - regression_loss: 1.4596 - classification_loss: 0.3146 272/500 [===============>..............] - ETA: 56s - loss: 1.7743 - regression_loss: 1.4597 - classification_loss: 0.3145 273/500 [===============>..............] - ETA: 56s - loss: 1.7755 - regression_loss: 1.4606 - classification_loss: 0.3149 274/500 [===============>..............] - ETA: 56s - loss: 1.7771 - regression_loss: 1.4620 - classification_loss: 0.3151 275/500 [===============>..............] - ETA: 56s - loss: 1.7773 - regression_loss: 1.4624 - classification_loss: 0.3149 276/500 [===============>..............] - ETA: 55s - loss: 1.7764 - regression_loss: 1.4618 - classification_loss: 0.3146 277/500 [===============>..............] - ETA: 55s - loss: 1.7764 - regression_loss: 1.4619 - classification_loss: 0.3145 278/500 [===============>..............] - ETA: 55s - loss: 1.7749 - regression_loss: 1.4607 - classification_loss: 0.3142 279/500 [===============>..............] - ETA: 55s - loss: 1.7736 - regression_loss: 1.4600 - classification_loss: 0.3137 280/500 [===============>..............] - ETA: 54s - loss: 1.7731 - regression_loss: 1.4598 - classification_loss: 0.3133 281/500 [===============>..............] 
- ETA: 54s - loss: 1.7739 - regression_loss: 1.4608 - classification_loss: 0.3131 282/500 [===============>..............] - ETA: 54s - loss: 1.7706 - regression_loss: 1.4578 - classification_loss: 0.3128 283/500 [===============>..............] - ETA: 54s - loss: 1.7679 - regression_loss: 1.4559 - classification_loss: 0.3120 284/500 [================>.............] - ETA: 53s - loss: 1.7667 - regression_loss: 1.4550 - classification_loss: 0.3118 285/500 [================>.............] - ETA: 53s - loss: 1.7666 - regression_loss: 1.4550 - classification_loss: 0.3116 286/500 [================>.............] - ETA: 53s - loss: 1.7673 - regression_loss: 1.4558 - classification_loss: 0.3115 287/500 [================>.............] - ETA: 53s - loss: 1.7674 - regression_loss: 1.4562 - classification_loss: 0.3112 288/500 [================>.............] - ETA: 52s - loss: 1.7680 - regression_loss: 1.4567 - classification_loss: 0.3114 289/500 [================>.............] - ETA: 52s - loss: 1.7674 - regression_loss: 1.4563 - classification_loss: 0.3111 290/500 [================>.............] - ETA: 52s - loss: 1.7663 - regression_loss: 1.4554 - classification_loss: 0.3109 291/500 [================>.............] - ETA: 52s - loss: 1.7679 - regression_loss: 1.4566 - classification_loss: 0.3113 292/500 [================>.............] - ETA: 51s - loss: 1.7696 - regression_loss: 1.4579 - classification_loss: 0.3117 293/500 [================>.............] - ETA: 51s - loss: 1.7685 - regression_loss: 1.4571 - classification_loss: 0.3114 294/500 [================>.............] - ETA: 51s - loss: 1.7666 - regression_loss: 1.4556 - classification_loss: 0.3110 295/500 [================>.............] - ETA: 51s - loss: 1.7676 - regression_loss: 1.4565 - classification_loss: 0.3111 296/500 [================>.............] - ETA: 50s - loss: 1.7678 - regression_loss: 1.4565 - classification_loss: 0.3113 297/500 [================>.............] 
- ETA: 50s - loss: 1.7688 - regression_loss: 1.4574 - classification_loss: 0.3114 298/500 [================>.............] - ETA: 50s - loss: 1.7695 - regression_loss: 1.4580 - classification_loss: 0.3115 299/500 [================>.............] - ETA: 50s - loss: 1.7697 - regression_loss: 1.4580 - classification_loss: 0.3117 300/500 [=================>............] - ETA: 49s - loss: 1.7695 - regression_loss: 1.4580 - classification_loss: 0.3114 301/500 [=================>............] - ETA: 49s - loss: 1.7672 - regression_loss: 1.4564 - classification_loss: 0.3108 302/500 [=================>............] - ETA: 49s - loss: 1.7680 - regression_loss: 1.4573 - classification_loss: 0.3107 303/500 [=================>............] - ETA: 49s - loss: 1.7674 - regression_loss: 1.4568 - classification_loss: 0.3106 304/500 [=================>............] - ETA: 48s - loss: 1.7655 - regression_loss: 1.4552 - classification_loss: 0.3103 305/500 [=================>............] - ETA: 48s - loss: 1.7657 - regression_loss: 1.4553 - classification_loss: 0.3104 306/500 [=================>............] - ETA: 48s - loss: 1.7652 - regression_loss: 1.4550 - classification_loss: 0.3101 307/500 [=================>............] - ETA: 48s - loss: 1.7633 - regression_loss: 1.4533 - classification_loss: 0.3100 308/500 [=================>............] - ETA: 47s - loss: 1.7615 - regression_loss: 1.4519 - classification_loss: 0.3097 309/500 [=================>............] - ETA: 47s - loss: 1.7633 - regression_loss: 1.4533 - classification_loss: 0.3101 310/500 [=================>............] - ETA: 47s - loss: 1.7638 - regression_loss: 1.4536 - classification_loss: 0.3102 311/500 [=================>............] - ETA: 47s - loss: 1.7632 - regression_loss: 1.4533 - classification_loss: 0.3098 312/500 [=================>............] - ETA: 47s - loss: 1.7640 - regression_loss: 1.4542 - classification_loss: 0.3098 313/500 [=================>............] 
- ETA: 46s - loss: 1.7623 - regression_loss: 1.4523 - classification_loss: 0.3100 314/500 [=================>............] - ETA: 46s - loss: 1.7607 - regression_loss: 1.4512 - classification_loss: 0.3095 315/500 [=================>............] - ETA: 46s - loss: 1.7603 - regression_loss: 1.4510 - classification_loss: 0.3093 316/500 [=================>............] - ETA: 46s - loss: 1.7571 - regression_loss: 1.4483 - classification_loss: 0.3088 317/500 [==================>...........] - ETA: 45s - loss: 1.7577 - regression_loss: 1.4491 - classification_loss: 0.3087 318/500 [==================>...........] - ETA: 45s - loss: 1.7549 - regression_loss: 1.4469 - classification_loss: 0.3080 319/500 [==================>...........] - ETA: 45s - loss: 1.7539 - regression_loss: 1.4460 - classification_loss: 0.3079 320/500 [==================>...........] - ETA: 45s - loss: 1.7547 - regression_loss: 1.4463 - classification_loss: 0.3084 321/500 [==================>...........] - ETA: 44s - loss: 1.7553 - regression_loss: 1.4470 - classification_loss: 0.3083 322/500 [==================>...........] - ETA: 44s - loss: 1.7544 - regression_loss: 1.4465 - classification_loss: 0.3079 323/500 [==================>...........] - ETA: 44s - loss: 1.7553 - regression_loss: 1.4475 - classification_loss: 0.3078 324/500 [==================>...........] - ETA: 44s - loss: 1.7541 - regression_loss: 1.4466 - classification_loss: 0.3074 325/500 [==================>...........] - ETA: 43s - loss: 1.7560 - regression_loss: 1.4484 - classification_loss: 0.3076 326/500 [==================>...........] - ETA: 43s - loss: 1.7550 - regression_loss: 1.4480 - classification_loss: 0.3070 327/500 [==================>...........] - ETA: 43s - loss: 1.7561 - regression_loss: 1.4490 - classification_loss: 0.3071 328/500 [==================>...........] - ETA: 43s - loss: 1.7571 - regression_loss: 1.4498 - classification_loss: 0.3073 329/500 [==================>...........] 
- ETA: 42s - loss: 1.7584 - regression_loss: 1.4507 - classification_loss: 0.3076 330/500 [==================>...........] - ETA: 42s - loss: 1.7600 - regression_loss: 1.4519 - classification_loss: 0.3080 331/500 [==================>...........] - ETA: 42s - loss: 1.7584 - regression_loss: 1.4507 - classification_loss: 0.3078 332/500 [==================>...........] - ETA: 42s - loss: 1.7598 - regression_loss: 1.4520 - classification_loss: 0.3078 333/500 [==================>...........] - ETA: 41s - loss: 1.7611 - regression_loss: 1.4531 - classification_loss: 0.3080 334/500 [===================>..........] - ETA: 41s - loss: 1.7614 - regression_loss: 1.4534 - classification_loss: 0.3079 335/500 [===================>..........] - ETA: 41s - loss: 1.7615 - regression_loss: 1.4533 - classification_loss: 0.3082 336/500 [===================>..........] - ETA: 41s - loss: 1.7608 - regression_loss: 1.4526 - classification_loss: 0.3083 337/500 [===================>..........] - ETA: 40s - loss: 1.7614 - regression_loss: 1.4529 - classification_loss: 0.3084 338/500 [===================>..........] - ETA: 40s - loss: 1.7620 - regression_loss: 1.4535 - classification_loss: 0.3085 339/500 [===================>..........] - ETA: 40s - loss: 1.7635 - regression_loss: 1.4548 - classification_loss: 0.3087 340/500 [===================>..........] - ETA: 40s - loss: 1.7623 - regression_loss: 1.4534 - classification_loss: 0.3089 341/500 [===================>..........] - ETA: 39s - loss: 1.7633 - regression_loss: 1.4541 - classification_loss: 0.3092 342/500 [===================>..........] - ETA: 39s - loss: 1.7623 - regression_loss: 1.4535 - classification_loss: 0.3088 343/500 [===================>..........] - ETA: 39s - loss: 1.7614 - regression_loss: 1.4528 - classification_loss: 0.3086 344/500 [===================>..........] - ETA: 39s - loss: 1.7606 - regression_loss: 1.4521 - classification_loss: 0.3085 345/500 [===================>..........] 
- ETA: 38s - loss: 1.7620 - regression_loss: 1.4533 - classification_loss: 0.3087 346/500 [===================>..........] - ETA: 38s - loss: 1.7616 - regression_loss: 1.4531 - classification_loss: 0.3086 347/500 [===================>..........] - ETA: 38s - loss: 1.7625 - regression_loss: 1.4538 - classification_loss: 0.3087 348/500 [===================>..........] - ETA: 38s - loss: 1.7618 - regression_loss: 1.4532 - classification_loss: 0.3086 349/500 [===================>..........] - ETA: 37s - loss: 1.7635 - regression_loss: 1.4546 - classification_loss: 0.3088 350/500 [====================>.........] - ETA: 37s - loss: 1.7634 - regression_loss: 1.4548 - classification_loss: 0.3086 351/500 [====================>.........] - ETA: 37s - loss: 1.7646 - regression_loss: 1.4557 - classification_loss: 0.3090 352/500 [====================>.........] - ETA: 37s - loss: 1.7648 - regression_loss: 1.4560 - classification_loss: 0.3088 353/500 [====================>.........] - ETA: 36s - loss: 1.7665 - regression_loss: 1.4574 - classification_loss: 0.3091 354/500 [====================>.........] - ETA: 36s - loss: 1.7661 - regression_loss: 1.4571 - classification_loss: 0.3090 355/500 [====================>.........] - ETA: 36s - loss: 1.7660 - regression_loss: 1.4572 - classification_loss: 0.3088 356/500 [====================>.........] - ETA: 36s - loss: 1.7657 - regression_loss: 1.4573 - classification_loss: 0.3085 357/500 [====================>.........] - ETA: 35s - loss: 1.7655 - regression_loss: 1.4574 - classification_loss: 0.3082 358/500 [====================>.........] - ETA: 35s - loss: 1.7638 - regression_loss: 1.4561 - classification_loss: 0.3077 359/500 [====================>.........] - ETA: 35s - loss: 1.7634 - regression_loss: 1.4557 - classification_loss: 0.3077 360/500 [====================>.........] - ETA: 34s - loss: 1.7643 - regression_loss: 1.4564 - classification_loss: 0.3079 361/500 [====================>.........] 
- ETA: 34s - loss: 1.7642 - regression_loss: 1.4565 - classification_loss: 0.3078 362/500 [====================>.........] - ETA: 34s - loss: 1.7641 - regression_loss: 1.4564 - classification_loss: 0.3077 363/500 [====================>.........] - ETA: 34s - loss: 1.7647 - regression_loss: 1.4569 - classification_loss: 0.3078 364/500 [====================>.........] - ETA: 34s - loss: 1.7653 - regression_loss: 1.4574 - classification_loss: 0.3079 365/500 [====================>.........] - ETA: 33s - loss: 1.7647 - regression_loss: 1.4570 - classification_loss: 0.3077 366/500 [====================>.........] - ETA: 33s - loss: 1.7659 - regression_loss: 1.4578 - classification_loss: 0.3081 367/500 [=====================>........] - ETA: 33s - loss: 1.7659 - regression_loss: 1.4579 - classification_loss: 0.3081 368/500 [=====================>........] - ETA: 33s - loss: 1.7665 - regression_loss: 1.4586 - classification_loss: 0.3080 369/500 [=====================>........] - ETA: 32s - loss: 1.7645 - regression_loss: 1.4570 - classification_loss: 0.3076 370/500 [=====================>........] - ETA: 32s - loss: 1.7665 - regression_loss: 1.4581 - classification_loss: 0.3083 371/500 [=====================>........] - ETA: 32s - loss: 1.7663 - regression_loss: 1.4580 - classification_loss: 0.3083 372/500 [=====================>........] - ETA: 32s - loss: 1.7649 - regression_loss: 1.4568 - classification_loss: 0.3081 373/500 [=====================>........] - ETA: 31s - loss: 1.7632 - regression_loss: 1.4556 - classification_loss: 0.3076 374/500 [=====================>........] - ETA: 31s - loss: 1.7641 - regression_loss: 1.4560 - classification_loss: 0.3082 375/500 [=====================>........] - ETA: 31s - loss: 1.7651 - regression_loss: 1.4567 - classification_loss: 0.3083 376/500 [=====================>........] - ETA: 31s - loss: 1.7654 - regression_loss: 1.4572 - classification_loss: 0.3082 377/500 [=====================>........] 
- ETA: 30s - loss: 1.7657 - regression_loss: 1.4574 - classification_loss: 0.3083 378/500 [=====================>........] - ETA: 30s - loss: 1.7660 - regression_loss: 1.4579 - classification_loss: 0.3082 379/500 [=====================>........] - ETA: 30s - loss: 1.7666 - regression_loss: 1.4581 - classification_loss: 0.3084 380/500 [=====================>........] - ETA: 30s - loss: 1.7688 - regression_loss: 1.4601 - classification_loss: 0.3087 381/500 [=====================>........] - ETA: 29s - loss: 1.7689 - regression_loss: 1.4601 - classification_loss: 0.3088 382/500 [=====================>........] - ETA: 29s - loss: 1.7693 - regression_loss: 1.4604 - classification_loss: 0.3089 383/500 [=====================>........] - ETA: 29s - loss: 1.7685 - regression_loss: 1.4597 - classification_loss: 0.3087 384/500 [======================>.......] - ETA: 28s - loss: 1.7694 - regression_loss: 1.4602 - classification_loss: 0.3091 385/500 [======================>.......] - ETA: 28s - loss: 1.7704 - regression_loss: 1.4610 - classification_loss: 0.3094 386/500 [======================>.......] - ETA: 28s - loss: 1.7707 - regression_loss: 1.4612 - classification_loss: 0.3095 387/500 [======================>.......] - ETA: 28s - loss: 1.7685 - regression_loss: 1.4594 - classification_loss: 0.3091 388/500 [======================>.......] - ETA: 27s - loss: 1.7693 - regression_loss: 1.4603 - classification_loss: 0.3089 389/500 [======================>.......] - ETA: 27s - loss: 1.7704 - regression_loss: 1.4612 - classification_loss: 0.3092 390/500 [======================>.......] - ETA: 27s - loss: 1.7682 - regression_loss: 1.4594 - classification_loss: 0.3087 391/500 [======================>.......] - ETA: 27s - loss: 1.7690 - regression_loss: 1.4602 - classification_loss: 0.3088 392/500 [======================>.......] - ETA: 26s - loss: 1.7681 - regression_loss: 1.4595 - classification_loss: 0.3086 393/500 [======================>.......] 
- ETA: 26s - loss: 1.7690 - regression_loss: 1.4601 - classification_loss: 0.3089 394/500 [======================>.......] - ETA: 26s - loss: 1.7691 - regression_loss: 1.4603 - classification_loss: 0.3087 395/500 [======================>.......] - ETA: 26s - loss: 1.7697 - regression_loss: 1.4609 - classification_loss: 0.3088 396/500 [======================>.......] - ETA: 25s - loss: 1.7695 - regression_loss: 1.4607 - classification_loss: 0.3088 397/500 [======================>.......] - ETA: 25s - loss: 1.7668 - regression_loss: 1.4586 - classification_loss: 0.3082 398/500 [======================>.......] - ETA: 25s - loss: 1.7666 - regression_loss: 1.4585 - classification_loss: 0.3080 399/500 [======================>.......] - ETA: 25s - loss: 1.7669 - regression_loss: 1.4587 - classification_loss: 0.3081 400/500 [=======================>......] - ETA: 24s - loss: 1.7661 - regression_loss: 1.4580 - classification_loss: 0.3081 401/500 [=======================>......] - ETA: 24s - loss: 1.7645 - regression_loss: 1.4567 - classification_loss: 0.3078 402/500 [=======================>......] - ETA: 24s - loss: 1.7646 - regression_loss: 1.4568 - classification_loss: 0.3077 403/500 [=======================>......] - ETA: 24s - loss: 1.7631 - regression_loss: 1.4558 - classification_loss: 0.3074 404/500 [=======================>......] - ETA: 23s - loss: 1.7624 - regression_loss: 1.4553 - classification_loss: 0.3071 405/500 [=======================>......] - ETA: 23s - loss: 1.7608 - regression_loss: 1.4539 - classification_loss: 0.3069 406/500 [=======================>......] - ETA: 23s - loss: 1.7602 - regression_loss: 1.4534 - classification_loss: 0.3067 407/500 [=======================>......] - ETA: 23s - loss: 1.7598 - regression_loss: 1.4534 - classification_loss: 0.3064 408/500 [=======================>......] - ETA: 22s - loss: 1.7597 - regression_loss: 1.4534 - classification_loss: 0.3063 409/500 [=======================>......] 
- ETA: 22s - loss: 1.7608 - regression_loss: 1.4541 - classification_loss: 0.3066 410/500 [=======================>......] - ETA: 22s - loss: 1.7604 - regression_loss: 1.4538 - classification_loss: 0.3066 411/500 [=======================>......] - ETA: 22s - loss: 1.7615 - regression_loss: 1.4546 - classification_loss: 0.3069 412/500 [=======================>......] - ETA: 21s - loss: 1.7633 - regression_loss: 1.4561 - classification_loss: 0.3073 413/500 [=======================>......] - ETA: 21s - loss: 1.7627 - regression_loss: 1.4557 - classification_loss: 0.3070 414/500 [=======================>......] - ETA: 21s - loss: 1.7630 - regression_loss: 1.4560 - classification_loss: 0.3070 415/500 [=======================>......] - ETA: 21s - loss: 1.7644 - regression_loss: 1.4572 - classification_loss: 0.3072 416/500 [=======================>......] - ETA: 20s - loss: 1.7647 - regression_loss: 1.4575 - classification_loss: 0.3072 417/500 [========================>.....] - ETA: 20s - loss: 1.7649 - regression_loss: 1.4576 - classification_loss: 0.3074 418/500 [========================>.....] - ETA: 20s - loss: 1.7654 - regression_loss: 1.4578 - classification_loss: 0.3075 419/500 [========================>.....] - ETA: 20s - loss: 1.7666 - regression_loss: 1.4588 - classification_loss: 0.3078 420/500 [========================>.....] - ETA: 19s - loss: 1.7663 - regression_loss: 1.4586 - classification_loss: 0.3077 421/500 [========================>.....] - ETA: 19s - loss: 1.7662 - regression_loss: 1.4584 - classification_loss: 0.3079 422/500 [========================>.....] - ETA: 19s - loss: 1.7673 - regression_loss: 1.4591 - classification_loss: 0.3082 423/500 [========================>.....] - ETA: 19s - loss: 1.7675 - regression_loss: 1.4592 - classification_loss: 0.3083 424/500 [========================>.....] - ETA: 18s - loss: 1.7675 - regression_loss: 1.4593 - classification_loss: 0.3082 425/500 [========================>.....] 
- ETA: 18s - loss: 1.7668 - regression_loss: 1.4585 - classification_loss: 0.3082 426/500 [========================>.....] - ETA: 18s - loss: 1.7680 - regression_loss: 1.4595 - classification_loss: 0.3085 427/500 [========================>.....] - ETA: 18s - loss: 1.7678 - regression_loss: 1.4594 - classification_loss: 0.3084 428/500 [========================>.....] - ETA: 17s - loss: 1.7679 - regression_loss: 1.4595 - classification_loss: 0.3084 429/500 [========================>.....] - ETA: 17s - loss: 1.7677 - regression_loss: 1.4594 - classification_loss: 0.3083 430/500 [========================>.....] - ETA: 17s - loss: 1.7686 - regression_loss: 1.4602 - classification_loss: 0.3084 431/500 [========================>.....] - ETA: 17s - loss: 1.7694 - regression_loss: 1.4609 - classification_loss: 0.3085 432/500 [========================>.....] - ETA: 16s - loss: 1.7672 - regression_loss: 1.4588 - classification_loss: 0.3083 433/500 [========================>.....] - ETA: 16s - loss: 1.7661 - regression_loss: 1.4579 - classification_loss: 0.3082 434/500 [=========================>....] - ETA: 16s - loss: 1.7651 - regression_loss: 1.4569 - classification_loss: 0.3081 435/500 [=========================>....] - ETA: 16s - loss: 1.7659 - regression_loss: 1.4576 - classification_loss: 0.3084 436/500 [=========================>....] - ETA: 15s - loss: 1.7650 - regression_loss: 1.4568 - classification_loss: 0.3081 437/500 [=========================>....] - ETA: 15s - loss: 1.7655 - regression_loss: 1.4573 - classification_loss: 0.3081 438/500 [=========================>....] - ETA: 15s - loss: 1.7656 - regression_loss: 1.4575 - classification_loss: 0.3081 439/500 [=========================>....] - ETA: 15s - loss: 1.7656 - regression_loss: 1.4576 - classification_loss: 0.3081 440/500 [=========================>....] - ETA: 14s - loss: 1.7646 - regression_loss: 1.4568 - classification_loss: 0.3077 441/500 [=========================>....] 
- ETA: 14s - loss: 1.7648 - regression_loss: 1.4571 - classification_loss: 0.3077 442/500 [=========================>....] - ETA: 14s - loss: 1.7648 - regression_loss: 1.4571 - classification_loss: 0.3076 443/500 [=========================>....] - ETA: 14s - loss: 1.7655 - regression_loss: 1.4578 - classification_loss: 0.3078 444/500 [=========================>....] - ETA: 13s - loss: 1.7644 - regression_loss: 1.4568 - classification_loss: 0.3077 445/500 [=========================>....] - ETA: 13s - loss: 1.7655 - regression_loss: 1.4576 - classification_loss: 0.3079 446/500 [=========================>....] - ETA: 13s - loss: 1.7644 - regression_loss: 1.4568 - classification_loss: 0.3076 447/500 [=========================>....] - ETA: 13s - loss: 1.7634 - regression_loss: 1.4560 - classification_loss: 0.3074 448/500 [=========================>....] - ETA: 12s - loss: 1.7628 - regression_loss: 1.4554 - classification_loss: 0.3074 449/500 [=========================>....] - ETA: 12s - loss: 1.7632 - regression_loss: 1.4558 - classification_loss: 0.3075 450/500 [==========================>...] - ETA: 12s - loss: 1.7626 - regression_loss: 1.4553 - classification_loss: 0.3073 451/500 [==========================>...] - ETA: 12s - loss: 1.7633 - regression_loss: 1.4560 - classification_loss: 0.3072 452/500 [==========================>...] - ETA: 11s - loss: 1.7627 - regression_loss: 1.4556 - classification_loss: 0.3070 453/500 [==========================>...] - ETA: 11s - loss: 1.7630 - regression_loss: 1.4559 - classification_loss: 0.3071 454/500 [==========================>...] - ETA: 11s - loss: 1.7637 - regression_loss: 1.4566 - classification_loss: 0.3071 455/500 [==========================>...] - ETA: 11s - loss: 1.7645 - regression_loss: 1.4573 - classification_loss: 0.3072 456/500 [==========================>...] - ETA: 10s - loss: 1.7639 - regression_loss: 1.4568 - classification_loss: 0.3070 457/500 [==========================>...] 
- ETA: 10s - loss: 1.7642 - regression_loss: 1.4571 - classification_loss: 0.3071
[... per-batch progress updates for steps 458-499/500 elided ...]
500/500 [==============================] - 125s 250ms/step - loss: 1.7625 - regression_loss: 1.4554 - classification_loss: 0.3071
1172 instances of class plum with average precision: 0.6141
mAP: 0.6141
Epoch 00065: saving model to ./training/snapshots/resnet50_pascal_65.h5
Epoch 66/150
[... per-batch progress updates for steps 1-291/500 elided ...]
292/500 [================>.............]
- ETA: 52s - loss: 1.7777 - regression_loss: 1.4738 - classification_loss: 0.3039 293/500 [================>.............] - ETA: 52s - loss: 1.7749 - regression_loss: 1.4715 - classification_loss: 0.3033 294/500 [================>.............] - ETA: 51s - loss: 1.7745 - regression_loss: 1.4711 - classification_loss: 0.3034 295/500 [================>.............] - ETA: 51s - loss: 1.7730 - regression_loss: 1.4700 - classification_loss: 0.3030 296/500 [================>.............] - ETA: 51s - loss: 1.7720 - regression_loss: 1.4695 - classification_loss: 0.3025 297/500 [================>.............] - ETA: 51s - loss: 1.7718 - regression_loss: 1.4690 - classification_loss: 0.3028 298/500 [================>.............] - ETA: 50s - loss: 1.7698 - regression_loss: 1.4673 - classification_loss: 0.3025 299/500 [================>.............] - ETA: 50s - loss: 1.7693 - regression_loss: 1.4669 - classification_loss: 0.3024 300/500 [=================>............] - ETA: 50s - loss: 1.7702 - regression_loss: 1.4676 - classification_loss: 0.3026 301/500 [=================>............] - ETA: 50s - loss: 1.7707 - regression_loss: 1.4681 - classification_loss: 0.3026 302/500 [=================>............] - ETA: 49s - loss: 1.7697 - regression_loss: 1.4671 - classification_loss: 0.3026 303/500 [=================>............] - ETA: 49s - loss: 1.7656 - regression_loss: 1.4638 - classification_loss: 0.3018 304/500 [=================>............] - ETA: 49s - loss: 1.7650 - regression_loss: 1.4634 - classification_loss: 0.3016 305/500 [=================>............] - ETA: 49s - loss: 1.7628 - regression_loss: 1.4612 - classification_loss: 0.3016 306/500 [=================>............] - ETA: 48s - loss: 1.7614 - regression_loss: 1.4601 - classification_loss: 0.3013 307/500 [=================>............] - ETA: 48s - loss: 1.7618 - regression_loss: 1.4599 - classification_loss: 0.3018 308/500 [=================>............] 
- ETA: 48s - loss: 1.7617 - regression_loss: 1.4599 - classification_loss: 0.3017 309/500 [=================>............] - ETA: 48s - loss: 1.7616 - regression_loss: 1.4600 - classification_loss: 0.3017 310/500 [=================>............] - ETA: 47s - loss: 1.7632 - regression_loss: 1.4613 - classification_loss: 0.3019 311/500 [=================>............] - ETA: 47s - loss: 1.7632 - regression_loss: 1.4614 - classification_loss: 0.3017 312/500 [=================>............] - ETA: 47s - loss: 1.7667 - regression_loss: 1.4649 - classification_loss: 0.3018 313/500 [=================>............] - ETA: 47s - loss: 1.7637 - regression_loss: 1.4625 - classification_loss: 0.3012 314/500 [=================>............] - ETA: 46s - loss: 1.7617 - regression_loss: 1.4610 - classification_loss: 0.3007 315/500 [=================>............] - ETA: 46s - loss: 1.7617 - regression_loss: 1.4611 - classification_loss: 0.3007 316/500 [=================>............] - ETA: 46s - loss: 1.7618 - regression_loss: 1.4608 - classification_loss: 0.3010 317/500 [==================>...........] - ETA: 46s - loss: 1.7600 - regression_loss: 1.4592 - classification_loss: 0.3008 318/500 [==================>...........] - ETA: 45s - loss: 1.7598 - regression_loss: 1.4590 - classification_loss: 0.3008 319/500 [==================>...........] - ETA: 45s - loss: 1.7574 - regression_loss: 1.4572 - classification_loss: 0.3002 320/500 [==================>...........] - ETA: 45s - loss: 1.7559 - regression_loss: 1.4559 - classification_loss: 0.3000 321/500 [==================>...........] - ETA: 45s - loss: 1.7547 - regression_loss: 1.4549 - classification_loss: 0.2998 322/500 [==================>...........] - ETA: 44s - loss: 1.7552 - regression_loss: 1.4556 - classification_loss: 0.2997 323/500 [==================>...........] - ETA: 44s - loss: 1.7552 - regression_loss: 1.4556 - classification_loss: 0.2996 324/500 [==================>...........] 
- ETA: 44s - loss: 1.7546 - regression_loss: 1.4550 - classification_loss: 0.2996 325/500 [==================>...........] - ETA: 44s - loss: 1.7557 - regression_loss: 1.4555 - classification_loss: 0.3002 326/500 [==================>...........] - ETA: 43s - loss: 1.7558 - regression_loss: 1.4555 - classification_loss: 0.3004 327/500 [==================>...........] - ETA: 43s - loss: 1.7538 - regression_loss: 1.4539 - classification_loss: 0.2999 328/500 [==================>...........] - ETA: 43s - loss: 1.7551 - regression_loss: 1.4550 - classification_loss: 0.3001 329/500 [==================>...........] - ETA: 43s - loss: 1.7554 - regression_loss: 1.4555 - classification_loss: 0.2999 330/500 [==================>...........] - ETA: 42s - loss: 1.7566 - regression_loss: 1.4559 - classification_loss: 0.3007 331/500 [==================>...........] - ETA: 42s - loss: 1.7563 - regression_loss: 1.4557 - classification_loss: 0.3006 332/500 [==================>...........] - ETA: 42s - loss: 1.7550 - regression_loss: 1.4546 - classification_loss: 0.3003 333/500 [==================>...........] - ETA: 42s - loss: 1.7554 - regression_loss: 1.4551 - classification_loss: 0.3003 334/500 [===================>..........] - ETA: 41s - loss: 1.7564 - regression_loss: 1.4558 - classification_loss: 0.3006 335/500 [===================>..........] - ETA: 41s - loss: 1.7590 - regression_loss: 1.4576 - classification_loss: 0.3014 336/500 [===================>..........] - ETA: 41s - loss: 1.7609 - regression_loss: 1.4592 - classification_loss: 0.3018 337/500 [===================>..........] - ETA: 41s - loss: 1.7612 - regression_loss: 1.4595 - classification_loss: 0.3017 338/500 [===================>..........] - ETA: 40s - loss: 1.7617 - regression_loss: 1.4598 - classification_loss: 0.3018 339/500 [===================>..........] - ETA: 40s - loss: 1.7633 - regression_loss: 1.4611 - classification_loss: 0.3021 340/500 [===================>..........] 
- ETA: 40s - loss: 1.7615 - regression_loss: 1.4596 - classification_loss: 0.3019 341/500 [===================>..........] - ETA: 40s - loss: 1.7609 - regression_loss: 1.4589 - classification_loss: 0.3020 342/500 [===================>..........] - ETA: 39s - loss: 1.7606 - regression_loss: 1.4586 - classification_loss: 0.3020 343/500 [===================>..........] - ETA: 39s - loss: 1.7615 - regression_loss: 1.4592 - classification_loss: 0.3023 344/500 [===================>..........] - ETA: 39s - loss: 1.7628 - regression_loss: 1.4602 - classification_loss: 0.3026 345/500 [===================>..........] - ETA: 39s - loss: 1.7625 - regression_loss: 1.4600 - classification_loss: 0.3024 346/500 [===================>..........] - ETA: 38s - loss: 1.7623 - regression_loss: 1.4599 - classification_loss: 0.3025 347/500 [===================>..........] - ETA: 38s - loss: 1.7633 - regression_loss: 1.4607 - classification_loss: 0.3026 348/500 [===================>..........] - ETA: 38s - loss: 1.7636 - regression_loss: 1.4608 - classification_loss: 0.3028 349/500 [===================>..........] - ETA: 37s - loss: 1.7637 - regression_loss: 1.4607 - classification_loss: 0.3030 350/500 [====================>.........] - ETA: 37s - loss: 1.7622 - regression_loss: 1.4594 - classification_loss: 0.3029 351/500 [====================>.........] - ETA: 37s - loss: 1.7612 - regression_loss: 1.4586 - classification_loss: 0.3026 352/500 [====================>.........] - ETA: 37s - loss: 1.7604 - regression_loss: 1.4580 - classification_loss: 0.3024 353/500 [====================>.........] - ETA: 36s - loss: 1.7582 - regression_loss: 1.4560 - classification_loss: 0.3022 354/500 [====================>.........] - ETA: 36s - loss: 1.7579 - regression_loss: 1.4561 - classification_loss: 0.3019 355/500 [====================>.........] - ETA: 36s - loss: 1.7583 - regression_loss: 1.4565 - classification_loss: 0.3018 356/500 [====================>.........] 
- ETA: 36s - loss: 1.7600 - regression_loss: 1.4577 - classification_loss: 0.3023 357/500 [====================>.........] - ETA: 35s - loss: 1.7615 - regression_loss: 1.4590 - classification_loss: 0.3025 358/500 [====================>.........] - ETA: 35s - loss: 1.7629 - regression_loss: 1.4603 - classification_loss: 0.3026 359/500 [====================>.........] - ETA: 35s - loss: 1.7631 - regression_loss: 1.4604 - classification_loss: 0.3026 360/500 [====================>.........] - ETA: 35s - loss: 1.7638 - regression_loss: 1.4608 - classification_loss: 0.3029 361/500 [====================>.........] - ETA: 34s - loss: 1.7639 - regression_loss: 1.4610 - classification_loss: 0.3029 362/500 [====================>.........] - ETA: 34s - loss: 1.7651 - regression_loss: 1.4619 - classification_loss: 0.3032 363/500 [====================>.........] - ETA: 34s - loss: 1.7660 - regression_loss: 1.4625 - classification_loss: 0.3035 364/500 [====================>.........] - ETA: 34s - loss: 1.7668 - regression_loss: 1.4632 - classification_loss: 0.3036 365/500 [====================>.........] - ETA: 33s - loss: 1.7668 - regression_loss: 1.4632 - classification_loss: 0.3035 366/500 [====================>.........] - ETA: 33s - loss: 1.7662 - regression_loss: 1.4626 - classification_loss: 0.3035 367/500 [=====================>........] - ETA: 33s - loss: 1.7648 - regression_loss: 1.4614 - classification_loss: 0.3034 368/500 [=====================>........] - ETA: 33s - loss: 1.7633 - regression_loss: 1.4601 - classification_loss: 0.3032 369/500 [=====================>........] - ETA: 32s - loss: 1.7636 - regression_loss: 1.4601 - classification_loss: 0.3035 370/500 [=====================>........] - ETA: 32s - loss: 1.7637 - regression_loss: 1.4602 - classification_loss: 0.3035 371/500 [=====================>........] - ETA: 32s - loss: 1.7641 - regression_loss: 1.4604 - classification_loss: 0.3037 372/500 [=====================>........] 
- ETA: 32s - loss: 1.7660 - regression_loss: 1.4617 - classification_loss: 0.3043 373/500 [=====================>........] - ETA: 31s - loss: 1.7664 - regression_loss: 1.4621 - classification_loss: 0.3042 374/500 [=====================>........] - ETA: 31s - loss: 1.7670 - regression_loss: 1.4625 - classification_loss: 0.3045 375/500 [=====================>........] - ETA: 31s - loss: 1.7666 - regression_loss: 1.4615 - classification_loss: 0.3052 376/500 [=====================>........] - ETA: 31s - loss: 1.7656 - regression_loss: 1.4608 - classification_loss: 0.3048 377/500 [=====================>........] - ETA: 30s - loss: 1.7645 - regression_loss: 1.4600 - classification_loss: 0.3046 378/500 [=====================>........] - ETA: 30s - loss: 1.7616 - regression_loss: 1.4577 - classification_loss: 0.3040 379/500 [=====================>........] - ETA: 30s - loss: 1.7603 - regression_loss: 1.4567 - classification_loss: 0.3036 380/500 [=====================>........] - ETA: 30s - loss: 1.7601 - regression_loss: 1.4567 - classification_loss: 0.3034 381/500 [=====================>........] - ETA: 29s - loss: 1.7599 - regression_loss: 1.4563 - classification_loss: 0.3036 382/500 [=====================>........] - ETA: 29s - loss: 1.7605 - regression_loss: 1.4567 - classification_loss: 0.3037 383/500 [=====================>........] - ETA: 29s - loss: 1.7610 - regression_loss: 1.4575 - classification_loss: 0.3036 384/500 [======================>.......] - ETA: 29s - loss: 1.7618 - regression_loss: 1.4581 - classification_loss: 0.3037 385/500 [======================>.......] - ETA: 28s - loss: 1.7603 - regression_loss: 1.4570 - classification_loss: 0.3033 386/500 [======================>.......] - ETA: 28s - loss: 1.7616 - regression_loss: 1.4580 - classification_loss: 0.3036 387/500 [======================>.......] - ETA: 28s - loss: 1.7616 - regression_loss: 1.4580 - classification_loss: 0.3036 388/500 [======================>.......] 
- ETA: 28s - loss: 1.7632 - regression_loss: 1.4593 - classification_loss: 0.3038 389/500 [======================>.......] - ETA: 27s - loss: 1.7623 - regression_loss: 1.4587 - classification_loss: 0.3035 390/500 [======================>.......] - ETA: 27s - loss: 1.7615 - regression_loss: 1.4582 - classification_loss: 0.3033 391/500 [======================>.......] - ETA: 27s - loss: 1.7624 - regression_loss: 1.4590 - classification_loss: 0.3034 392/500 [======================>.......] - ETA: 27s - loss: 1.7619 - regression_loss: 1.4586 - classification_loss: 0.3033 393/500 [======================>.......] - ETA: 26s - loss: 1.7636 - regression_loss: 1.4596 - classification_loss: 0.3040 394/500 [======================>.......] - ETA: 26s - loss: 1.7643 - regression_loss: 1.4603 - classification_loss: 0.3040 395/500 [======================>.......] - ETA: 26s - loss: 1.7653 - regression_loss: 1.4612 - classification_loss: 0.3041 396/500 [======================>.......] - ETA: 26s - loss: 1.7653 - regression_loss: 1.4613 - classification_loss: 0.3039 397/500 [======================>.......] - ETA: 25s - loss: 1.7647 - regression_loss: 1.4608 - classification_loss: 0.3039 398/500 [======================>.......] - ETA: 25s - loss: 1.7636 - regression_loss: 1.4598 - classification_loss: 0.3038 399/500 [======================>.......] - ETA: 25s - loss: 1.7631 - regression_loss: 1.4595 - classification_loss: 0.3036 400/500 [=======================>......] - ETA: 25s - loss: 1.7635 - regression_loss: 1.4596 - classification_loss: 0.3039 401/500 [=======================>......] - ETA: 24s - loss: 1.7627 - regression_loss: 1.4587 - classification_loss: 0.3040 402/500 [=======================>......] - ETA: 24s - loss: 1.7608 - regression_loss: 1.4573 - classification_loss: 0.3035 403/500 [=======================>......] - ETA: 24s - loss: 1.7610 - regression_loss: 1.4574 - classification_loss: 0.3036 404/500 [=======================>......] 
- ETA: 24s - loss: 1.7597 - regression_loss: 1.4564 - classification_loss: 0.3034 405/500 [=======================>......] - ETA: 23s - loss: 1.7592 - regression_loss: 1.4560 - classification_loss: 0.3032 406/500 [=======================>......] - ETA: 23s - loss: 1.7592 - regression_loss: 1.4561 - classification_loss: 0.3031 407/500 [=======================>......] - ETA: 23s - loss: 1.7592 - regression_loss: 1.4562 - classification_loss: 0.3030 408/500 [=======================>......] - ETA: 23s - loss: 1.7606 - regression_loss: 1.4573 - classification_loss: 0.3033 409/500 [=======================>......] - ETA: 22s - loss: 1.7596 - regression_loss: 1.4564 - classification_loss: 0.3033 410/500 [=======================>......] - ETA: 22s - loss: 1.7585 - regression_loss: 1.4554 - classification_loss: 0.3031 411/500 [=======================>......] - ETA: 22s - loss: 1.7575 - regression_loss: 1.4541 - classification_loss: 0.3034 412/500 [=======================>......] - ETA: 22s - loss: 1.7584 - regression_loss: 1.4549 - classification_loss: 0.3035 413/500 [=======================>......] - ETA: 21s - loss: 1.7597 - regression_loss: 1.4560 - classification_loss: 0.3037 414/500 [=======================>......] - ETA: 21s - loss: 1.7571 - regression_loss: 1.4537 - classification_loss: 0.3034 415/500 [=======================>......] - ETA: 21s - loss: 1.7571 - regression_loss: 1.4537 - classification_loss: 0.3034 416/500 [=======================>......] - ETA: 21s - loss: 1.7573 - regression_loss: 1.4541 - classification_loss: 0.3031 417/500 [========================>.....] - ETA: 20s - loss: 1.7582 - regression_loss: 1.4549 - classification_loss: 0.3033 418/500 [========================>.....] - ETA: 20s - loss: 1.7598 - regression_loss: 1.4562 - classification_loss: 0.3036 419/500 [========================>.....] - ETA: 20s - loss: 1.7577 - regression_loss: 1.4545 - classification_loss: 0.3032 420/500 [========================>.....] 
- ETA: 20s - loss: 1.7571 - regression_loss: 1.4540 - classification_loss: 0.3031 421/500 [========================>.....] - ETA: 19s - loss: 1.7574 - regression_loss: 1.4544 - classification_loss: 0.3030 422/500 [========================>.....] - ETA: 19s - loss: 1.7562 - regression_loss: 1.4533 - classification_loss: 0.3029 423/500 [========================>.....] - ETA: 19s - loss: 1.7567 - regression_loss: 1.4537 - classification_loss: 0.3030 424/500 [========================>.....] - ETA: 19s - loss: 1.7583 - regression_loss: 1.4552 - classification_loss: 0.3032 425/500 [========================>.....] - ETA: 18s - loss: 1.7569 - regression_loss: 1.4541 - classification_loss: 0.3028 426/500 [========================>.....] - ETA: 18s - loss: 1.7574 - regression_loss: 1.4546 - classification_loss: 0.3029 427/500 [========================>.....] - ETA: 18s - loss: 1.7571 - regression_loss: 1.4543 - classification_loss: 0.3028 428/500 [========================>.....] - ETA: 18s - loss: 1.7568 - regression_loss: 1.4542 - classification_loss: 0.3026 429/500 [========================>.....] - ETA: 17s - loss: 1.7581 - regression_loss: 1.4554 - classification_loss: 0.3027 430/500 [========================>.....] - ETA: 17s - loss: 1.7563 - regression_loss: 1.4539 - classification_loss: 0.3024 431/500 [========================>.....] - ETA: 17s - loss: 1.7576 - regression_loss: 1.4548 - classification_loss: 0.3028 432/500 [========================>.....] - ETA: 17s - loss: 1.7574 - regression_loss: 1.4548 - classification_loss: 0.3027 433/500 [========================>.....] - ETA: 16s - loss: 1.7557 - regression_loss: 1.4533 - classification_loss: 0.3024 434/500 [=========================>....] - ETA: 16s - loss: 1.7556 - regression_loss: 1.4534 - classification_loss: 0.3023 435/500 [=========================>....] - ETA: 16s - loss: 1.7556 - regression_loss: 1.4533 - classification_loss: 0.3024 436/500 [=========================>....] 
- ETA: 16s - loss: 1.7566 - regression_loss: 1.4541 - classification_loss: 0.3024 437/500 [=========================>....] - ETA: 15s - loss: 1.7579 - regression_loss: 1.4551 - classification_loss: 0.3029 438/500 [=========================>....] - ETA: 15s - loss: 1.7555 - regression_loss: 1.4532 - classification_loss: 0.3023 439/500 [=========================>....] - ETA: 15s - loss: 1.7546 - regression_loss: 1.4524 - classification_loss: 0.3021 440/500 [=========================>....] - ETA: 15s - loss: 1.7549 - regression_loss: 1.4527 - classification_loss: 0.3022 441/500 [=========================>....] - ETA: 14s - loss: 1.7557 - regression_loss: 1.4534 - classification_loss: 0.3023 442/500 [=========================>....] - ETA: 14s - loss: 1.7561 - regression_loss: 1.4537 - classification_loss: 0.3024 443/500 [=========================>....] - ETA: 14s - loss: 1.7565 - regression_loss: 1.4542 - classification_loss: 0.3023 444/500 [=========================>....] - ETA: 14s - loss: 1.7566 - regression_loss: 1.4543 - classification_loss: 0.3023 445/500 [=========================>....] - ETA: 13s - loss: 1.7594 - regression_loss: 1.4565 - classification_loss: 0.3028 446/500 [=========================>....] - ETA: 13s - loss: 1.7585 - regression_loss: 1.4556 - classification_loss: 0.3028 447/500 [=========================>....] - ETA: 13s - loss: 1.7568 - regression_loss: 1.4544 - classification_loss: 0.3024 448/500 [=========================>....] - ETA: 13s - loss: 1.7560 - regression_loss: 1.4537 - classification_loss: 0.3023 449/500 [=========================>....] - ETA: 12s - loss: 1.7555 - regression_loss: 1.4533 - classification_loss: 0.3022 450/500 [==========================>...] - ETA: 12s - loss: 1.7549 - regression_loss: 1.4529 - classification_loss: 0.3020 451/500 [==========================>...] - ETA: 12s - loss: 1.7553 - regression_loss: 1.4533 - classification_loss: 0.3020 452/500 [==========================>...] 
- ETA: 12s - loss: 1.7545 - regression_loss: 1.4527 - classification_loss: 0.3018 453/500 [==========================>...] - ETA: 11s - loss: 1.7543 - regression_loss: 1.4526 - classification_loss: 0.3017 454/500 [==========================>...] - ETA: 11s - loss: 1.7541 - regression_loss: 1.4524 - classification_loss: 0.3017 455/500 [==========================>...] - ETA: 11s - loss: 1.7549 - regression_loss: 1.4531 - classification_loss: 0.3019 456/500 [==========================>...] - ETA: 11s - loss: 1.7545 - regression_loss: 1.4528 - classification_loss: 0.3017 457/500 [==========================>...] - ETA: 10s - loss: 1.7554 - regression_loss: 1.4536 - classification_loss: 0.3018 458/500 [==========================>...] - ETA: 10s - loss: 1.7551 - regression_loss: 1.4534 - classification_loss: 0.3017 459/500 [==========================>...] - ETA: 10s - loss: 1.7555 - regression_loss: 1.4538 - classification_loss: 0.3017 460/500 [==========================>...] - ETA: 10s - loss: 1.7548 - regression_loss: 1.4533 - classification_loss: 0.3015 461/500 [==========================>...] - ETA: 9s - loss: 1.7530 - regression_loss: 1.4517 - classification_loss: 0.3013  462/500 [==========================>...] - ETA: 9s - loss: 1.7538 - regression_loss: 1.4523 - classification_loss: 0.3015 463/500 [==========================>...] - ETA: 9s - loss: 1.7535 - regression_loss: 1.4521 - classification_loss: 0.3014 464/500 [==========================>...] - ETA: 9s - loss: 1.7531 - regression_loss: 1.4519 - classification_loss: 0.3012 465/500 [==========================>...] - ETA: 8s - loss: 1.7529 - regression_loss: 1.4518 - classification_loss: 0.3011 466/500 [==========================>...] - ETA: 8s - loss: 1.7511 - regression_loss: 1.4503 - classification_loss: 0.3008 467/500 [===========================>..] - ETA: 8s - loss: 1.7527 - regression_loss: 1.4519 - classification_loss: 0.3008 468/500 [===========================>..] 
- ETA: 8s - loss: 1.7517 - regression_loss: 1.4511 - classification_loss: 0.3006 469/500 [===========================>..] - ETA: 7s - loss: 1.7501 - regression_loss: 1.4496 - classification_loss: 0.3005 470/500 [===========================>..] - ETA: 7s - loss: 1.7510 - regression_loss: 1.4503 - classification_loss: 0.3007 471/500 [===========================>..] - ETA: 7s - loss: 1.7505 - regression_loss: 1.4500 - classification_loss: 0.3005 472/500 [===========================>..] - ETA: 7s - loss: 1.7510 - regression_loss: 1.4505 - classification_loss: 0.3005 473/500 [===========================>..] - ETA: 6s - loss: 1.7511 - regression_loss: 1.4507 - classification_loss: 0.3004 474/500 [===========================>..] - ETA: 6s - loss: 1.7502 - regression_loss: 1.4500 - classification_loss: 0.3002 475/500 [===========================>..] - ETA: 6s - loss: 1.7511 - regression_loss: 1.4507 - classification_loss: 0.3004 476/500 [===========================>..] - ETA: 6s - loss: 1.7514 - regression_loss: 1.4511 - classification_loss: 0.3003 477/500 [===========================>..] - ETA: 5s - loss: 1.7518 - regression_loss: 1.4514 - classification_loss: 0.3005 478/500 [===========================>..] - ETA: 5s - loss: 1.7514 - regression_loss: 1.4511 - classification_loss: 0.3003 479/500 [===========================>..] - ETA: 5s - loss: 1.7515 - regression_loss: 1.4514 - classification_loss: 0.3001 480/500 [===========================>..] - ETA: 5s - loss: 1.7525 - regression_loss: 1.4522 - classification_loss: 0.3003 481/500 [===========================>..] - ETA: 4s - loss: 1.7527 - regression_loss: 1.4525 - classification_loss: 0.3002 482/500 [===========================>..] - ETA: 4s - loss: 1.7518 - regression_loss: 1.4518 - classification_loss: 0.3000 483/500 [===========================>..] - ETA: 4s - loss: 1.7523 - regression_loss: 1.4520 - classification_loss: 0.3002 484/500 [============================>.] 
- ETA: 4s - loss: 1.7510 - regression_loss: 1.4509 - classification_loss: 0.3001 485/500 [============================>.] - ETA: 3s - loss: 1.7505 - regression_loss: 1.4506 - classification_loss: 0.2999 486/500 [============================>.] - ETA: 3s - loss: 1.7505 - regression_loss: 1.4506 - classification_loss: 0.2999 487/500 [============================>.] - ETA: 3s - loss: 1.7494 - regression_loss: 1.4498 - classification_loss: 0.2997 488/500 [============================>.] - ETA: 3s - loss: 1.7504 - regression_loss: 1.4505 - classification_loss: 0.2999 489/500 [============================>.] - ETA: 2s - loss: 1.7502 - regression_loss: 1.4505 - classification_loss: 0.2997 490/500 [============================>.] - ETA: 2s - loss: 1.7494 - regression_loss: 1.4498 - classification_loss: 0.2995 491/500 [============================>.] - ETA: 2s - loss: 1.7494 - regression_loss: 1.4499 - classification_loss: 0.2995 492/500 [============================>.] - ETA: 2s - loss: 1.7484 - regression_loss: 1.4492 - classification_loss: 0.2992 493/500 [============================>.] - ETA: 1s - loss: 1.7480 - regression_loss: 1.4488 - classification_loss: 0.2992 494/500 [============================>.] - ETA: 1s - loss: 1.7482 - regression_loss: 1.4491 - classification_loss: 0.2991 495/500 [============================>.] - ETA: 1s - loss: 1.7482 - regression_loss: 1.4490 - classification_loss: 0.2991 496/500 [============================>.] - ETA: 1s - loss: 1.7481 - regression_loss: 1.4491 - classification_loss: 0.2990 497/500 [============================>.] - ETA: 0s - loss: 1.7488 - regression_loss: 1.4497 - classification_loss: 0.2991 498/500 [============================>.] - ETA: 0s - loss: 1.7465 - regression_loss: 1.4477 - classification_loss: 0.2988 499/500 [============================>.] 
- ETA: 0s - loss: 1.7457 - regression_loss: 1.4471 - classification_loss: 0.2986 500/500 [==============================] - 125s 251ms/step - loss: 1.7451 - regression_loss: 1.4465 - classification_loss: 0.2986 1172 instances of class plum with average precision: 0.6279 mAP: 0.6279 Epoch 00066: saving model to ./training/snapshots/resnet50_pascal_66.h5 Epoch 67/150 1/500 [..............................] - ETA: 2:00 - loss: 0.7424 - regression_loss: 0.5880 - classification_loss: 0.1544 2/500 [..............................] - ETA: 2:02 - loss: 0.8976 - regression_loss: 0.7382 - classification_loss: 0.1594 3/500 [..............................] - ETA: 2:02 - loss: 1.1098 - regression_loss: 0.9136 - classification_loss: 0.1963 4/500 [..............................] - ETA: 2:02 - loss: 1.1143 - regression_loss: 0.9279 - classification_loss: 0.1864 5/500 [..............................] - ETA: 2:03 - loss: 1.1657 - regression_loss: 0.9747 - classification_loss: 0.1910 6/500 [..............................] - ETA: 2:03 - loss: 1.4822 - regression_loss: 1.2353 - classification_loss: 0.2469 7/500 [..............................] - ETA: 2:03 - loss: 1.5927 - regression_loss: 1.3182 - classification_loss: 0.2745 8/500 [..............................] - ETA: 2:02 - loss: 1.5907 - regression_loss: 1.3154 - classification_loss: 0.2754 9/500 [..............................] - ETA: 2:02 - loss: 1.6243 - regression_loss: 1.3442 - classification_loss: 0.2801 10/500 [..............................] - ETA: 2:02 - loss: 1.6127 - regression_loss: 1.3190 - classification_loss: 0.2936 11/500 [..............................] - ETA: 2:02 - loss: 1.5984 - regression_loss: 1.3060 - classification_loss: 0.2924 12/500 [..............................] - ETA: 2:02 - loss: 1.6147 - regression_loss: 1.3190 - classification_loss: 0.2958 13/500 [..............................] - ETA: 2:02 - loss: 1.6349 - regression_loss: 1.3362 - classification_loss: 0.2987 14/500 [..............................] 
[per-step progress-bar output for steps 14-61 of epoch 67 elided]
- ETA: 1:50 - loss: 1.7262 - regression_loss: 1.4208 - classification_loss: 0.3054 63/500 [==>...........................] - ETA: 1:50 - loss: 1.7287 - regression_loss: 1.4238 - classification_loss: 0.3049 64/500 [==>...........................] - ETA: 1:49 - loss: 1.7110 - regression_loss: 1.4099 - classification_loss: 0.3010 65/500 [==>...........................] - ETA: 1:49 - loss: 1.7027 - regression_loss: 1.4027 - classification_loss: 0.3000 66/500 [==>...........................] - ETA: 1:49 - loss: 1.7069 - regression_loss: 1.4066 - classification_loss: 0.3002 67/500 [===>..........................] - ETA: 1:49 - loss: 1.7151 - regression_loss: 1.4132 - classification_loss: 0.3018 68/500 [===>..........................] - ETA: 1:48 - loss: 1.7166 - regression_loss: 1.4151 - classification_loss: 0.3015 69/500 [===>..........................] - ETA: 1:48 - loss: 1.7185 - regression_loss: 1.4166 - classification_loss: 0.3019 70/500 [===>..........................] - ETA: 1:48 - loss: 1.7211 - regression_loss: 1.4185 - classification_loss: 0.3026 71/500 [===>..........................] - ETA: 1:48 - loss: 1.7260 - regression_loss: 1.4219 - classification_loss: 0.3041 72/500 [===>..........................] - ETA: 1:47 - loss: 1.7299 - regression_loss: 1.4257 - classification_loss: 0.3043 73/500 [===>..........................] - ETA: 1:47 - loss: 1.7347 - regression_loss: 1.4298 - classification_loss: 0.3050 74/500 [===>..........................] - ETA: 1:47 - loss: 1.7351 - regression_loss: 1.4307 - classification_loss: 0.3044 75/500 [===>..........................] - ETA: 1:47 - loss: 1.7271 - regression_loss: 1.4245 - classification_loss: 0.3027 76/500 [===>..........................] - ETA: 1:46 - loss: 1.7299 - regression_loss: 1.4271 - classification_loss: 0.3028 77/500 [===>..........................] - ETA: 1:46 - loss: 1.7286 - regression_loss: 1.4262 - classification_loss: 0.3023 78/500 [===>..........................] 
- ETA: 1:46 - loss: 1.7312 - regression_loss: 1.4291 - classification_loss: 0.3021 79/500 [===>..........................] - ETA: 1:46 - loss: 1.7272 - regression_loss: 1.4262 - classification_loss: 0.3010 80/500 [===>..........................] - ETA: 1:45 - loss: 1.7200 - regression_loss: 1.4201 - classification_loss: 0.2999 81/500 [===>..........................] - ETA: 1:45 - loss: 1.7277 - regression_loss: 1.4279 - classification_loss: 0.2998 82/500 [===>..........................] - ETA: 1:45 - loss: 1.7324 - regression_loss: 1.4319 - classification_loss: 0.3004 83/500 [===>..........................] - ETA: 1:45 - loss: 1.7294 - regression_loss: 1.4290 - classification_loss: 0.3003 84/500 [====>.........................] - ETA: 1:44 - loss: 1.7324 - regression_loss: 1.4316 - classification_loss: 0.3009 85/500 [====>.........................] - ETA: 1:44 - loss: 1.7350 - regression_loss: 1.4338 - classification_loss: 0.3011 86/500 [====>.........................] - ETA: 1:44 - loss: 1.7303 - regression_loss: 1.4289 - classification_loss: 0.3014 87/500 [====>.........................] - ETA: 1:44 - loss: 1.7341 - regression_loss: 1.4327 - classification_loss: 0.3015 88/500 [====>.........................] - ETA: 1:43 - loss: 1.7384 - regression_loss: 1.4358 - classification_loss: 0.3026 89/500 [====>.........................] - ETA: 1:43 - loss: 1.7418 - regression_loss: 1.4385 - classification_loss: 0.3033 90/500 [====>.........................] - ETA: 1:42 - loss: 1.7430 - regression_loss: 1.4401 - classification_loss: 0.3029 91/500 [====>.........................] - ETA: 1:42 - loss: 1.7419 - regression_loss: 1.4402 - classification_loss: 0.3017 92/500 [====>.........................] - ETA: 1:42 - loss: 1.7456 - regression_loss: 1.4433 - classification_loss: 0.3023 93/500 [====>.........................] - ETA: 1:41 - loss: 1.7465 - regression_loss: 1.4440 - classification_loss: 0.3025 94/500 [====>.........................] 
- ETA: 1:41 - loss: 1.7423 - regression_loss: 1.4410 - classification_loss: 0.3013 95/500 [====>.........................] - ETA: 1:41 - loss: 1.7446 - regression_loss: 1.4427 - classification_loss: 0.3019 96/500 [====>.........................] - ETA: 1:41 - loss: 1.7403 - regression_loss: 1.4395 - classification_loss: 0.3007 97/500 [====>.........................] - ETA: 1:40 - loss: 1.7422 - regression_loss: 1.4410 - classification_loss: 0.3012 98/500 [====>.........................] - ETA: 1:40 - loss: 1.7398 - regression_loss: 1.4387 - classification_loss: 0.3010 99/500 [====>.........................] - ETA: 1:40 - loss: 1.7427 - regression_loss: 1.4408 - classification_loss: 0.3019 100/500 [=====>........................] - ETA: 1:40 - loss: 1.7398 - regression_loss: 1.4388 - classification_loss: 0.3010 101/500 [=====>........................] - ETA: 1:40 - loss: 1.7444 - regression_loss: 1.4433 - classification_loss: 0.3011 102/500 [=====>........................] - ETA: 1:39 - loss: 1.7475 - regression_loss: 1.4454 - classification_loss: 0.3021 103/500 [=====>........................] - ETA: 1:39 - loss: 1.7488 - regression_loss: 1.4471 - classification_loss: 0.3017 104/500 [=====>........................] - ETA: 1:39 - loss: 1.7490 - regression_loss: 1.4478 - classification_loss: 0.3011 105/500 [=====>........................] - ETA: 1:38 - loss: 1.7580 - regression_loss: 1.4549 - classification_loss: 0.3031 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7611 - regression_loss: 1.4578 - classification_loss: 0.3033 107/500 [=====>........................] - ETA: 1:38 - loss: 1.7650 - regression_loss: 1.4609 - classification_loss: 0.3041 108/500 [=====>........................] - ETA: 1:38 - loss: 1.7621 - regression_loss: 1.4580 - classification_loss: 0.3041 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7621 - regression_loss: 1.4578 - classification_loss: 0.3043 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.7649 - regression_loss: 1.4604 - classification_loss: 0.3045 111/500 [=====>........................] - ETA: 1:37 - loss: 1.7697 - regression_loss: 1.4649 - classification_loss: 0.3048 112/500 [=====>........................] - ETA: 1:37 - loss: 1.7702 - regression_loss: 1.4652 - classification_loss: 0.3050 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7649 - regression_loss: 1.4610 - classification_loss: 0.3040 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7636 - regression_loss: 1.4600 - classification_loss: 0.3036 115/500 [=====>........................] - ETA: 1:36 - loss: 1.7625 - regression_loss: 1.4596 - classification_loss: 0.3029 116/500 [=====>........................] - ETA: 1:35 - loss: 1.7634 - regression_loss: 1.4606 - classification_loss: 0.3027 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7600 - regression_loss: 1.4565 - classification_loss: 0.3034 118/500 [======>.......................] - ETA: 1:35 - loss: 1.7545 - regression_loss: 1.4519 - classification_loss: 0.3026 119/500 [======>.......................] - ETA: 1:35 - loss: 1.7499 - regression_loss: 1.4486 - classification_loss: 0.3013 120/500 [======>.......................] - ETA: 1:34 - loss: 1.7559 - regression_loss: 1.4521 - classification_loss: 0.3038 121/500 [======>.......................] - ETA: 1:34 - loss: 1.7574 - regression_loss: 1.4535 - classification_loss: 0.3039 122/500 [======>.......................] - ETA: 1:34 - loss: 1.7541 - regression_loss: 1.4512 - classification_loss: 0.3029 123/500 [======>.......................] - ETA: 1:34 - loss: 1.7455 - regression_loss: 1.4430 - classification_loss: 0.3025 124/500 [======>.......................] - ETA: 1:33 - loss: 1.7442 - regression_loss: 1.4420 - classification_loss: 0.3021 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7453 - regression_loss: 1.4430 - classification_loss: 0.3023 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.7438 - regression_loss: 1.4417 - classification_loss: 0.3021 127/500 [======>.......................] - ETA: 1:33 - loss: 1.7473 - regression_loss: 1.4444 - classification_loss: 0.3029 128/500 [======>.......................] - ETA: 1:32 - loss: 1.7460 - regression_loss: 1.4432 - classification_loss: 0.3028 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7548 - regression_loss: 1.4495 - classification_loss: 0.3053 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7627 - regression_loss: 1.4576 - classification_loss: 0.3051 131/500 [======>.......................] - ETA: 1:32 - loss: 1.7720 - regression_loss: 1.4630 - classification_loss: 0.3091 132/500 [======>.......................] - ETA: 1:31 - loss: 1.7701 - regression_loss: 1.4619 - classification_loss: 0.3082 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7709 - regression_loss: 1.4627 - classification_loss: 0.3081 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7723 - regression_loss: 1.4642 - classification_loss: 0.3082 135/500 [=======>......................] - ETA: 1:31 - loss: 1.7698 - regression_loss: 1.4623 - classification_loss: 0.3075 136/500 [=======>......................] - ETA: 1:30 - loss: 1.7725 - regression_loss: 1.4647 - classification_loss: 0.3077 137/500 [=======>......................] - ETA: 1:30 - loss: 1.7647 - regression_loss: 1.4582 - classification_loss: 0.3065 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7637 - regression_loss: 1.4579 - classification_loss: 0.3058 139/500 [=======>......................] - ETA: 1:30 - loss: 1.7655 - regression_loss: 1.4600 - classification_loss: 0.3055 140/500 [=======>......................] - ETA: 1:29 - loss: 1.7582 - regression_loss: 1.4543 - classification_loss: 0.3039 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7589 - regression_loss: 1.4546 - classification_loss: 0.3042 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.7523 - regression_loss: 1.4492 - classification_loss: 0.3031 143/500 [=======>......................] - ETA: 1:28 - loss: 1.7497 - regression_loss: 1.4461 - classification_loss: 0.3037 144/500 [=======>......................] - ETA: 1:28 - loss: 1.7509 - regression_loss: 1.4473 - classification_loss: 0.3036 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7548 - regression_loss: 1.4509 - classification_loss: 0.3038 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7574 - regression_loss: 1.4529 - classification_loss: 0.3044 147/500 [=======>......................] - ETA: 1:27 - loss: 1.7588 - regression_loss: 1.4546 - classification_loss: 0.3042 148/500 [=======>......................] - ETA: 1:27 - loss: 1.7586 - regression_loss: 1.4543 - classification_loss: 0.3043 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7636 - regression_loss: 1.4587 - classification_loss: 0.3049 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7604 - regression_loss: 1.4559 - classification_loss: 0.3045 151/500 [========>.....................] - ETA: 1:26 - loss: 1.7549 - regression_loss: 1.4514 - classification_loss: 0.3035 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7545 - regression_loss: 1.4511 - classification_loss: 0.3034 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7531 - regression_loss: 1.4500 - classification_loss: 0.3031 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7588 - regression_loss: 1.4533 - classification_loss: 0.3055 155/500 [========>.....................] - ETA: 1:25 - loss: 1.7560 - regression_loss: 1.4512 - classification_loss: 0.3048 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7527 - regression_loss: 1.4489 - classification_loss: 0.3038 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7501 - regression_loss: 1.4472 - classification_loss: 0.3029 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.7521 - regression_loss: 1.4489 - classification_loss: 0.3032 159/500 [========>.....................] - ETA: 1:24 - loss: 1.7560 - regression_loss: 1.4522 - classification_loss: 0.3038 160/500 [========>.....................] - ETA: 1:24 - loss: 1.7511 - regression_loss: 1.4483 - classification_loss: 0.3028 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7518 - regression_loss: 1.4489 - classification_loss: 0.3028 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7523 - regression_loss: 1.4497 - classification_loss: 0.3026 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7532 - regression_loss: 1.4504 - classification_loss: 0.3028 164/500 [========>.....................] - ETA: 1:23 - loss: 1.7518 - regression_loss: 1.4490 - classification_loss: 0.3028 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7493 - regression_loss: 1.4467 - classification_loss: 0.3026 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7535 - regression_loss: 1.4499 - classification_loss: 0.3036 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7530 - regression_loss: 1.4498 - classification_loss: 0.3032 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7499 - regression_loss: 1.4475 - classification_loss: 0.3024 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7497 - regression_loss: 1.4464 - classification_loss: 0.3033 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7484 - regression_loss: 1.4457 - classification_loss: 0.3027 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7451 - regression_loss: 1.4427 - classification_loss: 0.3024 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7465 - regression_loss: 1.4438 - classification_loss: 0.3026 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7487 - regression_loss: 1.4452 - classification_loss: 0.3035 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.7479 - regression_loss: 1.4444 - classification_loss: 0.3035 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7466 - regression_loss: 1.4434 - classification_loss: 0.3032 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7499 - regression_loss: 1.4457 - classification_loss: 0.3042 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7491 - regression_loss: 1.4452 - classification_loss: 0.3038 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7501 - regression_loss: 1.4462 - classification_loss: 0.3038 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7517 - regression_loss: 1.4476 - classification_loss: 0.3041 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7520 - regression_loss: 1.4475 - classification_loss: 0.3045 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7502 - regression_loss: 1.4463 - classification_loss: 0.3039 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7475 - regression_loss: 1.4440 - classification_loss: 0.3035 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7496 - regression_loss: 1.4442 - classification_loss: 0.3054 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7438 - regression_loss: 1.4395 - classification_loss: 0.3043 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7441 - regression_loss: 1.4398 - classification_loss: 0.3043 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7439 - regression_loss: 1.4401 - classification_loss: 0.3039 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7431 - regression_loss: 1.4398 - classification_loss: 0.3033 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7474 - regression_loss: 1.4434 - classification_loss: 0.3040 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7499 - regression_loss: 1.4453 - classification_loss: 0.3045 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.7507 - regression_loss: 1.4461 - classification_loss: 0.3046 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7511 - regression_loss: 1.4467 - classification_loss: 0.3043 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7477 - regression_loss: 1.4441 - classification_loss: 0.3036 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7462 - regression_loss: 1.4429 - classification_loss: 0.3033 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7451 - regression_loss: 1.4423 - classification_loss: 0.3028 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7455 - regression_loss: 1.4428 - classification_loss: 0.3027 196/500 [==========>...................] - ETA: 1:15 - loss: 1.7470 - regression_loss: 1.4434 - classification_loss: 0.3036 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7471 - regression_loss: 1.4431 - classification_loss: 0.3040 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7471 - regression_loss: 1.4431 - classification_loss: 0.3040 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7478 - regression_loss: 1.4440 - classification_loss: 0.3038 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7473 - regression_loss: 1.4436 - classification_loss: 0.3038 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7486 - regression_loss: 1.4445 - classification_loss: 0.3041 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7507 - regression_loss: 1.4465 - classification_loss: 0.3042 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7504 - regression_loss: 1.4464 - classification_loss: 0.3040 204/500 [===========>..................] - ETA: 1:13 - loss: 1.7500 - regression_loss: 1.4462 - classification_loss: 0.3038 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7479 - regression_loss: 1.4447 - classification_loss: 0.3031 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.7495 - regression_loss: 1.4458 - classification_loss: 0.3038 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7497 - regression_loss: 1.4461 - classification_loss: 0.3036 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7473 - regression_loss: 1.4441 - classification_loss: 0.3032 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7492 - regression_loss: 1.4457 - classification_loss: 0.3034 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7467 - regression_loss: 1.4435 - classification_loss: 0.3032 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7472 - regression_loss: 1.4442 - classification_loss: 0.3030 212/500 [===========>..................] - ETA: 1:11 - loss: 1.7481 - regression_loss: 1.4453 - classification_loss: 0.3028 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7500 - regression_loss: 1.4469 - classification_loss: 0.3031 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7493 - regression_loss: 1.4464 - classification_loss: 0.3029 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7500 - regression_loss: 1.4470 - classification_loss: 0.3030 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7469 - regression_loss: 1.4445 - classification_loss: 0.3024 217/500 [============>.................] - ETA: 1:10 - loss: 1.7475 - regression_loss: 1.4451 - classification_loss: 0.3025 218/500 [============>.................] - ETA: 1:10 - loss: 1.7456 - regression_loss: 1.4437 - classification_loss: 0.3019 219/500 [============>.................] - ETA: 1:10 - loss: 1.7474 - regression_loss: 1.4453 - classification_loss: 0.3021 220/500 [============>.................] - ETA: 1:09 - loss: 1.7443 - regression_loss: 1.4431 - classification_loss: 0.3012 221/500 [============>.................] - ETA: 1:09 - loss: 1.7427 - regression_loss: 1.4420 - classification_loss: 0.3007 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.7439 - regression_loss: 1.4432 - classification_loss: 0.3008 223/500 [============>.................] - ETA: 1:09 - loss: 1.7457 - regression_loss: 1.4444 - classification_loss: 0.3012 224/500 [============>.................] - ETA: 1:08 - loss: 1.7435 - regression_loss: 1.4426 - classification_loss: 0.3008 225/500 [============>.................] - ETA: 1:08 - loss: 1.7446 - regression_loss: 1.4440 - classification_loss: 0.3006 226/500 [============>.................] - ETA: 1:08 - loss: 1.7412 - regression_loss: 1.4408 - classification_loss: 0.3004 227/500 [============>.................] - ETA: 1:08 - loss: 1.7382 - regression_loss: 1.4381 - classification_loss: 0.3001 228/500 [============>.................] - ETA: 1:07 - loss: 1.7382 - regression_loss: 1.4383 - classification_loss: 0.3000 229/500 [============>.................] - ETA: 1:07 - loss: 1.7381 - regression_loss: 1.4384 - classification_loss: 0.2997 230/500 [============>.................] - ETA: 1:07 - loss: 1.7409 - regression_loss: 1.4408 - classification_loss: 0.3001 231/500 [============>.................] - ETA: 1:07 - loss: 1.7405 - regression_loss: 1.4405 - classification_loss: 0.3000 232/500 [============>.................] - ETA: 1:06 - loss: 1.7352 - regression_loss: 1.4360 - classification_loss: 0.2993 233/500 [============>.................] - ETA: 1:06 - loss: 1.7340 - regression_loss: 1.4351 - classification_loss: 0.2989 234/500 [=============>................] - ETA: 1:06 - loss: 1.7289 - regression_loss: 1.4308 - classification_loss: 0.2981 235/500 [=============>................] - ETA: 1:06 - loss: 1.7287 - regression_loss: 1.4305 - classification_loss: 0.2982 236/500 [=============>................] - ETA: 1:05 - loss: 1.7299 - regression_loss: 1.4315 - classification_loss: 0.2983 237/500 [=============>................] - ETA: 1:05 - loss: 1.7296 - regression_loss: 1.4313 - classification_loss: 0.2983 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.7270 - regression_loss: 1.4292 - classification_loss: 0.2978 239/500 [=============>................] - ETA: 1:05 - loss: 1.7263 - regression_loss: 1.4286 - classification_loss: 0.2977 240/500 [=============>................] - ETA: 1:04 - loss: 1.7259 - regression_loss: 1.4284 - classification_loss: 0.2975 241/500 [=============>................] - ETA: 1:04 - loss: 1.7257 - regression_loss: 1.4281 - classification_loss: 0.2976 242/500 [=============>................] - ETA: 1:04 - loss: 1.7273 - regression_loss: 1.4294 - classification_loss: 0.2979 243/500 [=============>................] - ETA: 1:04 - loss: 1.7284 - regression_loss: 1.4297 - classification_loss: 0.2986 244/500 [=============>................] - ETA: 1:03 - loss: 1.7306 - regression_loss: 1.4311 - classification_loss: 0.2995 245/500 [=============>................] - ETA: 1:03 - loss: 1.7269 - regression_loss: 1.4280 - classification_loss: 0.2989 246/500 [=============>................] - ETA: 1:03 - loss: 1.7275 - regression_loss: 1.4285 - classification_loss: 0.2990 247/500 [=============>................] - ETA: 1:03 - loss: 1.7287 - regression_loss: 1.4295 - classification_loss: 0.2993 248/500 [=============>................] - ETA: 1:02 - loss: 1.7302 - regression_loss: 1.4295 - classification_loss: 0.3007 249/500 [=============>................] - ETA: 1:02 - loss: 1.7301 - regression_loss: 1.4295 - classification_loss: 0.3006 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7314 - regression_loss: 1.4301 - classification_loss: 0.3012 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7333 - regression_loss: 1.4318 - classification_loss: 0.3014 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7322 - regression_loss: 1.4310 - classification_loss: 0.3012 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7357 - regression_loss: 1.4338 - classification_loss: 0.3018 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.7350 - regression_loss: 1.4334 - classification_loss: 0.3016 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7351 - regression_loss: 1.4339 - classification_loss: 0.3012 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7370 - regression_loss: 1.4353 - classification_loss: 0.3017 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7367 - regression_loss: 1.4351 - classification_loss: 0.3016 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7370 - regression_loss: 1.4356 - classification_loss: 0.3015 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7384 - regression_loss: 1.4341 - classification_loss: 0.3043 260/500 [==============>...............] - ETA: 59s - loss: 1.7396 - regression_loss: 1.4353 - classification_loss: 0.3043  261/500 [==============>...............] - ETA: 59s - loss: 1.7399 - regression_loss: 1.4355 - classification_loss: 0.3043 262/500 [==============>...............] - ETA: 59s - loss: 1.7392 - regression_loss: 1.4344 - classification_loss: 0.3048 263/500 [==============>...............] - ETA: 59s - loss: 1.7402 - regression_loss: 1.4352 - classification_loss: 0.3050 264/500 [==============>...............] - ETA: 58s - loss: 1.7395 - regression_loss: 1.4347 - classification_loss: 0.3047 265/500 [==============>...............] - ETA: 58s - loss: 1.7397 - regression_loss: 1.4348 - classification_loss: 0.3049 266/500 [==============>...............] - ETA: 58s - loss: 1.7407 - regression_loss: 1.4358 - classification_loss: 0.3050 267/500 [===============>..............] - ETA: 58s - loss: 1.7396 - regression_loss: 1.4350 - classification_loss: 0.3046 268/500 [===============>..............] - ETA: 57s - loss: 1.7411 - regression_loss: 1.4353 - classification_loss: 0.3058 269/500 [===============>..............] - ETA: 57s - loss: 1.7420 - regression_loss: 1.4360 - classification_loss: 0.3060 270/500 [===============>..............] 
[per-batch progress-bar updates for batches 271-499 of epoch 67 elided; running loss ~1.73-1.75]
500/500 [==============================] - 125s 250ms/step - loss: 1.7526 - regression_loss: 1.4506 - classification_loss: 0.3019
1172 instances of class plum with average precision: 0.5858
mAP: 0.5858
Epoch 00067: saving model to ./training/snapshots/resnet50_pascal_67.h5
Epoch 68/150
[per-batch progress-bar updates for batches 1-8 elided]
[per-batch progress-bar updates for batches 9-105 elided]
- ETA: 1:38 - loss: 1.7398 - regression_loss: 1.4320 - classification_loss: 0.3078 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7380 - regression_loss: 1.4309 - classification_loss: 0.3071 107/500 [=====>........................] - ETA: 1:37 - loss: 1.7391 - regression_loss: 1.4320 - classification_loss: 0.3071 108/500 [=====>........................] - ETA: 1:37 - loss: 1.7432 - regression_loss: 1.4349 - classification_loss: 0.3083 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7377 - regression_loss: 1.4310 - classification_loss: 0.3067 110/500 [=====>........................] - ETA: 1:37 - loss: 1.7430 - regression_loss: 1.4360 - classification_loss: 0.3071 111/500 [=====>........................] - ETA: 1:36 - loss: 1.7447 - regression_loss: 1.4375 - classification_loss: 0.3072 112/500 [=====>........................] - ETA: 1:36 - loss: 1.7442 - regression_loss: 1.4378 - classification_loss: 0.3065 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7475 - regression_loss: 1.4409 - classification_loss: 0.3066 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7479 - regression_loss: 1.4415 - classification_loss: 0.3064 115/500 [=====>........................] - ETA: 1:35 - loss: 1.7434 - regression_loss: 1.4377 - classification_loss: 0.3057 116/500 [=====>........................] - ETA: 1:35 - loss: 1.7427 - regression_loss: 1.4374 - classification_loss: 0.3053 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7449 - regression_loss: 1.4395 - classification_loss: 0.3054 118/500 [======>.......................] - ETA: 1:34 - loss: 1.7507 - regression_loss: 1.4437 - classification_loss: 0.3070 119/500 [======>.......................] - ETA: 1:34 - loss: 1.7542 - regression_loss: 1.4466 - classification_loss: 0.3075 120/500 [======>.......................] - ETA: 1:34 - loss: 1.7473 - regression_loss: 1.4405 - classification_loss: 0.3068 121/500 [======>.......................] 
- ETA: 1:34 - loss: 1.7417 - regression_loss: 1.4362 - classification_loss: 0.3055 122/500 [======>.......................] - ETA: 1:33 - loss: 1.7392 - regression_loss: 1.4334 - classification_loss: 0.3058 123/500 [======>.......................] - ETA: 1:33 - loss: 1.7436 - regression_loss: 1.4371 - classification_loss: 0.3065 124/500 [======>.......................] - ETA: 1:33 - loss: 1.7397 - regression_loss: 1.4342 - classification_loss: 0.3055 125/500 [======>.......................] - ETA: 1:32 - loss: 1.7425 - regression_loss: 1.4366 - classification_loss: 0.3059 126/500 [======>.......................] - ETA: 1:32 - loss: 1.7437 - regression_loss: 1.4376 - classification_loss: 0.3061 127/500 [======>.......................] - ETA: 1:32 - loss: 1.7360 - regression_loss: 1.4319 - classification_loss: 0.3041 128/500 [======>.......................] - ETA: 1:32 - loss: 1.7400 - regression_loss: 1.4353 - classification_loss: 0.3047 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7390 - regression_loss: 1.4348 - classification_loss: 0.3042 130/500 [======>.......................] - ETA: 1:31 - loss: 1.7399 - regression_loss: 1.4361 - classification_loss: 0.3038 131/500 [======>.......................] - ETA: 1:31 - loss: 1.7320 - regression_loss: 1.4297 - classification_loss: 0.3023 132/500 [======>.......................] - ETA: 1:31 - loss: 1.7359 - regression_loss: 1.4330 - classification_loss: 0.3029 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7323 - regression_loss: 1.4301 - classification_loss: 0.3022 134/500 [=======>......................] - ETA: 1:30 - loss: 1.7323 - regression_loss: 1.4303 - classification_loss: 0.3020 135/500 [=======>......................] - ETA: 1:30 - loss: 1.7252 - regression_loss: 1.4247 - classification_loss: 0.3005 136/500 [=======>......................] - ETA: 1:30 - loss: 1.7252 - regression_loss: 1.4248 - classification_loss: 0.3004 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.7281 - regression_loss: 1.4272 - classification_loss: 0.3009 138/500 [=======>......................] - ETA: 1:29 - loss: 1.7305 - regression_loss: 1.4295 - classification_loss: 0.3011 139/500 [=======>......................] - ETA: 1:29 - loss: 1.7275 - regression_loss: 1.4268 - classification_loss: 0.3007 140/500 [=======>......................] - ETA: 1:29 - loss: 1.7276 - regression_loss: 1.4272 - classification_loss: 0.3004 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7278 - regression_loss: 1.4283 - classification_loss: 0.2995 142/500 [=======>......................] - ETA: 1:28 - loss: 1.7306 - regression_loss: 1.4307 - classification_loss: 0.2999 143/500 [=======>......................] - ETA: 1:28 - loss: 1.7314 - regression_loss: 1.4319 - classification_loss: 0.2995 144/500 [=======>......................] - ETA: 1:28 - loss: 1.7305 - regression_loss: 1.4312 - classification_loss: 0.2994 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7342 - regression_loss: 1.4343 - classification_loss: 0.2999 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7339 - regression_loss: 1.4341 - classification_loss: 0.2998 147/500 [=======>......................] - ETA: 1:27 - loss: 1.7314 - regression_loss: 1.4318 - classification_loss: 0.2996 148/500 [=======>......................] - ETA: 1:27 - loss: 1.7355 - regression_loss: 1.4356 - classification_loss: 0.2999 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7344 - regression_loss: 1.4350 - classification_loss: 0.2994 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7342 - regression_loss: 1.4350 - classification_loss: 0.2992 151/500 [========>.....................] - ETA: 1:26 - loss: 1.7349 - regression_loss: 1.4359 - classification_loss: 0.2990 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7369 - regression_loss: 1.4382 - classification_loss: 0.2987 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.7359 - regression_loss: 1.4377 - classification_loss: 0.2982 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7339 - regression_loss: 1.4361 - classification_loss: 0.2978 155/500 [========>.....................] - ETA: 1:25 - loss: 1.7353 - regression_loss: 1.4376 - classification_loss: 0.2977 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7343 - regression_loss: 1.4372 - classification_loss: 0.2972 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7351 - regression_loss: 1.4379 - classification_loss: 0.2972 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7442 - regression_loss: 1.4447 - classification_loss: 0.2995 159/500 [========>.....................] - ETA: 1:24 - loss: 1.7460 - regression_loss: 1.4454 - classification_loss: 0.3006 160/500 [========>.....................] - ETA: 1:24 - loss: 1.7469 - regression_loss: 1.4462 - classification_loss: 0.3007 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7488 - regression_loss: 1.4478 - classification_loss: 0.3011 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7526 - regression_loss: 1.4504 - classification_loss: 0.3022 163/500 [========>.....................] - ETA: 1:23 - loss: 1.7547 - regression_loss: 1.4520 - classification_loss: 0.3027 164/500 [========>.....................] - ETA: 1:23 - loss: 1.7517 - regression_loss: 1.4494 - classification_loss: 0.3023 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7524 - regression_loss: 1.4500 - classification_loss: 0.3025 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7550 - regression_loss: 1.4523 - classification_loss: 0.3027 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7522 - regression_loss: 1.4504 - classification_loss: 0.3018 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7542 - regression_loss: 1.4521 - classification_loss: 0.3022 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.7538 - regression_loss: 1.4516 - classification_loss: 0.3022 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7535 - regression_loss: 1.4516 - classification_loss: 0.3019 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7543 - regression_loss: 1.4526 - classification_loss: 0.3018 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7517 - regression_loss: 1.4506 - classification_loss: 0.3011 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7561 - regression_loss: 1.4538 - classification_loss: 0.3023 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7556 - regression_loss: 1.4534 - classification_loss: 0.3022 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7544 - regression_loss: 1.4526 - classification_loss: 0.3018 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7560 - regression_loss: 1.4542 - classification_loss: 0.3017 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7566 - regression_loss: 1.4548 - classification_loss: 0.3018 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7581 - regression_loss: 1.4556 - classification_loss: 0.3025 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7600 - regression_loss: 1.4568 - classification_loss: 0.3032 180/500 [=========>....................] - ETA: 1:19 - loss: 1.7564 - regression_loss: 1.4546 - classification_loss: 0.3018 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7602 - regression_loss: 1.4575 - classification_loss: 0.3027 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7634 - regression_loss: 1.4598 - classification_loss: 0.3036 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7656 - regression_loss: 1.4613 - classification_loss: 0.3043 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7613 - regression_loss: 1.4577 - classification_loss: 0.3035 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.7630 - regression_loss: 1.4593 - classification_loss: 0.3037 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7625 - regression_loss: 1.4593 - classification_loss: 0.3032 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7630 - regression_loss: 1.4590 - classification_loss: 0.3040 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7638 - regression_loss: 1.4596 - classification_loss: 0.3042 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7598 - regression_loss: 1.4562 - classification_loss: 0.3036 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7573 - regression_loss: 1.4540 - classification_loss: 0.3033 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7524 - regression_loss: 1.4496 - classification_loss: 0.3028 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7553 - regression_loss: 1.4522 - classification_loss: 0.3031 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7514 - regression_loss: 1.4490 - classification_loss: 0.3023 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7469 - regression_loss: 1.4454 - classification_loss: 0.3014 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7456 - regression_loss: 1.4444 - classification_loss: 0.3012 196/500 [==========>...................] - ETA: 1:15 - loss: 1.7485 - regression_loss: 1.4464 - classification_loss: 0.3021 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7488 - regression_loss: 1.4465 - classification_loss: 0.3024 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7498 - regression_loss: 1.4474 - classification_loss: 0.3024 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7511 - regression_loss: 1.4483 - classification_loss: 0.3028 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7482 - regression_loss: 1.4460 - classification_loss: 0.3022 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.7503 - regression_loss: 1.4476 - classification_loss: 0.3027 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7516 - regression_loss: 1.4486 - classification_loss: 0.3030 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7522 - regression_loss: 1.4492 - classification_loss: 0.3030 204/500 [===========>..................] - ETA: 1:13 - loss: 1.7530 - regression_loss: 1.4498 - classification_loss: 0.3032 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7558 - regression_loss: 1.4519 - classification_loss: 0.3039 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7542 - regression_loss: 1.4507 - classification_loss: 0.3034 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7551 - regression_loss: 1.4517 - classification_loss: 0.3034 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7536 - regression_loss: 1.4507 - classification_loss: 0.3029 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7534 - regression_loss: 1.4509 - classification_loss: 0.3026 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7520 - regression_loss: 1.4487 - classification_loss: 0.3034 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7532 - regression_loss: 1.4495 - classification_loss: 0.3037 212/500 [===========>..................] - ETA: 1:12 - loss: 1.7510 - regression_loss: 1.4473 - classification_loss: 0.3037 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7509 - regression_loss: 1.4472 - classification_loss: 0.3037 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7501 - regression_loss: 1.4458 - classification_loss: 0.3043 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7526 - regression_loss: 1.4481 - classification_loss: 0.3045 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7582 - regression_loss: 1.4497 - classification_loss: 0.3084 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.7590 - regression_loss: 1.4500 - classification_loss: 0.3090 218/500 [============>.................] - ETA: 1:10 - loss: 1.7617 - regression_loss: 1.4519 - classification_loss: 0.3098 219/500 [============>.................] - ETA: 1:10 - loss: 1.7565 - regression_loss: 1.4478 - classification_loss: 0.3088 220/500 [============>.................] - ETA: 1:10 - loss: 1.7569 - regression_loss: 1.4481 - classification_loss: 0.3088 221/500 [============>.................] - ETA: 1:09 - loss: 1.7568 - regression_loss: 1.4482 - classification_loss: 0.3086 222/500 [============>.................] - ETA: 1:09 - loss: 1.7562 - regression_loss: 1.4477 - classification_loss: 0.3086 223/500 [============>.................] - ETA: 1:09 - loss: 1.7573 - regression_loss: 1.4482 - classification_loss: 0.3091 224/500 [============>.................] - ETA: 1:09 - loss: 1.7575 - regression_loss: 1.4483 - classification_loss: 0.3091 225/500 [============>.................] - ETA: 1:08 - loss: 1.7571 - regression_loss: 1.4482 - classification_loss: 0.3089 226/500 [============>.................] - ETA: 1:08 - loss: 1.7562 - regression_loss: 1.4475 - classification_loss: 0.3087 227/500 [============>.................] - ETA: 1:08 - loss: 1.7567 - regression_loss: 1.4480 - classification_loss: 0.3087 228/500 [============>.................] - ETA: 1:08 - loss: 1.7543 - regression_loss: 1.4463 - classification_loss: 0.3081 229/500 [============>.................] - ETA: 1:07 - loss: 1.7520 - regression_loss: 1.4447 - classification_loss: 0.3073 230/500 [============>.................] - ETA: 1:07 - loss: 1.7525 - regression_loss: 1.4451 - classification_loss: 0.3074 231/500 [============>.................] - ETA: 1:07 - loss: 1.7561 - regression_loss: 1.4477 - classification_loss: 0.3083 232/500 [============>.................] - ETA: 1:07 - loss: 1.7537 - regression_loss: 1.4457 - classification_loss: 0.3080 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.7524 - regression_loss: 1.4450 - classification_loss: 0.3074 234/500 [=============>................] - ETA: 1:06 - loss: 1.7494 - regression_loss: 1.4425 - classification_loss: 0.3069 235/500 [=============>................] - ETA: 1:06 - loss: 1.7460 - regression_loss: 1.4396 - classification_loss: 0.3063 236/500 [=============>................] - ETA: 1:06 - loss: 1.7461 - regression_loss: 1.4399 - classification_loss: 0.3062 237/500 [=============>................] - ETA: 1:05 - loss: 1.7454 - regression_loss: 1.4395 - classification_loss: 0.3059 238/500 [=============>................] - ETA: 1:05 - loss: 1.7442 - regression_loss: 1.4387 - classification_loss: 0.3055 239/500 [=============>................] - ETA: 1:05 - loss: 1.7468 - regression_loss: 1.4413 - classification_loss: 0.3055 240/500 [=============>................] - ETA: 1:05 - loss: 1.7463 - regression_loss: 1.4411 - classification_loss: 0.3051 241/500 [=============>................] - ETA: 1:04 - loss: 1.7449 - regression_loss: 1.4401 - classification_loss: 0.3048 242/500 [=============>................] - ETA: 1:04 - loss: 1.7451 - regression_loss: 1.4405 - classification_loss: 0.3046 243/500 [=============>................] - ETA: 1:04 - loss: 1.7473 - regression_loss: 1.4418 - classification_loss: 0.3055 244/500 [=============>................] - ETA: 1:04 - loss: 1.7472 - regression_loss: 1.4419 - classification_loss: 0.3054 245/500 [=============>................] - ETA: 1:03 - loss: 1.7445 - regression_loss: 1.4400 - classification_loss: 0.3045 246/500 [=============>................] - ETA: 1:03 - loss: 1.7452 - regression_loss: 1.4407 - classification_loss: 0.3045 247/500 [=============>................] - ETA: 1:03 - loss: 1.7462 - regression_loss: 1.4417 - classification_loss: 0.3045 248/500 [=============>................] - ETA: 1:03 - loss: 1.7432 - regression_loss: 1.4390 - classification_loss: 0.3042 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.7424 - regression_loss: 1.4385 - classification_loss: 0.3038 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7407 - regression_loss: 1.4372 - classification_loss: 0.3036 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7414 - regression_loss: 1.4378 - classification_loss: 0.3036 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7416 - regression_loss: 1.4383 - classification_loss: 0.3034 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7414 - regression_loss: 1.4381 - classification_loss: 0.3033 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7397 - regression_loss: 1.4368 - classification_loss: 0.3029 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7421 - regression_loss: 1.4386 - classification_loss: 0.3035 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7433 - regression_loss: 1.4399 - classification_loss: 0.3034 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7400 - regression_loss: 1.4373 - classification_loss: 0.3027 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7380 - regression_loss: 1.4353 - classification_loss: 0.3026 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7360 - regression_loss: 1.4338 - classification_loss: 0.3022 260/500 [==============>...............] - ETA: 1:00 - loss: 1.7361 - regression_loss: 1.4339 - classification_loss: 0.3022 261/500 [==============>...............] - ETA: 59s - loss: 1.7366 - regression_loss: 1.4343 - classification_loss: 0.3023  262/500 [==============>...............] - ETA: 59s - loss: 1.7353 - regression_loss: 1.4329 - classification_loss: 0.3024 263/500 [==============>...............] - ETA: 59s - loss: 1.7353 - regression_loss: 1.4330 - classification_loss: 0.3023 264/500 [==============>...............] - ETA: 59s - loss: 1.7359 - regression_loss: 1.4337 - classification_loss: 0.3022 265/500 [==============>...............] 
- ETA: 58s - loss: 1.7329 - regression_loss: 1.4315 - classification_loss: 0.3014 266/500 [==============>...............] - ETA: 58s - loss: 1.7339 - regression_loss: 1.4324 - classification_loss: 0.3015 267/500 [===============>..............] - ETA: 58s - loss: 1.7348 - regression_loss: 1.4329 - classification_loss: 0.3019 268/500 [===============>..............] - ETA: 58s - loss: 1.7353 - regression_loss: 1.4333 - classification_loss: 0.3020 269/500 [===============>..............] - ETA: 57s - loss: 1.7342 - regression_loss: 1.4327 - classification_loss: 0.3015 270/500 [===============>..............] - ETA: 57s - loss: 1.7346 - regression_loss: 1.4331 - classification_loss: 0.3015 271/500 [===============>..............] - ETA: 57s - loss: 1.7353 - regression_loss: 1.4337 - classification_loss: 0.3016 272/500 [===============>..............] - ETA: 57s - loss: 1.7374 - regression_loss: 1.4351 - classification_loss: 0.3023 273/500 [===============>..............] - ETA: 56s - loss: 1.7356 - regression_loss: 1.4336 - classification_loss: 0.3020 274/500 [===============>..............] - ETA: 56s - loss: 1.7347 - regression_loss: 1.4329 - classification_loss: 0.3018 275/500 [===============>..............] - ETA: 56s - loss: 1.7342 - regression_loss: 1.4325 - classification_loss: 0.3016 276/500 [===============>..............] - ETA: 56s - loss: 1.7352 - regression_loss: 1.4335 - classification_loss: 0.3017 277/500 [===============>..............] - ETA: 55s - loss: 1.7365 - regression_loss: 1.4346 - classification_loss: 0.3019 278/500 [===============>..............] - ETA: 55s - loss: 1.7381 - regression_loss: 1.4357 - classification_loss: 0.3025 279/500 [===============>..............] - ETA: 55s - loss: 1.7366 - regression_loss: 1.4347 - classification_loss: 0.3019 280/500 [===============>..............] - ETA: 55s - loss: 1.7369 - regression_loss: 1.4350 - classification_loss: 0.3019 281/500 [===============>..............] 
- ETA: 54s - loss: 1.7361 - regression_loss: 1.4341 - classification_loss: 0.3020 282/500 [===============>..............] - ETA: 54s - loss: 1.7338 - regression_loss: 1.4322 - classification_loss: 0.3016 283/500 [===============>..............] - ETA: 54s - loss: 1.7325 - regression_loss: 1.4312 - classification_loss: 0.3012 284/500 [================>.............] - ETA: 54s - loss: 1.7321 - regression_loss: 1.4308 - classification_loss: 0.3014 285/500 [================>.............] - ETA: 53s - loss: 1.7326 - regression_loss: 1.4312 - classification_loss: 0.3014 286/500 [================>.............] - ETA: 53s - loss: 1.7319 - regression_loss: 1.4307 - classification_loss: 0.3011 287/500 [================>.............] - ETA: 53s - loss: 1.7328 - regression_loss: 1.4315 - classification_loss: 0.3013 288/500 [================>.............] - ETA: 53s - loss: 1.7328 - regression_loss: 1.4315 - classification_loss: 0.3013 289/500 [================>.............] - ETA: 52s - loss: 1.7348 - regression_loss: 1.4333 - classification_loss: 0.3015 290/500 [================>.............] - ETA: 52s - loss: 1.7345 - regression_loss: 1.4328 - classification_loss: 0.3016 291/500 [================>.............] - ETA: 52s - loss: 1.7327 - regression_loss: 1.4314 - classification_loss: 0.3012 292/500 [================>.............] - ETA: 52s - loss: 1.7341 - regression_loss: 1.4330 - classification_loss: 0.3011 293/500 [================>.............] - ETA: 51s - loss: 1.7348 - regression_loss: 1.4338 - classification_loss: 0.3011 294/500 [================>.............] - ETA: 51s - loss: 1.7338 - regression_loss: 1.4331 - classification_loss: 0.3008 295/500 [================>.............] - ETA: 51s - loss: 1.7355 - regression_loss: 1.4343 - classification_loss: 0.3012 296/500 [================>.............] - ETA: 51s - loss: 1.7348 - regression_loss: 1.4339 - classification_loss: 0.3010 297/500 [================>.............] 
- ETA: 50s - loss: 1.7354 - regression_loss: 1.4342 - classification_loss: 0.3011 298/500 [================>.............] - ETA: 50s - loss: 1.7348 - regression_loss: 1.4339 - classification_loss: 0.3009 299/500 [================>.............] - ETA: 50s - loss: 1.7354 - regression_loss: 1.4346 - classification_loss: 0.3008 300/500 [=================>............] - ETA: 50s - loss: 1.7345 - regression_loss: 1.4338 - classification_loss: 0.3008 301/500 [=================>............] - ETA: 49s - loss: 1.7339 - regression_loss: 1.4333 - classification_loss: 0.3006 302/500 [=================>............] - ETA: 49s - loss: 1.7341 - regression_loss: 1.4336 - classification_loss: 0.3005 303/500 [=================>............] - ETA: 49s - loss: 1.7359 - regression_loss: 1.4349 - classification_loss: 0.3010 304/500 [=================>............] - ETA: 49s - loss: 1.7358 - regression_loss: 1.4349 - classification_loss: 0.3009 305/500 [=================>............] - ETA: 48s - loss: 1.7349 - regression_loss: 1.4342 - classification_loss: 0.3007 306/500 [=================>............] - ETA: 48s - loss: 1.7361 - regression_loss: 1.4351 - classification_loss: 0.3010 307/500 [=================>............] - ETA: 48s - loss: 1.7357 - regression_loss: 1.4349 - classification_loss: 0.3008 308/500 [=================>............] - ETA: 48s - loss: 1.7365 - regression_loss: 1.4357 - classification_loss: 0.3008 309/500 [=================>............] - ETA: 47s - loss: 1.7386 - regression_loss: 1.4374 - classification_loss: 0.3012 310/500 [=================>............] - ETA: 47s - loss: 1.7386 - regression_loss: 1.4372 - classification_loss: 0.3015 311/500 [=================>............] - ETA: 47s - loss: 1.7412 - regression_loss: 1.4387 - classification_loss: 0.3025 312/500 [=================>............] - ETA: 47s - loss: 1.7410 - regression_loss: 1.4386 - classification_loss: 0.3024 313/500 [=================>............] 
[per-batch progress updates for steps 313-499 of epoch 68 elided]
500/500 [==============================] - 125s 250ms/step - loss: 1.7322 - regression_loss: 1.4303 - classification_loss: 0.3020
1172 instances of class plum with average precision: 0.6329
mAP: 0.6329
Epoch 00068: saving model to ./training/snapshots/resnet50_pascal_68.h5
Epoch 69/150
[per-batch progress updates for steps 1-148 of epoch 69 elided]
- ETA: 1:27 - loss: 1.7258 - regression_loss: 1.4381 - classification_loss: 0.2877 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7282 - regression_loss: 1.4398 - classification_loss: 0.2883 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7289 - regression_loss: 1.4406 - classification_loss: 0.2884 151/500 [========>.....................] - ETA: 1:27 - loss: 1.7247 - regression_loss: 1.4373 - classification_loss: 0.2875 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7203 - regression_loss: 1.4335 - classification_loss: 0.2868 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7203 - regression_loss: 1.4335 - classification_loss: 0.2868 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7223 - regression_loss: 1.4356 - classification_loss: 0.2866 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7231 - regression_loss: 1.4367 - classification_loss: 0.2864 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7284 - regression_loss: 1.4411 - classification_loss: 0.2873 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7293 - regression_loss: 1.4416 - classification_loss: 0.2876 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7302 - regression_loss: 1.4424 - classification_loss: 0.2878 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7297 - regression_loss: 1.4415 - classification_loss: 0.2882 160/500 [========>.....................] - ETA: 1:24 - loss: 1.7310 - regression_loss: 1.4428 - classification_loss: 0.2882 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7303 - regression_loss: 1.4424 - classification_loss: 0.2879 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7309 - regression_loss: 1.4431 - classification_loss: 0.2879 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7258 - regression_loss: 1.4385 - classification_loss: 0.2873 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.7311 - regression_loss: 1.4432 - classification_loss: 0.2879 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7254 - regression_loss: 1.4388 - classification_loss: 0.2866 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7255 - regression_loss: 1.4387 - classification_loss: 0.2868 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7211 - regression_loss: 1.4351 - classification_loss: 0.2860 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7216 - regression_loss: 1.4359 - classification_loss: 0.2858 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7201 - regression_loss: 1.4347 - classification_loss: 0.2854 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7174 - regression_loss: 1.4322 - classification_loss: 0.2852 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7187 - regression_loss: 1.4334 - classification_loss: 0.2853 172/500 [=========>....................] - ETA: 1:21 - loss: 1.7182 - regression_loss: 1.4327 - classification_loss: 0.2855 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7194 - regression_loss: 1.4341 - classification_loss: 0.2854 174/500 [=========>....................] - ETA: 1:21 - loss: 1.7207 - regression_loss: 1.4353 - classification_loss: 0.2854 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7191 - regression_loss: 1.4345 - classification_loss: 0.2846 176/500 [=========>....................] - ETA: 1:20 - loss: 1.7210 - regression_loss: 1.4362 - classification_loss: 0.2849 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7196 - regression_loss: 1.4329 - classification_loss: 0.2868 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7218 - regression_loss: 1.4345 - classification_loss: 0.2873 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7206 - regression_loss: 1.4323 - classification_loss: 0.2883 180/500 [=========>....................] 
- ETA: 1:19 - loss: 1.7214 - regression_loss: 1.4332 - classification_loss: 0.2882 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7237 - regression_loss: 1.4351 - classification_loss: 0.2886 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7235 - regression_loss: 1.4350 - classification_loss: 0.2885 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7251 - regression_loss: 1.4364 - classification_loss: 0.2887 184/500 [==========>...................] - ETA: 1:18 - loss: 1.7224 - regression_loss: 1.4343 - classification_loss: 0.2881 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7227 - regression_loss: 1.4345 - classification_loss: 0.2882 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7252 - regression_loss: 1.4362 - classification_loss: 0.2890 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7251 - regression_loss: 1.4356 - classification_loss: 0.2895 188/500 [==========>...................] - ETA: 1:17 - loss: 1.7208 - regression_loss: 1.4314 - classification_loss: 0.2893 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7198 - regression_loss: 1.4305 - classification_loss: 0.2893 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7212 - regression_loss: 1.4313 - classification_loss: 0.2899 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7181 - regression_loss: 1.4285 - classification_loss: 0.2896 192/500 [==========>...................] - ETA: 1:16 - loss: 1.7165 - regression_loss: 1.4272 - classification_loss: 0.2893 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7109 - regression_loss: 1.4228 - classification_loss: 0.2880 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7131 - regression_loss: 1.4245 - classification_loss: 0.2886 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7134 - regression_loss: 1.4244 - classification_loss: 0.2889 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.7098 - regression_loss: 1.4215 - classification_loss: 0.2883 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7112 - regression_loss: 1.4228 - classification_loss: 0.2884 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7158 - regression_loss: 1.4260 - classification_loss: 0.2898 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7168 - regression_loss: 1.4271 - classification_loss: 0.2896 200/500 [===========>..................] - ETA: 1:14 - loss: 1.7164 - regression_loss: 1.4269 - classification_loss: 0.2896 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7172 - regression_loss: 1.4276 - classification_loss: 0.2896 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7121 - regression_loss: 1.4234 - classification_loss: 0.2887 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7142 - regression_loss: 1.4250 - classification_loss: 0.2892 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7185 - regression_loss: 1.4284 - classification_loss: 0.2901 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7178 - regression_loss: 1.4282 - classification_loss: 0.2896 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7182 - regression_loss: 1.4284 - classification_loss: 0.2897 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7175 - regression_loss: 1.4279 - classification_loss: 0.2896 208/500 [===========>..................] - ETA: 1:12 - loss: 1.7197 - regression_loss: 1.4293 - classification_loss: 0.2904 209/500 [===========>..................] - ETA: 1:12 - loss: 1.7259 - regression_loss: 1.4335 - classification_loss: 0.2924 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7236 - regression_loss: 1.4315 - classification_loss: 0.2921 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7226 - regression_loss: 1.4310 - classification_loss: 0.2916 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.7235 - regression_loss: 1.4316 - classification_loss: 0.2919 213/500 [===========>..................] - ETA: 1:11 - loss: 1.7197 - regression_loss: 1.4285 - classification_loss: 0.2911 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7198 - regression_loss: 1.4284 - classification_loss: 0.2914 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7219 - regression_loss: 1.4303 - classification_loss: 0.2916 216/500 [===========>..................] - ETA: 1:10 - loss: 1.7205 - regression_loss: 1.4293 - classification_loss: 0.2912 217/500 [============>.................] - ETA: 1:10 - loss: 1.7186 - regression_loss: 1.4274 - classification_loss: 0.2912 218/500 [============>.................] - ETA: 1:10 - loss: 1.7199 - regression_loss: 1.4286 - classification_loss: 0.2912 219/500 [============>.................] - ETA: 1:10 - loss: 1.7189 - regression_loss: 1.4281 - classification_loss: 0.2908 220/500 [============>.................] - ETA: 1:09 - loss: 1.7209 - regression_loss: 1.4298 - classification_loss: 0.2910 221/500 [============>.................] - ETA: 1:09 - loss: 1.7206 - regression_loss: 1.4298 - classification_loss: 0.2909 222/500 [============>.................] - ETA: 1:09 - loss: 1.7223 - regression_loss: 1.4313 - classification_loss: 0.2909 223/500 [============>.................] - ETA: 1:09 - loss: 1.7175 - regression_loss: 1.4276 - classification_loss: 0.2899 224/500 [============>.................] - ETA: 1:09 - loss: 1.7174 - regression_loss: 1.4281 - classification_loss: 0.2893 225/500 [============>.................] - ETA: 1:08 - loss: 1.7172 - regression_loss: 1.4279 - classification_loss: 0.2892 226/500 [============>.................] - ETA: 1:08 - loss: 1.7176 - regression_loss: 1.4286 - classification_loss: 0.2890 227/500 [============>.................] - ETA: 1:08 - loss: 1.7182 - regression_loss: 1.4290 - classification_loss: 0.2892 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.7197 - regression_loss: 1.4301 - classification_loss: 0.2895 229/500 [============>.................] - ETA: 1:07 - loss: 1.7212 - regression_loss: 1.4309 - classification_loss: 0.2903 230/500 [============>.................] - ETA: 1:07 - loss: 1.7239 - regression_loss: 1.4328 - classification_loss: 0.2911 231/500 [============>.................] - ETA: 1:07 - loss: 1.7225 - regression_loss: 1.4316 - classification_loss: 0.2908 232/500 [============>.................] - ETA: 1:07 - loss: 1.7177 - regression_loss: 1.4279 - classification_loss: 0.2899 233/500 [============>.................] - ETA: 1:06 - loss: 1.7141 - regression_loss: 1.4249 - classification_loss: 0.2892 234/500 [=============>................] - ETA: 1:06 - loss: 1.7122 - regression_loss: 1.4227 - classification_loss: 0.2895 235/500 [=============>................] - ETA: 1:06 - loss: 1.7137 - regression_loss: 1.4241 - classification_loss: 0.2896 236/500 [=============>................] - ETA: 1:06 - loss: 1.7145 - regression_loss: 1.4248 - classification_loss: 0.2897 237/500 [=============>................] - ETA: 1:05 - loss: 1.7153 - regression_loss: 1.4256 - classification_loss: 0.2898 238/500 [=============>................] - ETA: 1:05 - loss: 1.7147 - regression_loss: 1.4252 - classification_loss: 0.2895 239/500 [=============>................] - ETA: 1:05 - loss: 1.7153 - regression_loss: 1.4252 - classification_loss: 0.2902 240/500 [=============>................] - ETA: 1:05 - loss: 1.7164 - regression_loss: 1.4263 - classification_loss: 0.2902 241/500 [=============>................] - ETA: 1:04 - loss: 1.7168 - regression_loss: 1.4264 - classification_loss: 0.2904 242/500 [=============>................] - ETA: 1:04 - loss: 1.7176 - regression_loss: 1.4268 - classification_loss: 0.2909 243/500 [=============>................] - ETA: 1:04 - loss: 1.7187 - regression_loss: 1.4277 - classification_loss: 0.2910 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.7187 - regression_loss: 1.4277 - classification_loss: 0.2910 245/500 [=============>................] - ETA: 1:03 - loss: 1.7187 - regression_loss: 1.4277 - classification_loss: 0.2910 246/500 [=============>................] - ETA: 1:03 - loss: 1.7193 - regression_loss: 1.4284 - classification_loss: 0.2909 247/500 [=============>................] - ETA: 1:03 - loss: 1.7200 - regression_loss: 1.4291 - classification_loss: 0.2909 248/500 [=============>................] - ETA: 1:02 - loss: 1.7176 - regression_loss: 1.4271 - classification_loss: 0.2904 249/500 [=============>................] - ETA: 1:02 - loss: 1.7165 - regression_loss: 1.4262 - classification_loss: 0.2903 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7193 - regression_loss: 1.4287 - classification_loss: 0.2906 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7173 - regression_loss: 1.4271 - classification_loss: 0.2902 252/500 [==============>...............] - ETA: 1:01 - loss: 1.7185 - regression_loss: 1.4275 - classification_loss: 0.2909 253/500 [==============>...............] - ETA: 1:01 - loss: 1.7176 - regression_loss: 1.4270 - classification_loss: 0.2907 254/500 [==============>...............] - ETA: 1:01 - loss: 1.7173 - regression_loss: 1.4265 - classification_loss: 0.2907 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7154 - regression_loss: 1.4248 - classification_loss: 0.2905 256/500 [==============>...............] - ETA: 1:00 - loss: 1.7162 - regression_loss: 1.4255 - classification_loss: 0.2907 257/500 [==============>...............] - ETA: 1:00 - loss: 1.7137 - regression_loss: 1.4231 - classification_loss: 0.2906 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7105 - regression_loss: 1.4204 - classification_loss: 0.2901 259/500 [==============>...............] - ETA: 1:00 - loss: 1.7093 - regression_loss: 1.4196 - classification_loss: 0.2898 260/500 [==============>...............] 
- ETA: 59s - loss: 1.7073 - regression_loss: 1.4176 - classification_loss: 0.2897  261/500 [==============>...............] - ETA: 59s - loss: 1.7080 - regression_loss: 1.4178 - classification_loss: 0.2901 262/500 [==============>...............] - ETA: 59s - loss: 1.7089 - regression_loss: 1.4187 - classification_loss: 0.2902 263/500 [==============>...............] - ETA: 59s - loss: 1.7099 - regression_loss: 1.4198 - classification_loss: 0.2901 264/500 [==============>...............] - ETA: 58s - loss: 1.7120 - regression_loss: 1.4210 - classification_loss: 0.2910 265/500 [==============>...............] - ETA: 58s - loss: 1.7132 - regression_loss: 1.4220 - classification_loss: 0.2912 266/500 [==============>...............] - ETA: 58s - loss: 1.7094 - regression_loss: 1.4189 - classification_loss: 0.2904 267/500 [===============>..............] - ETA: 58s - loss: 1.7109 - regression_loss: 1.4201 - classification_loss: 0.2908 268/500 [===============>..............] - ETA: 57s - loss: 1.7121 - regression_loss: 1.4211 - classification_loss: 0.2910 269/500 [===============>..............] - ETA: 57s - loss: 1.7127 - regression_loss: 1.4215 - classification_loss: 0.2912 270/500 [===============>..............] - ETA: 57s - loss: 1.7126 - regression_loss: 1.4210 - classification_loss: 0.2916 271/500 [===============>..............] - ETA: 57s - loss: 1.7103 - regression_loss: 1.4189 - classification_loss: 0.2915 272/500 [===============>..............] - ETA: 56s - loss: 1.7106 - regression_loss: 1.4193 - classification_loss: 0.2914 273/500 [===============>..............] - ETA: 56s - loss: 1.7124 - regression_loss: 1.4208 - classification_loss: 0.2916 274/500 [===============>..............] - ETA: 56s - loss: 1.7128 - regression_loss: 1.4211 - classification_loss: 0.2918 275/500 [===============>..............] - ETA: 56s - loss: 1.7098 - regression_loss: 1.4187 - classification_loss: 0.2911 276/500 [===============>..............] 
- ETA: 55s - loss: 1.7109 - regression_loss: 1.4193 - classification_loss: 0.2916 277/500 [===============>..............] - ETA: 55s - loss: 1.7127 - regression_loss: 1.4205 - classification_loss: 0.2922 278/500 [===============>..............] - ETA: 55s - loss: 1.7125 - regression_loss: 1.4204 - classification_loss: 0.2921 279/500 [===============>..............] - ETA: 55s - loss: 1.7086 - regression_loss: 1.4172 - classification_loss: 0.2914 280/500 [===============>..............] - ETA: 54s - loss: 1.7064 - regression_loss: 1.4152 - classification_loss: 0.2912 281/500 [===============>..............] - ETA: 54s - loss: 1.7056 - regression_loss: 1.4143 - classification_loss: 0.2913 282/500 [===============>..............] - ETA: 54s - loss: 1.7077 - regression_loss: 1.4160 - classification_loss: 0.2916 283/500 [===============>..............] - ETA: 54s - loss: 1.7075 - regression_loss: 1.4161 - classification_loss: 0.2915 284/500 [================>.............] - ETA: 53s - loss: 1.7064 - regression_loss: 1.4151 - classification_loss: 0.2913 285/500 [================>.............] - ETA: 53s - loss: 1.7081 - regression_loss: 1.4166 - classification_loss: 0.2915 286/500 [================>.............] - ETA: 53s - loss: 1.7083 - regression_loss: 1.4167 - classification_loss: 0.2916 287/500 [================>.............] - ETA: 53s - loss: 1.7086 - regression_loss: 1.4171 - classification_loss: 0.2915 288/500 [================>.............] - ETA: 52s - loss: 1.7096 - regression_loss: 1.4180 - classification_loss: 0.2917 289/500 [================>.............] - ETA: 52s - loss: 1.7075 - regression_loss: 1.4164 - classification_loss: 0.2911 290/500 [================>.............] - ETA: 52s - loss: 1.7062 - regression_loss: 1.4152 - classification_loss: 0.2909 291/500 [================>.............] - ETA: 52s - loss: 1.7102 - regression_loss: 1.4186 - classification_loss: 0.2917 292/500 [================>.............] 
- ETA: 51s - loss: 1.7107 - regression_loss: 1.4191 - classification_loss: 0.2917 293/500 [================>.............] - ETA: 51s - loss: 1.7118 - regression_loss: 1.4199 - classification_loss: 0.2920 294/500 [================>.............] - ETA: 51s - loss: 1.7138 - regression_loss: 1.4215 - classification_loss: 0.2924 295/500 [================>.............] - ETA: 51s - loss: 1.7143 - regression_loss: 1.4218 - classification_loss: 0.2925 296/500 [================>.............] - ETA: 50s - loss: 1.7123 - regression_loss: 1.4202 - classification_loss: 0.2921 297/500 [================>.............] - ETA: 50s - loss: 1.7113 - regression_loss: 1.4195 - classification_loss: 0.2918 298/500 [================>.............] - ETA: 50s - loss: 1.7111 - regression_loss: 1.4195 - classification_loss: 0.2916 299/500 [================>.............] - ETA: 50s - loss: 1.7116 - regression_loss: 1.4200 - classification_loss: 0.2916 300/500 [=================>............] - ETA: 49s - loss: 1.7114 - regression_loss: 1.4199 - classification_loss: 0.2915 301/500 [=================>............] - ETA: 49s - loss: 1.7092 - regression_loss: 1.4180 - classification_loss: 0.2912 302/500 [=================>............] - ETA: 49s - loss: 1.7107 - regression_loss: 1.4193 - classification_loss: 0.2915 303/500 [=================>............] - ETA: 49s - loss: 1.7115 - regression_loss: 1.4199 - classification_loss: 0.2915 304/500 [=================>............] - ETA: 48s - loss: 1.7117 - regression_loss: 1.4204 - classification_loss: 0.2913 305/500 [=================>............] - ETA: 48s - loss: 1.7122 - regression_loss: 1.4210 - classification_loss: 0.2912 306/500 [=================>............] - ETA: 48s - loss: 1.7126 - regression_loss: 1.4215 - classification_loss: 0.2911 307/500 [=================>............] - ETA: 48s - loss: 1.7121 - regression_loss: 1.4209 - classification_loss: 0.2911 308/500 [=================>............] 
- ETA: 47s - loss: 1.7124 - regression_loss: 1.4211 - classification_loss: 0.2913 309/500 [=================>............] - ETA: 47s - loss: 1.7136 - regression_loss: 1.4219 - classification_loss: 0.2916 310/500 [=================>............] - ETA: 47s - loss: 1.7134 - regression_loss: 1.4219 - classification_loss: 0.2915 311/500 [=================>............] - ETA: 47s - loss: 1.7127 - regression_loss: 1.4213 - classification_loss: 0.2914 312/500 [=================>............] - ETA: 46s - loss: 1.7128 - regression_loss: 1.4215 - classification_loss: 0.2913 313/500 [=================>............] - ETA: 46s - loss: 1.7144 - regression_loss: 1.4229 - classification_loss: 0.2915 314/500 [=================>............] - ETA: 46s - loss: 1.7144 - regression_loss: 1.4226 - classification_loss: 0.2918 315/500 [=================>............] - ETA: 46s - loss: 1.7155 - regression_loss: 1.4236 - classification_loss: 0.2919 316/500 [=================>............] - ETA: 46s - loss: 1.7167 - regression_loss: 1.4247 - classification_loss: 0.2921 317/500 [==================>...........] - ETA: 45s - loss: 1.7189 - regression_loss: 1.4260 - classification_loss: 0.2929 318/500 [==================>...........] - ETA: 45s - loss: 1.7155 - regression_loss: 1.4228 - classification_loss: 0.2927 319/500 [==================>...........] - ETA: 45s - loss: 1.7133 - regression_loss: 1.4212 - classification_loss: 0.2922 320/500 [==================>...........] - ETA: 44s - loss: 1.7154 - regression_loss: 1.4227 - classification_loss: 0.2927 321/500 [==================>...........] - ETA: 44s - loss: 1.7173 - regression_loss: 1.4243 - classification_loss: 0.2930 322/500 [==================>...........] - ETA: 44s - loss: 1.7179 - regression_loss: 1.4250 - classification_loss: 0.2930 323/500 [==================>...........] - ETA: 44s - loss: 1.7189 - regression_loss: 1.4257 - classification_loss: 0.2932 324/500 [==================>...........] 
- ETA: 43s - loss: 1.7198 - regression_loss: 1.4265 - classification_loss: 0.2932 325/500 [==================>...........] - ETA: 43s - loss: 1.7181 - regression_loss: 1.4255 - classification_loss: 0.2927 326/500 [==================>...........] - ETA: 43s - loss: 1.7190 - regression_loss: 1.4260 - classification_loss: 0.2930 327/500 [==================>...........] - ETA: 43s - loss: 1.7157 - regression_loss: 1.4232 - classification_loss: 0.2924 328/500 [==================>...........] - ETA: 42s - loss: 1.7160 - regression_loss: 1.4237 - classification_loss: 0.2923 329/500 [==================>...........] - ETA: 42s - loss: 1.7139 - regression_loss: 1.4221 - classification_loss: 0.2918 330/500 [==================>...........] - ETA: 42s - loss: 1.7139 - regression_loss: 1.4221 - classification_loss: 0.2918 331/500 [==================>...........] - ETA: 42s - loss: 1.7153 - regression_loss: 1.4232 - classification_loss: 0.2920 332/500 [==================>...........] - ETA: 41s - loss: 1.7170 - regression_loss: 1.4246 - classification_loss: 0.2924 333/500 [==================>...........] - ETA: 41s - loss: 1.7139 - regression_loss: 1.4221 - classification_loss: 0.2918 334/500 [===================>..........] - ETA: 41s - loss: 1.7148 - regression_loss: 1.4228 - classification_loss: 0.2920 335/500 [===================>..........] - ETA: 41s - loss: 1.7151 - regression_loss: 1.4232 - classification_loss: 0.2919 336/500 [===================>..........] - ETA: 40s - loss: 1.7145 - regression_loss: 1.4229 - classification_loss: 0.2916 337/500 [===================>..........] - ETA: 40s - loss: 1.7138 - regression_loss: 1.4220 - classification_loss: 0.2918 338/500 [===================>..........] - ETA: 40s - loss: 1.7151 - regression_loss: 1.4233 - classification_loss: 0.2918 339/500 [===================>..........] - ETA: 40s - loss: 1.7187 - regression_loss: 1.4267 - classification_loss: 0.2920 340/500 [===================>..........] 
- ETA: 39s - loss: 1.7166 - regression_loss: 1.4250 - classification_loss: 0.2915 341/500 [===================>..........] - ETA: 39s - loss: 1.7153 - regression_loss: 1.4240 - classification_loss: 0.2913 342/500 [===================>..........] - ETA: 39s - loss: 1.7168 - regression_loss: 1.4254 - classification_loss: 0.2914 343/500 [===================>..........] - ETA: 39s - loss: 1.7170 - regression_loss: 1.4255 - classification_loss: 0.2915 344/500 [===================>..........] - ETA: 38s - loss: 1.7184 - regression_loss: 1.4260 - classification_loss: 0.2924 345/500 [===================>..........] - ETA: 38s - loss: 1.7195 - regression_loss: 1.4271 - classification_loss: 0.2924 346/500 [===================>..........] - ETA: 38s - loss: 1.7175 - regression_loss: 1.4256 - classification_loss: 0.2919 347/500 [===================>..........] - ETA: 38s - loss: 1.7173 - regression_loss: 1.4255 - classification_loss: 0.2918 348/500 [===================>..........] - ETA: 37s - loss: 1.7179 - regression_loss: 1.4262 - classification_loss: 0.2918 349/500 [===================>..........] - ETA: 37s - loss: 1.7177 - regression_loss: 1.4258 - classification_loss: 0.2919 350/500 [====================>.........] - ETA: 37s - loss: 1.7151 - regression_loss: 1.4236 - classification_loss: 0.2914 351/500 [====================>.........] - ETA: 37s - loss: 1.7149 - regression_loss: 1.4234 - classification_loss: 0.2915 352/500 [====================>.........] - ETA: 36s - loss: 1.7145 - regression_loss: 1.4230 - classification_loss: 0.2914 353/500 [====================>.........] - ETA: 36s - loss: 1.7170 - regression_loss: 1.4251 - classification_loss: 0.2920 354/500 [====================>.........] - ETA: 36s - loss: 1.7159 - regression_loss: 1.4241 - classification_loss: 0.2917 355/500 [====================>.........] - ETA: 36s - loss: 1.7153 - regression_loss: 1.4238 - classification_loss: 0.2915 356/500 [====================>.........] 
- ETA: 35s - loss: 1.7158 - regression_loss: 1.4243 - classification_loss: 0.2915 357/500 [====================>.........] - ETA: 35s - loss: 1.7167 - regression_loss: 1.4250 - classification_loss: 0.2918 358/500 [====================>.........] - ETA: 35s - loss: 1.7187 - regression_loss: 1.4266 - classification_loss: 0.2921 359/500 [====================>.........] - ETA: 35s - loss: 1.7188 - regression_loss: 1.4268 - classification_loss: 0.2920 360/500 [====================>.........] - ETA: 34s - loss: 1.7170 - regression_loss: 1.4254 - classification_loss: 0.2916 361/500 [====================>.........] - ETA: 34s - loss: 1.7161 - regression_loss: 1.4247 - classification_loss: 0.2914 362/500 [====================>.........] - ETA: 34s - loss: 1.7168 - regression_loss: 1.4254 - classification_loss: 0.2915 363/500 [====================>.........] - ETA: 34s - loss: 1.7168 - regression_loss: 1.4254 - classification_loss: 0.2914 364/500 [====================>.........] - ETA: 33s - loss: 1.7176 - regression_loss: 1.4259 - classification_loss: 0.2917 365/500 [====================>.........] - ETA: 33s - loss: 1.7179 - regression_loss: 1.4263 - classification_loss: 0.2916 366/500 [====================>.........] - ETA: 33s - loss: 1.7168 - regression_loss: 1.4254 - classification_loss: 0.2915 367/500 [=====================>........] - ETA: 33s - loss: 1.7172 - regression_loss: 1.4257 - classification_loss: 0.2915 368/500 [=====================>........] - ETA: 32s - loss: 1.7181 - regression_loss: 1.4264 - classification_loss: 0.2917 369/500 [=====================>........] - ETA: 32s - loss: 1.7163 - regression_loss: 1.4250 - classification_loss: 0.2914 370/500 [=====================>........] - ETA: 32s - loss: 1.7177 - regression_loss: 1.4259 - classification_loss: 0.2918 371/500 [=====================>........] - ETA: 32s - loss: 1.7174 - regression_loss: 1.4258 - classification_loss: 0.2916 372/500 [=====================>........] 
- ETA: 31s - loss: 1.7166 - regression_loss: 1.4252 - classification_loss: 0.2914 373/500 [=====================>........] - ETA: 31s - loss: 1.7175 - regression_loss: 1.4260 - classification_loss: 0.2915 374/500 [=====================>........] - ETA: 31s - loss: 1.7182 - regression_loss: 1.4264 - classification_loss: 0.2918 375/500 [=====================>........] - ETA: 31s - loss: 1.7186 - regression_loss: 1.4269 - classification_loss: 0.2917 376/500 [=====================>........] - ETA: 31s - loss: 1.7172 - regression_loss: 1.4258 - classification_loss: 0.2914 377/500 [=====================>........] - ETA: 30s - loss: 1.7169 - regression_loss: 1.4257 - classification_loss: 0.2912 378/500 [=====================>........] - ETA: 30s - loss: 1.7166 - regression_loss: 1.4253 - classification_loss: 0.2912 379/500 [=====================>........] - ETA: 30s - loss: 1.7198 - regression_loss: 1.4277 - classification_loss: 0.2921 380/500 [=====================>........] - ETA: 30s - loss: 1.7202 - regression_loss: 1.4281 - classification_loss: 0.2922 381/500 [=====================>........] - ETA: 29s - loss: 1.7197 - regression_loss: 1.4277 - classification_loss: 0.2920 382/500 [=====================>........] - ETA: 29s - loss: 1.7182 - regression_loss: 1.4266 - classification_loss: 0.2916 383/500 [=====================>........] - ETA: 29s - loss: 1.7189 - regression_loss: 1.4270 - classification_loss: 0.2918 384/500 [======================>.......] - ETA: 29s - loss: 1.7157 - regression_loss: 1.4243 - classification_loss: 0.2914 385/500 [======================>.......] - ETA: 28s - loss: 1.7142 - regression_loss: 1.4230 - classification_loss: 0.2911 386/500 [======================>.......] - ETA: 28s - loss: 1.7153 - regression_loss: 1.4239 - classification_loss: 0.2914 387/500 [======================>.......] - ETA: 28s - loss: 1.7157 - regression_loss: 1.4243 - classification_loss: 0.2914 388/500 [======================>.......] 
- ETA: 28s - loss: 1.7134 - regression_loss: 1.4225 - classification_loss: 0.2909 389/500 [======================>.......] - ETA: 27s - loss: 1.7128 - regression_loss: 1.4221 - classification_loss: 0.2907 390/500 [======================>.......] - ETA: 27s - loss: 1.7122 - regression_loss: 1.4214 - classification_loss: 0.2908 391/500 [======================>.......] - ETA: 27s - loss: 1.7131 - regression_loss: 1.4221 - classification_loss: 0.2910 392/500 [======================>.......] - ETA: 27s - loss: 1.7120 - regression_loss: 1.4209 - classification_loss: 0.2911 393/500 [======================>.......] - ETA: 26s - loss: 1.7110 - regression_loss: 1.4201 - classification_loss: 0.2909 394/500 [======================>.......] - ETA: 26s - loss: 1.7100 - regression_loss: 1.4194 - classification_loss: 0.2907 395/500 [======================>.......] - ETA: 26s - loss: 1.7107 - regression_loss: 1.4199 - classification_loss: 0.2907 396/500 [======================>.......] - ETA: 26s - loss: 1.7102 - regression_loss: 1.4195 - classification_loss: 0.2907 397/500 [======================>.......] - ETA: 25s - loss: 1.7110 - regression_loss: 1.4203 - classification_loss: 0.2907 398/500 [======================>.......] - ETA: 25s - loss: 1.7108 - regression_loss: 1.4201 - classification_loss: 0.2907 399/500 [======================>.......] - ETA: 25s - loss: 1.7124 - regression_loss: 1.4213 - classification_loss: 0.2911 400/500 [=======================>......] - ETA: 25s - loss: 1.7123 - regression_loss: 1.4212 - classification_loss: 0.2911 401/500 [=======================>......] - ETA: 24s - loss: 1.7100 - regression_loss: 1.4194 - classification_loss: 0.2906 402/500 [=======================>......] - ETA: 24s - loss: 1.7088 - regression_loss: 1.4185 - classification_loss: 0.2903 403/500 [=======================>......] - ETA: 24s - loss: 1.7100 - regression_loss: 1.4197 - classification_loss: 0.2903 404/500 [=======================>......] 
- ETA: 24s - loss: 1.7111 - regression_loss: 1.4204 - classification_loss: 0.2906 405/500 [=======================>......] - ETA: 23s - loss: 1.7115 - regression_loss: 1.4207 - classification_loss: 0.2909 406/500 [=======================>......] - ETA: 23s - loss: 1.7121 - regression_loss: 1.4209 - classification_loss: 0.2913 407/500 [=======================>......] - ETA: 23s - loss: 1.7131 - regression_loss: 1.4216 - classification_loss: 0.2915 408/500 [=======================>......] - ETA: 23s - loss: 1.7135 - regression_loss: 1.4215 - classification_loss: 0.2920 409/500 [=======================>......] - ETA: 22s - loss: 1.7150 - regression_loss: 1.4227 - classification_loss: 0.2923 410/500 [=======================>......] - ETA: 22s - loss: 1.7148 - regression_loss: 1.4225 - classification_loss: 0.2923 411/500 [=======================>......] - ETA: 22s - loss: 1.7161 - regression_loss: 1.4234 - classification_loss: 0.2927 412/500 [=======================>......] - ETA: 22s - loss: 1.7167 - regression_loss: 1.4241 - classification_loss: 0.2926 413/500 [=======================>......] - ETA: 21s - loss: 1.7176 - regression_loss: 1.4248 - classification_loss: 0.2928 414/500 [=======================>......] - ETA: 21s - loss: 1.7172 - regression_loss: 1.4245 - classification_loss: 0.2927 415/500 [=======================>......] - ETA: 21s - loss: 1.7177 - regression_loss: 1.4250 - classification_loss: 0.2927 416/500 [=======================>......] - ETA: 21s - loss: 1.7173 - regression_loss: 1.4248 - classification_loss: 0.2925 417/500 [========================>.....] - ETA: 20s - loss: 1.7187 - regression_loss: 1.4260 - classification_loss: 0.2928 418/500 [========================>.....] - ETA: 20s - loss: 1.7185 - regression_loss: 1.4257 - classification_loss: 0.2929 419/500 [========================>.....] - ETA: 20s - loss: 1.7207 - regression_loss: 1.4273 - classification_loss: 0.2934 420/500 [========================>.....] 
- ETA: 20s - loss: 1.7233 - regression_loss: 1.4287 - classification_loss: 0.2946 421/500 [========================>.....] - ETA: 19s - loss: 1.7232 - regression_loss: 1.4286 - classification_loss: 0.2946 422/500 [========================>.....] - ETA: 19s - loss: 1.7237 - regression_loss: 1.4290 - classification_loss: 0.2947 423/500 [========================>.....] - ETA: 19s - loss: 1.7235 - regression_loss: 1.4288 - classification_loss: 0.2947 424/500 [========================>.....] - ETA: 19s - loss: 1.7235 - regression_loss: 1.4289 - classification_loss: 0.2946 425/500 [========================>.....] - ETA: 18s - loss: 1.7238 - regression_loss: 1.4292 - classification_loss: 0.2946 426/500 [========================>.....] - ETA: 18s - loss: 1.7250 - regression_loss: 1.4301 - classification_loss: 0.2949 427/500 [========================>.....] - ETA: 18s - loss: 1.7247 - regression_loss: 1.4299 - classification_loss: 0.2948 428/500 [========================>.....] - ETA: 18s - loss: 1.7260 - regression_loss: 1.4309 - classification_loss: 0.2951 429/500 [========================>.....] - ETA: 17s - loss: 1.7261 - regression_loss: 1.4310 - classification_loss: 0.2951 430/500 [========================>.....] - ETA: 17s - loss: 1.7242 - regression_loss: 1.4294 - classification_loss: 0.2948 431/500 [========================>.....] - ETA: 17s - loss: 1.7243 - regression_loss: 1.4295 - classification_loss: 0.2948 432/500 [========================>.....] - ETA: 17s - loss: 1.7243 - regression_loss: 1.4295 - classification_loss: 0.2948 433/500 [========================>.....] - ETA: 16s - loss: 1.7222 - regression_loss: 1.4278 - classification_loss: 0.2944 434/500 [=========================>....] - ETA: 16s - loss: 1.7214 - regression_loss: 1.4269 - classification_loss: 0.2944 435/500 [=========================>....] - ETA: 16s - loss: 1.7222 - regression_loss: 1.4275 - classification_loss: 0.2947 436/500 [=========================>....] 
- ETA: 16s - loss: 1.7200 - regression_loss: 1.4258 - classification_loss: 0.2942 437/500 [=========================>....] - ETA: 15s - loss: 1.7194 - regression_loss: 1.4254 - classification_loss: 0.2940 438/500 [=========================>....] - ETA: 15s - loss: 1.7194 - regression_loss: 1.4254 - classification_loss: 0.2940 439/500 [=========================>....] - ETA: 15s - loss: 1.7200 - regression_loss: 1.4257 - classification_loss: 0.2943 440/500 [=========================>....] - ETA: 15s - loss: 1.7217 - regression_loss: 1.4269 - classification_loss: 0.2949 441/500 [=========================>....] - ETA: 14s - loss: 1.7219 - regression_loss: 1.4271 - classification_loss: 0.2947 442/500 [=========================>....] - ETA: 14s - loss: 1.7233 - regression_loss: 1.4284 - classification_loss: 0.2948 443/500 [=========================>....] - ETA: 14s - loss: 1.7240 - regression_loss: 1.4290 - classification_loss: 0.2950 444/500 [=========================>....] - ETA: 14s - loss: 1.7246 - regression_loss: 1.4295 - classification_loss: 0.2951 445/500 [=========================>....] - ETA: 13s - loss: 1.7235 - regression_loss: 1.4284 - classification_loss: 0.2952 446/500 [=========================>....] - ETA: 13s - loss: 1.7240 - regression_loss: 1.4284 - classification_loss: 0.2956 447/500 [=========================>....] - ETA: 13s - loss: 1.7240 - regression_loss: 1.4285 - classification_loss: 0.2954 448/500 [=========================>....] - ETA: 13s - loss: 1.7227 - regression_loss: 1.4274 - classification_loss: 0.2953 449/500 [=========================>....] - ETA: 12s - loss: 1.7229 - regression_loss: 1.4276 - classification_loss: 0.2953 450/500 [==========================>...] - ETA: 12s - loss: 1.7223 - regression_loss: 1.4273 - classification_loss: 0.2950 451/500 [==========================>...] - ETA: 12s - loss: 1.7228 - regression_loss: 1.4276 - classification_loss: 0.2952 452/500 [==========================>...] 
- ETA: 12s - loss: 1.7213 - regression_loss: 1.4264 - classification_loss: 0.2949 453/500 [==========================>...] - ETA: 11s - loss: 1.7212 - regression_loss: 1.4264 - classification_loss: 0.2949 454/500 [==========================>...] - ETA: 11s - loss: 1.7229 - regression_loss: 1.4275 - classification_loss: 0.2953 455/500 [==========================>...] - ETA: 11s - loss: 1.7237 - regression_loss: 1.4282 - classification_loss: 0.2954 456/500 [==========================>...] - ETA: 11s - loss: 1.7253 - regression_loss: 1.4295 - classification_loss: 0.2958 457/500 [==========================>...] - ETA: 10s - loss: 1.7251 - regression_loss: 1.4294 - classification_loss: 0.2957 458/500 [==========================>...] - ETA: 10s - loss: 1.7249 - regression_loss: 1.4291 - classification_loss: 0.2958 459/500 [==========================>...] - ETA: 10s - loss: 1.7248 - regression_loss: 1.4291 - classification_loss: 0.2957 460/500 [==========================>...] - ETA: 10s - loss: 1.7260 - regression_loss: 1.4301 - classification_loss: 0.2959 461/500 [==========================>...] - ETA: 9s - loss: 1.7256 - regression_loss: 1.4296 - classification_loss: 0.2959  462/500 [==========================>...] - ETA: 9s - loss: 1.7252 - regression_loss: 1.4293 - classification_loss: 0.2959 463/500 [==========================>...] - ETA: 9s - loss: 1.7256 - regression_loss: 1.4295 - classification_loss: 0.2961 464/500 [==========================>...] - ETA: 9s - loss: 1.7264 - regression_loss: 1.4303 - classification_loss: 0.2962 465/500 [==========================>...] - ETA: 8s - loss: 1.7268 - regression_loss: 1.4307 - classification_loss: 0.2962 466/500 [==========================>...] - ETA: 8s - loss: 1.7272 - regression_loss: 1.4304 - classification_loss: 0.2968 467/500 [===========================>..] - ETA: 8s - loss: 1.7266 - regression_loss: 1.4300 - classification_loss: 0.2966 468/500 [===========================>..] 
- ETA: 8s - loss: 1.7264 - regression_loss: 1.4298 - classification_loss: 0.2966 469/500 [===========================>..] - ETA: 7s - loss: 1.7273 - regression_loss: 1.4305 - classification_loss: 0.2968 470/500 [===========================>..] - ETA: 7s - loss: 1.7286 - regression_loss: 1.4314 - classification_loss: 0.2972 471/500 [===========================>..] - ETA: 7s - loss: 1.7290 - regression_loss: 1.4319 - classification_loss: 0.2971 472/500 [===========================>..] - ETA: 7s - loss: 1.7292 - regression_loss: 1.4322 - classification_loss: 0.2970 473/500 [===========================>..] - ETA: 6s - loss: 1.7287 - regression_loss: 1.4317 - classification_loss: 0.2970 474/500 [===========================>..] - ETA: 6s - loss: 1.7287 - regression_loss: 1.4318 - classification_loss: 0.2969 475/500 [===========================>..] - ETA: 6s - loss: 1.7300 - regression_loss: 1.4327 - classification_loss: 0.2973 476/500 [===========================>..] - ETA: 6s - loss: 1.7319 - regression_loss: 1.4339 - classification_loss: 0.2979 477/500 [===========================>..] - ETA: 5s - loss: 1.7324 - regression_loss: 1.4344 - classification_loss: 0.2979 478/500 [===========================>..] - ETA: 5s - loss: 1.7328 - regression_loss: 1.4349 - classification_loss: 0.2980 479/500 [===========================>..] - ETA: 5s - loss: 1.7336 - regression_loss: 1.4356 - classification_loss: 0.2980 480/500 [===========================>..] - ETA: 5s - loss: 1.7320 - regression_loss: 1.4343 - classification_loss: 0.2977 481/500 [===========================>..] - ETA: 4s - loss: 1.7323 - regression_loss: 1.4346 - classification_loss: 0.2977 482/500 [===========================>..] - ETA: 4s - loss: 1.7330 - regression_loss: 1.4353 - classification_loss: 0.2977 483/500 [===========================>..] - ETA: 4s - loss: 1.7317 - regression_loss: 1.4341 - classification_loss: 0.2976 484/500 [============================>.] 
- ETA: 4s - loss: 1.7311 - regression_loss: 1.4335 - classification_loss: 0.2975 485/500 [============================>.] - ETA: 3s - loss: 1.7317 - regression_loss: 1.4341 - classification_loss: 0.2976 486/500 [============================>.] - ETA: 3s - loss: 1.7318 - regression_loss: 1.4343 - classification_loss: 0.2975 487/500 [============================>.] - ETA: 3s - loss: 1.7322 - regression_loss: 1.4346 - classification_loss: 0.2976 488/500 [============================>.] - ETA: 3s - loss: 1.7322 - regression_loss: 1.4348 - classification_loss: 0.2973 489/500 [============================>.] - ETA: 2s - loss: 1.7316 - regression_loss: 1.4346 - classification_loss: 0.2971 490/500 [============================>.] - ETA: 2s - loss: 1.7307 - regression_loss: 1.4338 - classification_loss: 0.2969 491/500 [============================>.] - ETA: 2s - loss: 1.7315 - regression_loss: 1.4345 - classification_loss: 0.2970 492/500 [============================>.] - ETA: 2s - loss: 1.7318 - regression_loss: 1.4348 - classification_loss: 0.2970 493/500 [============================>.] - ETA: 1s - loss: 1.7315 - regression_loss: 1.4345 - classification_loss: 0.2970 494/500 [============================>.] - ETA: 1s - loss: 1.7313 - regression_loss: 1.4344 - classification_loss: 0.2969 495/500 [============================>.] - ETA: 1s - loss: 1.7315 - regression_loss: 1.4345 - classification_loss: 0.2970 496/500 [============================>.] - ETA: 0s - loss: 1.7319 - regression_loss: 1.4348 - classification_loss: 0.2972 497/500 [============================>.] - ETA: 0s - loss: 1.7326 - regression_loss: 1.4353 - classification_loss: 0.2972 498/500 [============================>.] - ETA: 0s - loss: 1.7324 - regression_loss: 1.4353 - classification_loss: 0.2971 499/500 [============================>.] 
- ETA: 0s - loss: 1.7333 - regression_loss: 1.4361 - classification_loss: 0.2973 500/500 [==============================] - 125s 250ms/step - loss: 1.7339 - regression_loss: 1.4366 - classification_loss: 0.2973 1172 instances of class plum with average precision: 0.6163 mAP: 0.6163 Epoch 00069: saving model to ./training/snapshots/resnet50_pascal_69.h5 Epoch 70/150 1/500 [..............................] - ETA: 1:56 - loss: 1.8656 - regression_loss: 1.5722 - classification_loss: 0.2935 2/500 [..............................] - ETA: 2:01 - loss: 1.4025 - regression_loss: 1.1794 - classification_loss: 0.2231 3/500 [..............................] - ETA: 2:03 - loss: 1.5411 - regression_loss: 1.2981 - classification_loss: 0.2430 4/500 [..............................] - ETA: 2:03 - loss: 1.5720 - regression_loss: 1.2551 - classification_loss: 0.3169 5/500 [..............................] - ETA: 2:02 - loss: 1.6210 - regression_loss: 1.3062 - classification_loss: 0.3148 6/500 [..............................] - ETA: 2:02 - loss: 1.6278 - regression_loss: 1.3126 - classification_loss: 0.3152 7/500 [..............................] - ETA: 2:02 - loss: 1.6934 - regression_loss: 1.3770 - classification_loss: 0.3163 8/500 [..............................] - ETA: 2:02 - loss: 1.7444 - regression_loss: 1.4066 - classification_loss: 0.3377 9/500 [..............................] - ETA: 2:02 - loss: 1.7918 - regression_loss: 1.4510 - classification_loss: 0.3408 10/500 [..............................] - ETA: 2:02 - loss: 1.8339 - regression_loss: 1.4993 - classification_loss: 0.3345 11/500 [..............................] - ETA: 2:01 - loss: 1.8132 - regression_loss: 1.4839 - classification_loss: 0.3293 12/500 [..............................] - ETA: 2:01 - loss: 1.8360 - regression_loss: 1.5037 - classification_loss: 0.3323 13/500 [..............................] - ETA: 2:01 - loss: 1.7961 - regression_loss: 1.4777 - classification_loss: 0.3184 14/500 [..............................] 
- ETA: 2:01 - loss: 1.7730 - regression_loss: 1.4634 - classification_loss: 0.3096 15/500 [..............................] - ETA: 2:01 - loss: 1.7529 - regression_loss: 1.4458 - classification_loss: 0.3070 16/500 [..............................] - ETA: 2:01 - loss: 1.7784 - regression_loss: 1.4672 - classification_loss: 0.3113 17/500 [>.............................] - ETA: 2:00 - loss: 1.7836 - regression_loss: 1.4705 - classification_loss: 0.3131 18/500 [>.............................] - ETA: 2:00 - loss: 1.8168 - regression_loss: 1.4944 - classification_loss: 0.3224 19/500 [>.............................] - ETA: 2:00 - loss: 1.8622 - regression_loss: 1.5363 - classification_loss: 0.3260 20/500 [>.............................] - ETA: 2:00 - loss: 1.8255 - regression_loss: 1.5050 - classification_loss: 0.3205 21/500 [>.............................] - ETA: 2:00 - loss: 1.8074 - regression_loss: 1.4912 - classification_loss: 0.3162 22/500 [>.............................] - ETA: 2:00 - loss: 1.8931 - regression_loss: 1.5525 - classification_loss: 0.3406 23/500 [>.............................] - ETA: 1:59 - loss: 1.8774 - regression_loss: 1.5427 - classification_loss: 0.3347 24/500 [>.............................] - ETA: 1:58 - loss: 1.8466 - regression_loss: 1.5179 - classification_loss: 0.3288 25/500 [>.............................] - ETA: 1:57 - loss: 1.8149 - regression_loss: 1.4909 - classification_loss: 0.3240 26/500 [>.............................] - ETA: 1:58 - loss: 1.8051 - regression_loss: 1.4844 - classification_loss: 0.3207 27/500 [>.............................] - ETA: 1:57 - loss: 1.8061 - regression_loss: 1.4889 - classification_loss: 0.3172 28/500 [>.............................] - ETA: 1:57 - loss: 1.7965 - regression_loss: 1.4825 - classification_loss: 0.3140 29/500 [>.............................] - ETA: 1:57 - loss: 1.7792 - regression_loss: 1.4705 - classification_loss: 0.3087 30/500 [>.............................] 
- ETA: 1:57 - loss: 1.7711 - regression_loss: 1.4657 - classification_loss: 0.3054 31/500 [>.............................] - ETA: 1:56 - loss: 1.7773 - regression_loss: 1.4695 - classification_loss: 0.3078 32/500 [>.............................] - ETA: 1:56 - loss: 1.7860 - regression_loss: 1.4762 - classification_loss: 0.3098 33/500 [>.............................] - ETA: 1:56 - loss: 1.7917 - regression_loss: 1.4842 - classification_loss: 0.3074 34/500 [=>............................] - ETA: 1:56 - loss: 1.7753 - regression_loss: 1.4698 - classification_loss: 0.3055 35/500 [=>............................] - ETA: 1:55 - loss: 1.7628 - regression_loss: 1.4595 - classification_loss: 0.3033 36/500 [=>............................] - ETA: 1:55 - loss: 1.7810 - regression_loss: 1.4785 - classification_loss: 0.3025 37/500 [=>............................] - ETA: 1:55 - loss: 1.7851 - regression_loss: 1.4825 - classification_loss: 0.3027 38/500 [=>............................] - ETA: 1:55 - loss: 1.7632 - regression_loss: 1.4659 - classification_loss: 0.2974 39/500 [=>............................] - ETA: 1:54 - loss: 1.7616 - regression_loss: 1.4658 - classification_loss: 0.2959 40/500 [=>............................] - ETA: 1:54 - loss: 1.7684 - regression_loss: 1.4728 - classification_loss: 0.2956 41/500 [=>............................] - ETA: 1:54 - loss: 1.7538 - regression_loss: 1.4630 - classification_loss: 0.2908 42/500 [=>............................] - ETA: 1:53 - loss: 1.7574 - regression_loss: 1.4653 - classification_loss: 0.2922 43/500 [=>............................] - ETA: 1:53 - loss: 1.7611 - regression_loss: 1.4670 - classification_loss: 0.2941 44/500 [=>............................] - ETA: 1:53 - loss: 1.7577 - regression_loss: 1.4651 - classification_loss: 0.2926 45/500 [=>............................] - ETA: 1:53 - loss: 1.7682 - regression_loss: 1.4730 - classification_loss: 0.2952 46/500 [=>............................] 
- ETA: 1:52 - loss: 1.7583 - regression_loss: 1.4651 - classification_loss: 0.2932 47/500 [=>............................] - ETA: 1:52 - loss: 1.7745 - regression_loss: 1.4788 - classification_loss: 0.2956 48/500 [=>............................] - ETA: 1:52 - loss: 1.7779 - regression_loss: 1.4806 - classification_loss: 0.2973 49/500 [=>............................] - ETA: 1:52 - loss: 1.7811 - regression_loss: 1.4822 - classification_loss: 0.2990 50/500 [==>...........................] - ETA: 1:52 - loss: 1.7788 - regression_loss: 1.4808 - classification_loss: 0.2980 51/500 [==>...........................] - ETA: 1:51 - loss: 1.7786 - regression_loss: 1.4798 - classification_loss: 0.2988 52/500 [==>...........................] - ETA: 1:51 - loss: 1.7745 - regression_loss: 1.4773 - classification_loss: 0.2972 53/500 [==>...........................] - ETA: 1:51 - loss: 1.7606 - regression_loss: 1.4653 - classification_loss: 0.2953 54/500 [==>...........................] - ETA: 1:50 - loss: 1.7638 - regression_loss: 1.4690 - classification_loss: 0.2948 55/500 [==>...........................] - ETA: 1:50 - loss: 1.7663 - regression_loss: 1.4716 - classification_loss: 0.2947 56/500 [==>...........................] - ETA: 1:50 - loss: 1.7600 - regression_loss: 1.4654 - classification_loss: 0.2947 57/500 [==>...........................] - ETA: 1:50 - loss: 1.7570 - regression_loss: 1.4639 - classification_loss: 0.2931 58/500 [==>...........................] - ETA: 1:50 - loss: 1.7645 - regression_loss: 1.4700 - classification_loss: 0.2945 59/500 [==>...........................] - ETA: 1:49 - loss: 1.7674 - regression_loss: 1.4708 - classification_loss: 0.2966 60/500 [==>...........................] - ETA: 1:49 - loss: 1.7651 - regression_loss: 1.4686 - classification_loss: 0.2965 61/500 [==>...........................] - ETA: 1:49 - loss: 1.7684 - regression_loss: 1.4716 - classification_loss: 0.2968 62/500 [==>...........................] 
- ETA: 1:48 - loss: 1.7675 - regression_loss: 1.4711 - classification_loss: 0.2965 63/500 [==>...........................] - ETA: 1:48 - loss: 1.7591 - regression_loss: 1.4645 - classification_loss: 0.2946 64/500 [==>...........................] - ETA: 1:48 - loss: 1.7673 - regression_loss: 1.4710 - classification_loss: 0.2963 65/500 [==>...........................] - ETA: 1:48 - loss: 1.7615 - regression_loss: 1.4658 - classification_loss: 0.2956 66/500 [==>...........................] - ETA: 1:48 - loss: 1.7598 - regression_loss: 1.4651 - classification_loss: 0.2947 67/500 [===>..........................] - ETA: 1:47 - loss: 1.7586 - regression_loss: 1.4640 - classification_loss: 0.2946 68/500 [===>..........................] - ETA: 1:47 - loss: 1.7682 - regression_loss: 1.4704 - classification_loss: 0.2978 69/500 [===>..........................] - ETA: 1:47 - loss: 1.7636 - regression_loss: 1.4663 - classification_loss: 0.2973 70/500 [===>..........................] - ETA: 1:47 - loss: 1.7613 - regression_loss: 1.4630 - classification_loss: 0.2983 71/500 [===>..........................] - ETA: 1:46 - loss: 1.7629 - regression_loss: 1.4651 - classification_loss: 0.2978 72/500 [===>..........................] - ETA: 1:46 - loss: 1.7646 - regression_loss: 1.4665 - classification_loss: 0.2981 73/500 [===>..........................] - ETA: 1:46 - loss: 1.7658 - regression_loss: 1.4668 - classification_loss: 0.2990 74/500 [===>..........................] - ETA: 1:46 - loss: 1.7702 - regression_loss: 1.4708 - classification_loss: 0.2994 75/500 [===>..........................] - ETA: 1:46 - loss: 1.7719 - regression_loss: 1.4728 - classification_loss: 0.2991 76/500 [===>..........................] - ETA: 1:45 - loss: 1.7733 - regression_loss: 1.4745 - classification_loss: 0.2989 77/500 [===>..........................] - ETA: 1:45 - loss: 1.7670 - regression_loss: 1.4695 - classification_loss: 0.2975 78/500 [===>..........................] 
- ETA: 1:45 - loss: 1.7702 - regression_loss: 1.4726 - classification_loss: 0.2976 79/500 [===>..........................] - ETA: 1:45 - loss: 1.7764 - regression_loss: 1.4763 - classification_loss: 0.3001 80/500 [===>..........................] - ETA: 1:44 - loss: 1.7778 - regression_loss: 1.4775 - classification_loss: 0.3003 81/500 [===>..........................] - ETA: 1:44 - loss: 1.7747 - regression_loss: 1.4744 - classification_loss: 0.3004 82/500 [===>..........................] - ETA: 1:44 - loss: 1.7801 - regression_loss: 1.4801 - classification_loss: 0.3000 83/500 [===>..........................] - ETA: 1:44 - loss: 1.7854 - regression_loss: 1.4842 - classification_loss: 0.3012 84/500 [====>.........................] - ETA: 1:43 - loss: 1.7732 - regression_loss: 1.4738 - classification_loss: 0.2994 85/500 [====>.........................] - ETA: 1:43 - loss: 1.7706 - regression_loss: 1.4724 - classification_loss: 0.2983 86/500 [====>.........................] - ETA: 1:43 - loss: 1.7707 - regression_loss: 1.4723 - classification_loss: 0.2983 87/500 [====>.........................] - ETA: 1:43 - loss: 1.7805 - regression_loss: 1.4799 - classification_loss: 0.3006 88/500 [====>.........................] - ETA: 1:43 - loss: 1.7815 - regression_loss: 1.4795 - classification_loss: 0.3020 89/500 [====>.........................] - ETA: 1:42 - loss: 1.7767 - regression_loss: 1.4741 - classification_loss: 0.3025 90/500 [====>.........................] - ETA: 1:42 - loss: 1.7789 - regression_loss: 1.4764 - classification_loss: 0.3026 91/500 [====>.........................] - ETA: 1:42 - loss: 1.7787 - regression_loss: 1.4775 - classification_loss: 0.3012 92/500 [====>.........................] - ETA: 1:42 - loss: 1.7690 - regression_loss: 1.4691 - classification_loss: 0.2998 93/500 [====>.........................] - ETA: 1:41 - loss: 1.7686 - regression_loss: 1.4695 - classification_loss: 0.2991 94/500 [====>.........................] 
- ETA: 1:41 - loss: 1.7676 - regression_loss: 1.4688 - classification_loss: 0.2987 95/500 [====>.........................] - ETA: 1:41 - loss: 1.7688 - regression_loss: 1.4704 - classification_loss: 0.2984 96/500 [====>.........................] - ETA: 1:41 - loss: 1.7705 - regression_loss: 1.4715 - classification_loss: 0.2990 97/500 [====>.........................] - ETA: 1:40 - loss: 1.7699 - regression_loss: 1.4708 - classification_loss: 0.2991 98/500 [====>.........................] - ETA: 1:40 - loss: 1.7729 - regression_loss: 1.4735 - classification_loss: 0.2994 99/500 [====>.........................] - ETA: 1:40 - loss: 1.7701 - regression_loss: 1.4712 - classification_loss: 0.2989 100/500 [=====>........................] - ETA: 1:40 - loss: 1.7611 - regression_loss: 1.4640 - classification_loss: 0.2970 101/500 [=====>........................] - ETA: 1:39 - loss: 1.7616 - regression_loss: 1.4624 - classification_loss: 0.2992 102/500 [=====>........................] - ETA: 1:39 - loss: 1.7634 - regression_loss: 1.4640 - classification_loss: 0.2994 103/500 [=====>........................] - ETA: 1:39 - loss: 1.7555 - regression_loss: 1.4581 - classification_loss: 0.2975 104/500 [=====>........................] - ETA: 1:39 - loss: 1.7572 - regression_loss: 1.4593 - classification_loss: 0.2979 105/500 [=====>........................] - ETA: 1:38 - loss: 1.7529 - regression_loss: 1.4560 - classification_loss: 0.2968 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7499 - regression_loss: 1.4535 - classification_loss: 0.2964 107/500 [=====>........................] - ETA: 1:38 - loss: 1.7423 - regression_loss: 1.4474 - classification_loss: 0.2949 108/500 [=====>........................] - ETA: 1:38 - loss: 1.7399 - regression_loss: 1.4455 - classification_loss: 0.2944 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7420 - regression_loss: 1.4470 - classification_loss: 0.2951 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.7377 - regression_loss: 1.4434 - classification_loss: 0.2943 111/500 [=====>........................] - ETA: 1:37 - loss: 1.7415 - regression_loss: 1.4444 - classification_loss: 0.2972 112/500 [=====>........................] - ETA: 1:37 - loss: 1.7435 - regression_loss: 1.4458 - classification_loss: 0.2977 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7496 - regression_loss: 1.4498 - classification_loss: 0.2998 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7520 - regression_loss: 1.4523 - classification_loss: 0.2997 115/500 [=====>........................] - ETA: 1:36 - loss: 1.7499 - regression_loss: 1.4508 - classification_loss: 0.2991 116/500 [=====>........................] - ETA: 1:36 - loss: 1.7502 - regression_loss: 1.4518 - classification_loss: 0.2984 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7530 - regression_loss: 1.4542 - classification_loss: 0.2988 118/500 [======>.......................] - ETA: 1:35 - loss: 1.7487 - regression_loss: 1.4501 - classification_loss: 0.2986 119/500 [======>.......................] - ETA: 1:35 - loss: 1.7462 - regression_loss: 1.4482 - classification_loss: 0.2980 120/500 [======>.......................] - ETA: 1:35 - loss: 1.7533 - regression_loss: 1.4531 - classification_loss: 0.3002 121/500 [======>.......................] - ETA: 1:34 - loss: 1.7531 - regression_loss: 1.4532 - classification_loss: 0.2999 122/500 [======>.......................] - ETA: 1:34 - loss: 1.7526 - regression_loss: 1.4522 - classification_loss: 0.3005 123/500 [======>.......................] - ETA: 1:34 - loss: 1.7498 - regression_loss: 1.4502 - classification_loss: 0.2996 124/500 [======>.......................] - ETA: 1:34 - loss: 1.7486 - regression_loss: 1.4492 - classification_loss: 0.2994 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7443 - regression_loss: 1.4456 - classification_loss: 0.2987 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.7395 - regression_loss: 1.4413 - classification_loss: 0.2983 
127/500 [======>.......................] - ETA: 1:33 - loss: 1.7396 - regression_loss: 1.4416 - classification_loss: 0.2980 
[... per-step progress lines for steps 128-460 elided: loss holds in the 1.70-1.74 range throughout, with regression_loss ~1.41-1.44 and classification_loss ~0.29-0.30, while the ETA counts down from 1:33 to 9s; sampled milestones below ...]
200/500 [===========>..................] - ETA: 1:14 - loss: 1.7237 - regression_loss: 1.4289 - classification_loss: 0.2948 
300/500 [=================>............] - ETA: 50s - loss: 1.7103 - regression_loss: 1.4162 - classification_loss: 0.2941 
400/500 [=======================>......] - ETA: 25s - loss: 1.7087 - regression_loss: 1.4155 - classification_loss: 0.2932 
461/500 [==========================>...] - ETA: 9s - loss: 1.7273 - regression_loss: 1.4280 - classification_loss: 0.2993  
462/500 [==========================>...] 
- ETA: 9s - loss: 1.7271 - regression_loss: 1.4280 - classification_loss: 0.2991 463/500 [==========================>...] - ETA: 9s - loss: 1.7278 - regression_loss: 1.4285 - classification_loss: 0.2993 464/500 [==========================>...] - ETA: 9s - loss: 1.7286 - regression_loss: 1.4288 - classification_loss: 0.2998 465/500 [==========================>...] - ETA: 8s - loss: 1.7283 - regression_loss: 1.4285 - classification_loss: 0.2998 466/500 [==========================>...] - ETA: 8s - loss: 1.7273 - regression_loss: 1.4275 - classification_loss: 0.2998 467/500 [===========================>..] - ETA: 8s - loss: 1.7263 - regression_loss: 1.4268 - classification_loss: 0.2995 468/500 [===========================>..] - ETA: 8s - loss: 1.7259 - regression_loss: 1.4263 - classification_loss: 0.2996 469/500 [===========================>..] - ETA: 7s - loss: 1.7254 - regression_loss: 1.4259 - classification_loss: 0.2995 470/500 [===========================>..] - ETA: 7s - loss: 1.7253 - regression_loss: 1.4258 - classification_loss: 0.2995 471/500 [===========================>..] - ETA: 7s - loss: 1.7249 - regression_loss: 1.4255 - classification_loss: 0.2994 472/500 [===========================>..] - ETA: 7s - loss: 1.7248 - regression_loss: 1.4254 - classification_loss: 0.2994 473/500 [===========================>..] - ETA: 6s - loss: 1.7257 - regression_loss: 1.4261 - classification_loss: 0.2995 474/500 [===========================>..] - ETA: 6s - loss: 1.7240 - regression_loss: 1.4248 - classification_loss: 0.2992 475/500 [===========================>..] - ETA: 6s - loss: 1.7252 - regression_loss: 1.4258 - classification_loss: 0.2995 476/500 [===========================>..] - ETA: 6s - loss: 1.7251 - regression_loss: 1.4257 - classification_loss: 0.2994 477/500 [===========================>..] - ETA: 5s - loss: 1.7241 - regression_loss: 1.4249 - classification_loss: 0.2992 478/500 [===========================>..] 
- ETA: 5s - loss: 1.7247 - regression_loss: 1.4253 - classification_loss: 0.2994 479/500 [===========================>..] - ETA: 5s - loss: 1.7251 - regression_loss: 1.4256 - classification_loss: 0.2994 480/500 [===========================>..] - ETA: 5s - loss: 1.7260 - regression_loss: 1.4264 - classification_loss: 0.2995 481/500 [===========================>..] - ETA: 4s - loss: 1.7235 - regression_loss: 1.4244 - classification_loss: 0.2991 482/500 [===========================>..] - ETA: 4s - loss: 1.7222 - regression_loss: 1.4236 - classification_loss: 0.2986 483/500 [===========================>..] - ETA: 4s - loss: 1.7217 - regression_loss: 1.4233 - classification_loss: 0.2984 484/500 [============================>.] - ETA: 4s - loss: 1.7222 - regression_loss: 1.4238 - classification_loss: 0.2984 485/500 [============================>.] - ETA: 3s - loss: 1.7222 - regression_loss: 1.4237 - classification_loss: 0.2984 486/500 [============================>.] - ETA: 3s - loss: 1.7221 - regression_loss: 1.4238 - classification_loss: 0.2983 487/500 [============================>.] - ETA: 3s - loss: 1.7219 - regression_loss: 1.4238 - classification_loss: 0.2982 488/500 [============================>.] - ETA: 3s - loss: 1.7221 - regression_loss: 1.4240 - classification_loss: 0.2981 489/500 [============================>.] - ETA: 2s - loss: 1.7218 - regression_loss: 1.4238 - classification_loss: 0.2980 490/500 [============================>.] - ETA: 2s - loss: 1.7218 - regression_loss: 1.4239 - classification_loss: 0.2979 491/500 [============================>.] - ETA: 2s - loss: 1.7210 - regression_loss: 1.4233 - classification_loss: 0.2977 492/500 [============================>.] - ETA: 2s - loss: 1.7214 - regression_loss: 1.4237 - classification_loss: 0.2977 493/500 [============================>.] - ETA: 1s - loss: 1.7209 - regression_loss: 1.4230 - classification_loss: 0.2979 494/500 [============================>.] 
- ETA: 1s - loss: 1.7192 - regression_loss: 1.4216 - classification_loss: 0.2976 495/500 [============================>.] - ETA: 1s - loss: 1.7199 - regression_loss: 1.4222 - classification_loss: 0.2977 496/500 [============================>.] - ETA: 1s - loss: 1.7204 - regression_loss: 1.4225 - classification_loss: 0.2979 497/500 [============================>.] - ETA: 0s - loss: 1.7202 - regression_loss: 1.4224 - classification_loss: 0.2977 498/500 [============================>.] - ETA: 0s - loss: 1.7194 - regression_loss: 1.4221 - classification_loss: 0.2974 499/500 [============================>.] - ETA: 0s - loss: 1.7201 - regression_loss: 1.4225 - classification_loss: 0.2976 500/500 [==============================] - 125s 251ms/step - loss: 1.7205 - regression_loss: 1.4227 - classification_loss: 0.2978 1172 instances of class plum with average precision: 0.6091 mAP: 0.6091 Epoch 00070: saving model to ./training/snapshots/resnet50_pascal_70.h5 Epoch 71/150 1/500 [..............................] - ETA: 2:01 - loss: 1.6298 - regression_loss: 1.4442 - classification_loss: 0.1855 2/500 [..............................] - ETA: 2:03 - loss: 1.7115 - regression_loss: 1.4810 - classification_loss: 0.2305 3/500 [..............................] - ETA: 2:04 - loss: 1.6098 - regression_loss: 1.3650 - classification_loss: 0.2448 4/500 [..............................] - ETA: 2:03 - loss: 1.6515 - regression_loss: 1.4136 - classification_loss: 0.2379 5/500 [..............................] - ETA: 2:04 - loss: 1.4954 - regression_loss: 1.2654 - classification_loss: 0.2300 6/500 [..............................] - ETA: 2:04 - loss: 1.4611 - regression_loss: 1.2375 - classification_loss: 0.2236 7/500 [..............................] - ETA: 2:03 - loss: 1.5862 - regression_loss: 1.3439 - classification_loss: 0.2423 8/500 [..............................] - ETA: 2:02 - loss: 1.6488 - regression_loss: 1.3917 - classification_loss: 0.2570 9/500 [..............................] 
- ETA: 2:01 - loss: 1.6758 - regression_loss: 1.4098 - classification_loss: 0.2660 10/500 [..............................] - ETA: 2:01 - loss: 1.7115 - regression_loss: 1.4391 - classification_loss: 0.2723 11/500 [..............................] - ETA: 2:01 - loss: 1.6877 - regression_loss: 1.4156 - classification_loss: 0.2721 12/500 [..............................] - ETA: 2:01 - loss: 1.6879 - regression_loss: 1.4209 - classification_loss: 0.2670 13/500 [..............................] - ETA: 2:01 - loss: 1.7076 - regression_loss: 1.4389 - classification_loss: 0.2687 14/500 [..............................] - ETA: 2:00 - loss: 1.7368 - regression_loss: 1.4633 - classification_loss: 0.2736 15/500 [..............................] - ETA: 2:01 - loss: 1.7125 - regression_loss: 1.4392 - classification_loss: 0.2732 16/500 [..............................] - ETA: 2:00 - loss: 1.7129 - regression_loss: 1.4392 - classification_loss: 0.2737 17/500 [>.............................] - ETA: 1:59 - loss: 1.7210 - regression_loss: 1.4464 - classification_loss: 0.2746 18/500 [>.............................] - ETA: 1:59 - loss: 1.7173 - regression_loss: 1.4423 - classification_loss: 0.2750 19/500 [>.............................] - ETA: 1:59 - loss: 1.7294 - regression_loss: 1.4387 - classification_loss: 0.2907 20/500 [>.............................] - ETA: 1:59 - loss: 1.7273 - regression_loss: 1.4364 - classification_loss: 0.2909 21/500 [>.............................] - ETA: 1:59 - loss: 1.7020 - regression_loss: 1.4187 - classification_loss: 0.2833 22/500 [>.............................] - ETA: 1:59 - loss: 1.7125 - regression_loss: 1.4301 - classification_loss: 0.2824 23/500 [>.............................] - ETA: 1:59 - loss: 1.7376 - regression_loss: 1.4523 - classification_loss: 0.2853 24/500 [>.............................] - ETA: 1:58 - loss: 1.7452 - regression_loss: 1.4590 - classification_loss: 0.2862 25/500 [>.............................] 
- ETA: 1:57 - loss: 1.7558 - regression_loss: 1.4711 - classification_loss: 0.2847 26/500 [>.............................] - ETA: 1:56 - loss: 1.7052 - regression_loss: 1.4275 - classification_loss: 0.2776 27/500 [>.............................] - ETA: 1:56 - loss: 1.6978 - regression_loss: 1.4238 - classification_loss: 0.2740 28/500 [>.............................] - ETA: 1:56 - loss: 1.6914 - regression_loss: 1.4149 - classification_loss: 0.2765 29/500 [>.............................] - ETA: 1:56 - loss: 1.7081 - regression_loss: 1.4251 - classification_loss: 0.2830 30/500 [>.............................] - ETA: 1:55 - loss: 1.7088 - regression_loss: 1.4258 - classification_loss: 0.2831 31/500 [>.............................] - ETA: 1:55 - loss: 1.7414 - regression_loss: 1.4507 - classification_loss: 0.2907 32/500 [>.............................] - ETA: 1:55 - loss: 1.7186 - regression_loss: 1.4328 - classification_loss: 0.2859 33/500 [>.............................] - ETA: 1:55 - loss: 1.7258 - regression_loss: 1.4365 - classification_loss: 0.2894 34/500 [=>............................] - ETA: 1:55 - loss: 1.7460 - regression_loss: 1.4485 - classification_loss: 0.2976 35/500 [=>............................] - ETA: 1:55 - loss: 1.7483 - regression_loss: 1.4514 - classification_loss: 0.2969 36/500 [=>............................] - ETA: 1:54 - loss: 1.7363 - regression_loss: 1.4405 - classification_loss: 0.2958 37/500 [=>............................] - ETA: 1:54 - loss: 1.7157 - regression_loss: 1.4236 - classification_loss: 0.2921 38/500 [=>............................] - ETA: 1:54 - loss: 1.7055 - regression_loss: 1.4161 - classification_loss: 0.2894 39/500 [=>............................] - ETA: 1:54 - loss: 1.7033 - regression_loss: 1.4159 - classification_loss: 0.2874 40/500 [=>............................] - ETA: 1:54 - loss: 1.7051 - regression_loss: 1.4156 - classification_loss: 0.2895 41/500 [=>............................] 
- ETA: 1:53 - loss: 1.6970 - regression_loss: 1.4111 - classification_loss: 0.2858 42/500 [=>............................] - ETA: 1:53 - loss: 1.7061 - regression_loss: 1.4178 - classification_loss: 0.2883 43/500 [=>............................] - ETA: 1:53 - loss: 1.7094 - regression_loss: 1.4219 - classification_loss: 0.2875 44/500 [=>............................] - ETA: 1:53 - loss: 1.7060 - regression_loss: 1.4197 - classification_loss: 0.2863 45/500 [=>............................] - ETA: 1:52 - loss: 1.6906 - regression_loss: 1.4086 - classification_loss: 0.2821 46/500 [=>............................] - ETA: 1:52 - loss: 1.6870 - regression_loss: 1.4061 - classification_loss: 0.2809 47/500 [=>............................] - ETA: 1:52 - loss: 1.6736 - regression_loss: 1.3956 - classification_loss: 0.2779 48/500 [=>............................] - ETA: 1:52 - loss: 1.6588 - regression_loss: 1.3826 - classification_loss: 0.2762 49/500 [=>............................] - ETA: 1:52 - loss: 1.6625 - regression_loss: 1.3849 - classification_loss: 0.2776 50/500 [==>...........................] - ETA: 1:51 - loss: 1.6636 - regression_loss: 1.3861 - classification_loss: 0.2775 51/500 [==>...........................] - ETA: 1:51 - loss: 1.6564 - regression_loss: 1.3793 - classification_loss: 0.2770 52/500 [==>...........................] - ETA: 1:51 - loss: 1.6660 - regression_loss: 1.3858 - classification_loss: 0.2802 53/500 [==>...........................] - ETA: 1:51 - loss: 1.6687 - regression_loss: 1.3912 - classification_loss: 0.2775 54/500 [==>...........................] - ETA: 1:50 - loss: 1.6727 - regression_loss: 1.3940 - classification_loss: 0.2787 55/500 [==>...........................] - ETA: 1:50 - loss: 1.6793 - regression_loss: 1.3996 - classification_loss: 0.2797 56/500 [==>...........................] - ETA: 1:50 - loss: 1.6817 - regression_loss: 1.4010 - classification_loss: 0.2808 57/500 [==>...........................] 
- ETA: 1:50 - loss: 1.6835 - regression_loss: 1.4023 - classification_loss: 0.2811 58/500 [==>...........................] - ETA: 1:49 - loss: 1.6878 - regression_loss: 1.4053 - classification_loss: 0.2825 59/500 [==>...........................] - ETA: 1:49 - loss: 1.6846 - regression_loss: 1.4034 - classification_loss: 0.2812 60/500 [==>...........................] - ETA: 1:49 - loss: 1.6692 - regression_loss: 1.3893 - classification_loss: 0.2799 61/500 [==>...........................] - ETA: 1:49 - loss: 1.6643 - regression_loss: 1.3854 - classification_loss: 0.2788 62/500 [==>...........................] - ETA: 1:48 - loss: 1.6684 - regression_loss: 1.3891 - classification_loss: 0.2792 63/500 [==>...........................] - ETA: 1:48 - loss: 1.6701 - regression_loss: 1.3892 - classification_loss: 0.2810 64/500 [==>...........................] - ETA: 1:48 - loss: 1.6681 - regression_loss: 1.3882 - classification_loss: 0.2800 65/500 [==>...........................] - ETA: 1:48 - loss: 1.6647 - regression_loss: 1.3851 - classification_loss: 0.2796 66/500 [==>...........................] - ETA: 1:47 - loss: 1.6655 - regression_loss: 1.3864 - classification_loss: 0.2791 67/500 [===>..........................] - ETA: 1:47 - loss: 1.6645 - regression_loss: 1.3854 - classification_loss: 0.2791 68/500 [===>..........................] - ETA: 1:47 - loss: 1.6581 - regression_loss: 1.3810 - classification_loss: 0.2771 69/500 [===>..........................] - ETA: 1:47 - loss: 1.6445 - regression_loss: 1.3704 - classification_loss: 0.2741 70/500 [===>..........................] - ETA: 1:47 - loss: 1.6365 - regression_loss: 1.3628 - classification_loss: 0.2737 71/500 [===>..........................] - ETA: 1:46 - loss: 1.6349 - regression_loss: 1.3608 - classification_loss: 0.2741 72/500 [===>..........................] - ETA: 1:46 - loss: 1.6444 - regression_loss: 1.3673 - classification_loss: 0.2771 73/500 [===>..........................] 
- ETA: 1:46 - loss: 1.6387 - regression_loss: 1.3617 - classification_loss: 0.2770 74/500 [===>..........................] - ETA: 1:46 - loss: 1.6496 - regression_loss: 1.3723 - classification_loss: 0.2774 75/500 [===>..........................] - ETA: 1:46 - loss: 1.6570 - regression_loss: 1.3785 - classification_loss: 0.2786 76/500 [===>..........................] - ETA: 1:45 - loss: 1.6628 - regression_loss: 1.3836 - classification_loss: 0.2793 77/500 [===>..........................] - ETA: 1:45 - loss: 1.6566 - regression_loss: 1.3784 - classification_loss: 0.2782 78/500 [===>..........................] - ETA: 1:45 - loss: 1.6615 - regression_loss: 1.3822 - classification_loss: 0.2794 79/500 [===>..........................] - ETA: 1:45 - loss: 1.6674 - regression_loss: 1.3863 - classification_loss: 0.2812 80/500 [===>..........................] - ETA: 1:44 - loss: 1.6729 - regression_loss: 1.3898 - classification_loss: 0.2831 81/500 [===>..........................] - ETA: 1:44 - loss: 1.6746 - regression_loss: 1.3913 - classification_loss: 0.2833 82/500 [===>..........................] - ETA: 1:44 - loss: 1.6773 - regression_loss: 1.3929 - classification_loss: 0.2843 83/500 [===>..........................] - ETA: 1:44 - loss: 1.6819 - regression_loss: 1.3972 - classification_loss: 0.2847 84/500 [====>.........................] - ETA: 1:43 - loss: 1.6882 - regression_loss: 1.4018 - classification_loss: 0.2865 85/500 [====>.........................] - ETA: 1:43 - loss: 1.6929 - regression_loss: 1.4050 - classification_loss: 0.2880 86/500 [====>.........................] - ETA: 1:43 - loss: 1.6933 - regression_loss: 1.4052 - classification_loss: 0.2880 87/500 [====>.........................] - ETA: 1:43 - loss: 1.6955 - regression_loss: 1.4076 - classification_loss: 0.2879 88/500 [====>.........................] - ETA: 1:42 - loss: 1.6993 - regression_loss: 1.4109 - classification_loss: 0.2884 89/500 [====>.........................] 
- ETA: 1:42 - loss: 1.7027 - regression_loss: 1.4131 - classification_loss: 0.2896 90/500 [====>.........................] - ETA: 1:42 - loss: 1.6957 - regression_loss: 1.4075 - classification_loss: 0.2882 91/500 [====>.........................] - ETA: 1:42 - loss: 1.6970 - regression_loss: 1.4085 - classification_loss: 0.2885 92/500 [====>.........................] - ETA: 1:41 - loss: 1.7046 - regression_loss: 1.4146 - classification_loss: 0.2900 93/500 [====>.........................] - ETA: 1:41 - loss: 1.7019 - regression_loss: 1.4131 - classification_loss: 0.2888 94/500 [====>.........................] - ETA: 1:41 - loss: 1.7053 - regression_loss: 1.4158 - classification_loss: 0.2895 95/500 [====>.........................] - ETA: 1:41 - loss: 1.6990 - regression_loss: 1.4109 - classification_loss: 0.2882 96/500 [====>.........................] - ETA: 1:40 - loss: 1.7004 - regression_loss: 1.4119 - classification_loss: 0.2886 97/500 [====>.........................] - ETA: 1:40 - loss: 1.6987 - regression_loss: 1.4100 - classification_loss: 0.2887 98/500 [====>.........................] - ETA: 1:40 - loss: 1.6972 - regression_loss: 1.4093 - classification_loss: 0.2880 99/500 [====>.........................] - ETA: 1:40 - loss: 1.6896 - regression_loss: 1.4034 - classification_loss: 0.2862 100/500 [=====>........................] - ETA: 1:40 - loss: 1.6907 - regression_loss: 1.4044 - classification_loss: 0.2863 101/500 [=====>........................] - ETA: 1:39 - loss: 1.6920 - regression_loss: 1.4057 - classification_loss: 0.2863 102/500 [=====>........................] - ETA: 1:39 - loss: 1.6974 - regression_loss: 1.4074 - classification_loss: 0.2900 103/500 [=====>........................] - ETA: 1:39 - loss: 1.6974 - regression_loss: 1.4080 - classification_loss: 0.2895 104/500 [=====>........................] - ETA: 1:38 - loss: 1.6990 - regression_loss: 1.4098 - classification_loss: 0.2893 105/500 [=====>........................] 
- ETA: 1:38 - loss: 1.6994 - regression_loss: 1.4103 - classification_loss: 0.2891 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7031 - regression_loss: 1.4135 - classification_loss: 0.2896 107/500 [=====>........................] - ETA: 1:38 - loss: 1.7059 - regression_loss: 1.4159 - classification_loss: 0.2900 108/500 [=====>........................] - ETA: 1:37 - loss: 1.7064 - regression_loss: 1.4166 - classification_loss: 0.2899 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7104 - regression_loss: 1.4202 - classification_loss: 0.2902 110/500 [=====>........................] - ETA: 1:37 - loss: 1.7121 - regression_loss: 1.4211 - classification_loss: 0.2909 111/500 [=====>........................] - ETA: 1:37 - loss: 1.7254 - regression_loss: 1.4342 - classification_loss: 0.2912 112/500 [=====>........................] - ETA: 1:36 - loss: 1.7306 - regression_loss: 1.4377 - classification_loss: 0.2929 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7312 - regression_loss: 1.4386 - classification_loss: 0.2926 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7330 - regression_loss: 1.4410 - classification_loss: 0.2919 115/500 [=====>........................] - ETA: 1:36 - loss: 1.7334 - regression_loss: 1.4416 - classification_loss: 0.2918 116/500 [=====>........................] - ETA: 1:35 - loss: 1.7278 - regression_loss: 1.4370 - classification_loss: 0.2908 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7278 - regression_loss: 1.4373 - classification_loss: 0.2905 118/500 [======>.......................] - ETA: 1:35 - loss: 1.7266 - regression_loss: 1.4361 - classification_loss: 0.2905 119/500 [======>.......................] - ETA: 1:35 - loss: 1.7239 - regression_loss: 1.4335 - classification_loss: 0.2904 120/500 [======>.......................] - ETA: 1:34 - loss: 1.7217 - regression_loss: 1.4316 - classification_loss: 0.2901 121/500 [======>.......................] 
- ETA: 1:34 - loss: 1.7195 - regression_loss: 1.4299 - classification_loss: 0.2896 122/500 [======>.......................] - ETA: 1:34 - loss: 1.7121 - regression_loss: 1.4232 - classification_loss: 0.2889 123/500 [======>.......................] - ETA: 1:34 - loss: 1.7114 - regression_loss: 1.4227 - classification_loss: 0.2886 124/500 [======>.......................] - ETA: 1:33 - loss: 1.7094 - regression_loss: 1.4213 - classification_loss: 0.2881 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7097 - regression_loss: 1.4217 - classification_loss: 0.2880 126/500 [======>.......................] - ETA: 1:33 - loss: 1.7098 - regression_loss: 1.4218 - classification_loss: 0.2880 127/500 [======>.......................] - ETA: 1:33 - loss: 1.7103 - regression_loss: 1.4215 - classification_loss: 0.2888 128/500 [======>.......................] - ETA: 1:32 - loss: 1.7102 - regression_loss: 1.4210 - classification_loss: 0.2892 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7090 - regression_loss: 1.4201 - classification_loss: 0.2889 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7080 - regression_loss: 1.4198 - classification_loss: 0.2881 131/500 [======>.......................] - ETA: 1:32 - loss: 1.7214 - regression_loss: 1.4306 - classification_loss: 0.2908 132/500 [======>.......................] - ETA: 1:31 - loss: 1.7209 - regression_loss: 1.4307 - classification_loss: 0.2902 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7180 - regression_loss: 1.4280 - classification_loss: 0.2900 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7167 - regression_loss: 1.4271 - classification_loss: 0.2896 135/500 [=======>......................] - ETA: 1:31 - loss: 1.7193 - regression_loss: 1.4297 - classification_loss: 0.2896 136/500 [=======>......................] - ETA: 1:30 - loss: 1.7190 - regression_loss: 1.4301 - classification_loss: 0.2890 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.7192 - regression_loss: 1.4301 - classification_loss: 0.2891 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7151 - regression_loss: 1.4271 - classification_loss: 0.2880 139/500 [=======>......................] - ETA: 1:30 - loss: 1.7170 - regression_loss: 1.4288 - classification_loss: 0.2882 140/500 [=======>......................] - ETA: 1:29 - loss: 1.7176 - regression_loss: 1.4295 - classification_loss: 0.2881 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7163 - regression_loss: 1.4277 - classification_loss: 0.2886 142/500 [=======>......................] - ETA: 1:29 - loss: 1.7166 - regression_loss: 1.4282 - classification_loss: 0.2884 143/500 [=======>......................] - ETA: 1:29 - loss: 1.7156 - regression_loss: 1.4275 - classification_loss: 0.2881 144/500 [=======>......................] - ETA: 1:28 - loss: 1.7207 - regression_loss: 1.4311 - classification_loss: 0.2897 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7239 - regression_loss: 1.4336 - classification_loss: 0.2902 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7241 - regression_loss: 1.4337 - classification_loss: 0.2904 147/500 [=======>......................] - ETA: 1:28 - loss: 1.7232 - regression_loss: 1.4331 - classification_loss: 0.2901 148/500 [=======>......................] - ETA: 1:27 - loss: 1.7264 - regression_loss: 1.4357 - classification_loss: 0.2907 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7203 - regression_loss: 1.4304 - classification_loss: 0.2899 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7143 - regression_loss: 1.4256 - classification_loss: 0.2887 151/500 [========>.....................] - ETA: 1:27 - loss: 1.7112 - regression_loss: 1.4226 - classification_loss: 0.2886 152/500 [========>.....................] - ETA: 1:26 - loss: 1.7101 - regression_loss: 1.4220 - classification_loss: 0.2881 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.7085 - regression_loss: 1.4212 - classification_loss: 0.2873 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7092 - regression_loss: 1.4215 - classification_loss: 0.2877 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7075 - regression_loss: 1.4203 - classification_loss: 0.2872 156/500 [========>.....................] - ETA: 1:25 - loss: 1.7102 - regression_loss: 1.4227 - classification_loss: 0.2875 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7063 - regression_loss: 1.4197 - classification_loss: 0.2866 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7049 - regression_loss: 1.4185 - classification_loss: 0.2864 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7053 - regression_loss: 1.4184 - classification_loss: 0.2869 160/500 [========>.....................] - ETA: 1:25 - loss: 1.7067 - regression_loss: 1.4196 - classification_loss: 0.2871 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7137 - regression_loss: 1.4246 - classification_loss: 0.2891 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7135 - regression_loss: 1.4248 - classification_loss: 0.2887 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7084 - regression_loss: 1.4204 - classification_loss: 0.2880 164/500 [========>.....................] - ETA: 1:24 - loss: 1.7082 - regression_loss: 1.4205 - classification_loss: 0.2877 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7059 - regression_loss: 1.4188 - classification_loss: 0.2872 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7067 - regression_loss: 1.4195 - classification_loss: 0.2872 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7064 - regression_loss: 1.4195 - classification_loss: 0.2869 168/500 [=========>....................] - ETA: 1:22 - loss: 1.7080 - regression_loss: 1.4209 - classification_loss: 0.2871 169/500 [=========>....................] 
[per-batch progress frames for steps 170–499 of epoch 71 omitted; loss held near 1.70 (regression ~1.41, classification ~0.29) throughout]
500/500 [==============================] - 125s 250ms/step - loss: 1.7052 - regression_loss: 1.4099 - classification_loss: 0.2952
1172 instances of class plum with average precision: 0.6114
mAP: 0.6114
Epoch 00071: saving model to ./training/snapshots/resnet50_pascal_71.h5
Epoch 72/150
1/500 [..............................] - ETA: 2:03 - loss: 1.8865 - regression_loss: 1.5848 - classification_loss: 0.3017
- ETA: 2:02 - loss: 1.6672 - regression_loss: 1.4130 - classification_loss: 0.2543 5/500 [..............................] - ETA: 2:03 - loss: 1.6576 - regression_loss: 1.4019 - classification_loss: 0.2557 6/500 [..............................] - ETA: 2:03 - loss: 1.7523 - regression_loss: 1.4783 - classification_loss: 0.2740 7/500 [..............................] - ETA: 2:03 - loss: 1.7997 - regression_loss: 1.5098 - classification_loss: 0.2898 8/500 [..............................] - ETA: 2:03 - loss: 1.8364 - regression_loss: 1.5271 - classification_loss: 0.3093 9/500 [..............................] - ETA: 2:02 - loss: 1.9592 - regression_loss: 1.6218 - classification_loss: 0.3374 10/500 [..............................] - ETA: 2:02 - loss: 1.9364 - regression_loss: 1.6065 - classification_loss: 0.3298 11/500 [..............................] - ETA: 2:02 - loss: 1.8880 - regression_loss: 1.5737 - classification_loss: 0.3143 12/500 [..............................] - ETA: 2:01 - loss: 1.8577 - regression_loss: 1.5484 - classification_loss: 0.3092 13/500 [..............................] - ETA: 2:01 - loss: 1.8556 - regression_loss: 1.5447 - classification_loss: 0.3109 14/500 [..............................] - ETA: 2:01 - loss: 1.8825 - regression_loss: 1.5658 - classification_loss: 0.3167 15/500 [..............................] - ETA: 2:01 - loss: 1.8896 - regression_loss: 1.5730 - classification_loss: 0.3166 16/500 [..............................] - ETA: 2:01 - loss: 1.8605 - regression_loss: 1.5487 - classification_loss: 0.3118 17/500 [>.............................] - ETA: 2:00 - loss: 1.8281 - regression_loss: 1.5193 - classification_loss: 0.3088 18/500 [>.............................] - ETA: 2:00 - loss: 1.8477 - regression_loss: 1.5357 - classification_loss: 0.3120 19/500 [>.............................] - ETA: 2:00 - loss: 1.8250 - regression_loss: 1.5187 - classification_loss: 0.3063 20/500 [>.............................] 
- ETA: 2:00 - loss: 1.7886 - regression_loss: 1.4907 - classification_loss: 0.2980 21/500 [>.............................] - ETA: 2:00 - loss: 1.8044 - regression_loss: 1.5001 - classification_loss: 0.3043 22/500 [>.............................] - ETA: 2:00 - loss: 1.7921 - regression_loss: 1.4916 - classification_loss: 0.3005 23/500 [>.............................] - ETA: 1:59 - loss: 1.7889 - regression_loss: 1.4897 - classification_loss: 0.2992 24/500 [>.............................] - ETA: 1:59 - loss: 1.7907 - regression_loss: 1.4902 - classification_loss: 0.3004 25/500 [>.............................] - ETA: 1:59 - loss: 1.7790 - regression_loss: 1.4694 - classification_loss: 0.3096 26/500 [>.............................] - ETA: 1:59 - loss: 1.7821 - regression_loss: 1.4743 - classification_loss: 0.3078 27/500 [>.............................] - ETA: 1:59 - loss: 1.7823 - regression_loss: 1.4752 - classification_loss: 0.3072 28/500 [>.............................] - ETA: 1:58 - loss: 1.7553 - regression_loss: 1.4527 - classification_loss: 0.3026 29/500 [>.............................] - ETA: 1:58 - loss: 1.7407 - regression_loss: 1.4427 - classification_loss: 0.2979 30/500 [>.............................] - ETA: 1:57 - loss: 1.7335 - regression_loss: 1.4365 - classification_loss: 0.2970 31/500 [>.............................] - ETA: 1:57 - loss: 1.7320 - regression_loss: 1.4360 - classification_loss: 0.2960 32/500 [>.............................] - ETA: 1:57 - loss: 1.7413 - regression_loss: 1.4441 - classification_loss: 0.2972 33/500 [>.............................] - ETA: 1:57 - loss: 1.7682 - regression_loss: 1.4682 - classification_loss: 0.3000 34/500 [=>............................] - ETA: 1:57 - loss: 1.7820 - regression_loss: 1.4780 - classification_loss: 0.3039 35/500 [=>............................] - ETA: 1:56 - loss: 1.7922 - regression_loss: 1.4822 - classification_loss: 0.3100 36/500 [=>............................] 
- ETA: 1:56 - loss: 1.7892 - regression_loss: 1.4796 - classification_loss: 0.3095 37/500 [=>............................] - ETA: 1:56 - loss: 1.7925 - regression_loss: 1.4824 - classification_loss: 0.3101 38/500 [=>............................] - ETA: 1:55 - loss: 1.7658 - regression_loss: 1.4608 - classification_loss: 0.3051 39/500 [=>............................] - ETA: 1:55 - loss: 1.7618 - regression_loss: 1.4571 - classification_loss: 0.3047 40/500 [=>............................] - ETA: 1:55 - loss: 1.7672 - regression_loss: 1.4617 - classification_loss: 0.3056 41/500 [=>............................] - ETA: 1:55 - loss: 1.7685 - regression_loss: 1.4614 - classification_loss: 0.3071 42/500 [=>............................] - ETA: 1:54 - loss: 1.7697 - regression_loss: 1.4607 - classification_loss: 0.3089 43/500 [=>............................] - ETA: 1:54 - loss: 1.7592 - regression_loss: 1.4545 - classification_loss: 0.3047 44/500 [=>............................] - ETA: 1:54 - loss: 1.7653 - regression_loss: 1.4604 - classification_loss: 0.3049 45/500 [=>............................] - ETA: 1:53 - loss: 1.7794 - regression_loss: 1.4703 - classification_loss: 0.3091 46/500 [=>............................] - ETA: 1:53 - loss: 1.7829 - regression_loss: 1.4707 - classification_loss: 0.3122 47/500 [=>............................] - ETA: 1:53 - loss: 1.7541 - regression_loss: 1.4453 - classification_loss: 0.3088 48/500 [=>............................] - ETA: 1:53 - loss: 1.7419 - regression_loss: 1.4351 - classification_loss: 0.3069 49/500 [=>............................] - ETA: 1:52 - loss: 1.7322 - regression_loss: 1.4279 - classification_loss: 0.3043 50/500 [==>...........................] - ETA: 1:52 - loss: 1.7491 - regression_loss: 1.4454 - classification_loss: 0.3037 51/500 [==>...........................] - ETA: 1:52 - loss: 1.7422 - regression_loss: 1.4401 - classification_loss: 0.3021 52/500 [==>...........................] 
- ETA: 1:51 - loss: 1.7434 - regression_loss: 1.4423 - classification_loss: 0.3011 53/500 [==>...........................] - ETA: 1:51 - loss: 1.7514 - regression_loss: 1.4491 - classification_loss: 0.3023 54/500 [==>...........................] - ETA: 1:50 - loss: 1.7476 - regression_loss: 1.4472 - classification_loss: 0.3004 55/500 [==>...........................] - ETA: 1:50 - loss: 1.7526 - regression_loss: 1.4521 - classification_loss: 0.3005 56/500 [==>...........................] - ETA: 1:50 - loss: 1.7569 - regression_loss: 1.4527 - classification_loss: 0.3042 57/500 [==>...........................] - ETA: 1:49 - loss: 1.7474 - regression_loss: 1.4438 - classification_loss: 0.3037 58/500 [==>...........................] - ETA: 1:49 - loss: 1.7354 - regression_loss: 1.4346 - classification_loss: 0.3007 59/500 [==>...........................] - ETA: 1:49 - loss: 1.7317 - regression_loss: 1.4320 - classification_loss: 0.2997 60/500 [==>...........................] - ETA: 1:49 - loss: 1.7346 - regression_loss: 1.4357 - classification_loss: 0.2989 61/500 [==>...........................] - ETA: 1:49 - loss: 1.7296 - regression_loss: 1.4321 - classification_loss: 0.2976 62/500 [==>...........................] - ETA: 1:48 - loss: 1.7319 - regression_loss: 1.4354 - classification_loss: 0.2966 63/500 [==>...........................] - ETA: 1:48 - loss: 1.7233 - regression_loss: 1.4250 - classification_loss: 0.2983 64/500 [==>...........................] - ETA: 1:48 - loss: 1.7120 - regression_loss: 1.4157 - classification_loss: 0.2963 65/500 [==>...........................] - ETA: 1:48 - loss: 1.7125 - regression_loss: 1.4170 - classification_loss: 0.2955 66/500 [==>...........................] - ETA: 1:48 - loss: 1.7127 - regression_loss: 1.4170 - classification_loss: 0.2957 67/500 [===>..........................] - ETA: 1:47 - loss: 1.7146 - regression_loss: 1.4187 - classification_loss: 0.2958 68/500 [===>..........................] 
- ETA: 1:47 - loss: 1.7231 - regression_loss: 1.4258 - classification_loss: 0.2972 69/500 [===>..........................] - ETA: 1:47 - loss: 1.7269 - regression_loss: 1.4294 - classification_loss: 0.2975 70/500 [===>..........................] - ETA: 1:47 - loss: 1.7180 - regression_loss: 1.4212 - classification_loss: 0.2968 71/500 [===>..........................] - ETA: 1:47 - loss: 1.7049 - regression_loss: 1.4106 - classification_loss: 0.2943 72/500 [===>..........................] - ETA: 1:46 - loss: 1.6931 - regression_loss: 1.4015 - classification_loss: 0.2916 73/500 [===>..........................] - ETA: 1:46 - loss: 1.6953 - regression_loss: 1.4040 - classification_loss: 0.2913 74/500 [===>..........................] - ETA: 1:46 - loss: 1.6967 - regression_loss: 1.4052 - classification_loss: 0.2915 75/500 [===>..........................] - ETA: 1:46 - loss: 1.6962 - regression_loss: 1.4046 - classification_loss: 0.2915 76/500 [===>..........................] - ETA: 1:45 - loss: 1.7057 - regression_loss: 1.4112 - classification_loss: 0.2945 77/500 [===>..........................] - ETA: 1:45 - loss: 1.7091 - regression_loss: 1.4126 - classification_loss: 0.2965 78/500 [===>..........................] - ETA: 1:45 - loss: 1.7064 - regression_loss: 1.4107 - classification_loss: 0.2957 79/500 [===>..........................] - ETA: 1:45 - loss: 1.7015 - regression_loss: 1.4073 - classification_loss: 0.2942 80/500 [===>..........................] - ETA: 1:45 - loss: 1.7043 - regression_loss: 1.4096 - classification_loss: 0.2947 81/500 [===>..........................] - ETA: 1:44 - loss: 1.6997 - regression_loss: 1.4062 - classification_loss: 0.2934 82/500 [===>..........................] - ETA: 1:44 - loss: 1.6864 - regression_loss: 1.3957 - classification_loss: 0.2907 83/500 [===>..........................] - ETA: 1:44 - loss: 1.6879 - regression_loss: 1.3970 - classification_loss: 0.2909 84/500 [====>.........................] 
- ETA: 1:44 - loss: 1.6935 - regression_loss: 1.4006 - classification_loss: 0.2928 85/500 [====>.........................] - ETA: 1:44 - loss: 1.6802 - regression_loss: 1.3900 - classification_loss: 0.2901 86/500 [====>.........................] - ETA: 1:43 - loss: 1.6776 - regression_loss: 1.3884 - classification_loss: 0.2892 87/500 [====>.........................] - ETA: 1:43 - loss: 1.6802 - regression_loss: 1.3911 - classification_loss: 0.2891 88/500 [====>.........................] - ETA: 1:43 - loss: 1.6830 - regression_loss: 1.3920 - classification_loss: 0.2910 89/500 [====>.........................] - ETA: 1:43 - loss: 1.6804 - regression_loss: 1.3898 - classification_loss: 0.2907 90/500 [====>.........................] - ETA: 1:42 - loss: 1.6726 - regression_loss: 1.3829 - classification_loss: 0.2897 91/500 [====>.........................] - ETA: 1:42 - loss: 1.6748 - regression_loss: 1.3849 - classification_loss: 0.2899 92/500 [====>.........................] - ETA: 1:42 - loss: 1.6715 - regression_loss: 1.3823 - classification_loss: 0.2892 93/500 [====>.........................] - ETA: 1:42 - loss: 1.6709 - regression_loss: 1.3823 - classification_loss: 0.2887 94/500 [====>.........................] - ETA: 1:41 - loss: 1.6600 - regression_loss: 1.3734 - classification_loss: 0.2866 95/500 [====>.........................] - ETA: 1:41 - loss: 1.6620 - regression_loss: 1.3750 - classification_loss: 0.2870 96/500 [====>.........................] - ETA: 1:41 - loss: 1.6676 - regression_loss: 1.3793 - classification_loss: 0.2884 97/500 [====>.........................] - ETA: 1:41 - loss: 1.6711 - regression_loss: 1.3819 - classification_loss: 0.2892 98/500 [====>.........................] - ETA: 1:40 - loss: 1.6714 - regression_loss: 1.3821 - classification_loss: 0.2893 99/500 [====>.........................] - ETA: 1:40 - loss: 1.6701 - regression_loss: 1.3816 - classification_loss: 0.2885 100/500 [=====>........................] 
- ETA: 1:40 - loss: 1.6709 - regression_loss: 1.3830 - classification_loss: 0.2879 101/500 [=====>........................] - ETA: 1:40 - loss: 1.6640 - regression_loss: 1.3773 - classification_loss: 0.2868 102/500 [=====>........................] - ETA: 1:39 - loss: 1.6668 - regression_loss: 1.3799 - classification_loss: 0.2869 103/500 [=====>........................] - ETA: 1:39 - loss: 1.6689 - regression_loss: 1.3814 - classification_loss: 0.2875 104/500 [=====>........................] - ETA: 1:39 - loss: 1.6689 - regression_loss: 1.3814 - classification_loss: 0.2876 105/500 [=====>........................] - ETA: 1:39 - loss: 1.6714 - regression_loss: 1.3834 - classification_loss: 0.2880 106/500 [=====>........................] - ETA: 1:38 - loss: 1.6706 - regression_loss: 1.3828 - classification_loss: 0.2878 107/500 [=====>........................] - ETA: 1:38 - loss: 1.6674 - regression_loss: 1.3798 - classification_loss: 0.2876 108/500 [=====>........................] - ETA: 1:38 - loss: 1.6699 - regression_loss: 1.3814 - classification_loss: 0.2885 109/500 [=====>........................] - ETA: 1:37 - loss: 1.6699 - regression_loss: 1.3815 - classification_loss: 0.2884 110/500 [=====>........................] - ETA: 1:37 - loss: 1.6737 - regression_loss: 1.3843 - classification_loss: 0.2894 111/500 [=====>........................] - ETA: 1:37 - loss: 1.6736 - regression_loss: 1.3840 - classification_loss: 0.2897 112/500 [=====>........................] - ETA: 1:37 - loss: 1.6704 - regression_loss: 1.3819 - classification_loss: 0.2885 113/500 [=====>........................] - ETA: 1:37 - loss: 1.6723 - regression_loss: 1.3838 - classification_loss: 0.2885 114/500 [=====>........................] - ETA: 1:36 - loss: 1.6728 - regression_loss: 1.3841 - classification_loss: 0.2887 115/500 [=====>........................] - ETA: 1:36 - loss: 1.6776 - regression_loss: 1.3879 - classification_loss: 0.2898 116/500 [=====>........................] 
- ETA: 1:36 - loss: 1.6707 - regression_loss: 1.3824 - classification_loss: 0.2883 117/500 [======>.......................] - ETA: 1:36 - loss: 1.6656 - regression_loss: 1.3781 - classification_loss: 0.2875 118/500 [======>.......................] - ETA: 1:35 - loss: 1.6618 - regression_loss: 1.3753 - classification_loss: 0.2865 119/500 [======>.......................] - ETA: 1:35 - loss: 1.6601 - regression_loss: 1.3744 - classification_loss: 0.2857 120/500 [======>.......................] - ETA: 1:35 - loss: 1.6643 - regression_loss: 1.3782 - classification_loss: 0.2861 121/500 [======>.......................] - ETA: 1:35 - loss: 1.6613 - regression_loss: 1.3758 - classification_loss: 0.2855 122/500 [======>.......................] - ETA: 1:34 - loss: 1.6550 - regression_loss: 1.3708 - classification_loss: 0.2842 123/500 [======>.......................] - ETA: 1:34 - loss: 1.6607 - regression_loss: 1.3747 - classification_loss: 0.2860 124/500 [======>.......................] - ETA: 1:34 - loss: 1.6549 - regression_loss: 1.3688 - classification_loss: 0.2861 125/500 [======>.......................] - ETA: 1:34 - loss: 1.6579 - regression_loss: 1.3710 - classification_loss: 0.2869 126/500 [======>.......................] - ETA: 1:33 - loss: 1.6606 - regression_loss: 1.3729 - classification_loss: 0.2876 127/500 [======>.......................] - ETA: 1:33 - loss: 1.6654 - regression_loss: 1.3781 - classification_loss: 0.2873 128/500 [======>.......................] - ETA: 1:33 - loss: 1.6720 - regression_loss: 1.3832 - classification_loss: 0.2887 129/500 [======>.......................] - ETA: 1:33 - loss: 1.6718 - regression_loss: 1.3831 - classification_loss: 0.2887 130/500 [======>.......................] - ETA: 1:32 - loss: 1.6697 - regression_loss: 1.3807 - classification_loss: 0.2890 131/500 [======>.......................] - ETA: 1:32 - loss: 1.6721 - regression_loss: 1.3827 - classification_loss: 0.2895 132/500 [======>.......................] 
- ETA: 1:32 - loss: 1.6762 - regression_loss: 1.3857 - classification_loss: 0.2905 133/500 [======>.......................] - ETA: 1:32 - loss: 1.6704 - regression_loss: 1.3813 - classification_loss: 0.2892 134/500 [=======>......................] - ETA: 1:31 - loss: 1.6712 - regression_loss: 1.3821 - classification_loss: 0.2891 135/500 [=======>......................] - ETA: 1:31 - loss: 1.6713 - regression_loss: 1.3826 - classification_loss: 0.2887 136/500 [=======>......................] - ETA: 1:31 - loss: 1.6731 - regression_loss: 1.3845 - classification_loss: 0.2886 137/500 [=======>......................] - ETA: 1:31 - loss: 1.6736 - regression_loss: 1.3847 - classification_loss: 0.2889 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6741 - regression_loss: 1.3850 - classification_loss: 0.2891 139/500 [=======>......................] - ETA: 1:30 - loss: 1.6795 - regression_loss: 1.3883 - classification_loss: 0.2912 140/500 [=======>......................] - ETA: 1:30 - loss: 1.6770 - regression_loss: 1.3860 - classification_loss: 0.2911 141/500 [=======>......................] - ETA: 1:30 - loss: 1.6746 - regression_loss: 1.3841 - classification_loss: 0.2905 142/500 [=======>......................] - ETA: 1:29 - loss: 1.6745 - regression_loss: 1.3829 - classification_loss: 0.2916 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6682 - regression_loss: 1.3778 - classification_loss: 0.2904 144/500 [=======>......................] - ETA: 1:29 - loss: 1.6694 - regression_loss: 1.3786 - classification_loss: 0.2908 145/500 [=======>......................] - ETA: 1:29 - loss: 1.6738 - regression_loss: 1.3820 - classification_loss: 0.2918 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6738 - regression_loss: 1.3822 - classification_loss: 0.2916 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6694 - regression_loss: 1.3787 - classification_loss: 0.2907 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.6726 - regression_loss: 1.3810 - classification_loss: 0.2915 149/500 [=======>......................] - ETA: 1:28 - loss: 1.6709 - regression_loss: 1.3798 - classification_loss: 0.2911 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6735 - regression_loss: 1.3820 - classification_loss: 0.2915 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6785 - regression_loss: 1.3862 - classification_loss: 0.2923 152/500 [========>.....................] - ETA: 1:27 - loss: 1.6771 - regression_loss: 1.3847 - classification_loss: 0.2924 153/500 [========>.....................] - ETA: 1:27 - loss: 1.6802 - regression_loss: 1.3872 - classification_loss: 0.2930 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6764 - regression_loss: 1.3841 - classification_loss: 0.2923 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6754 - regression_loss: 1.3835 - classification_loss: 0.2919 156/500 [========>.....................] - ETA: 1:26 - loss: 1.6740 - regression_loss: 1.3824 - classification_loss: 0.2916 157/500 [========>.....................] - ETA: 1:26 - loss: 1.6768 - regression_loss: 1.3849 - classification_loss: 0.2918 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6781 - regression_loss: 1.3860 - classification_loss: 0.2921 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6799 - regression_loss: 1.3878 - classification_loss: 0.2921 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6808 - regression_loss: 1.3887 - classification_loss: 0.2921 161/500 [========>.....................] - ETA: 1:25 - loss: 1.6790 - regression_loss: 1.3874 - classification_loss: 0.2916 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6804 - regression_loss: 1.3886 - classification_loss: 0.2918 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6822 - regression_loss: 1.3903 - classification_loss: 0.2920 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.6831 - regression_loss: 1.3909 - classification_loss: 0.2922 165/500 [========>.....................] - ETA: 1:24 - loss: 1.6827 - regression_loss: 1.3909 - classification_loss: 0.2918 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6840 - regression_loss: 1.3924 - classification_loss: 0.2916 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6855 - regression_loss: 1.3933 - classification_loss: 0.2922 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6809 - regression_loss: 1.3894 - classification_loss: 0.2915 169/500 [=========>....................] - ETA: 1:23 - loss: 1.6789 - regression_loss: 1.3879 - classification_loss: 0.2910 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6804 - regression_loss: 1.3890 - classification_loss: 0.2915 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6810 - regression_loss: 1.3892 - classification_loss: 0.2918 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6818 - regression_loss: 1.3896 - classification_loss: 0.2923 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6805 - regression_loss: 1.3888 - classification_loss: 0.2917 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6750 - regression_loss: 1.3845 - classification_loss: 0.2906 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6744 - regression_loss: 1.3839 - classification_loss: 0.2905 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6765 - regression_loss: 1.3855 - classification_loss: 0.2910 177/500 [=========>....................] - ETA: 1:21 - loss: 1.6732 - regression_loss: 1.3827 - classification_loss: 0.2906 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6752 - regression_loss: 1.3847 - classification_loss: 0.2905 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6783 - regression_loss: 1.3874 - classification_loss: 0.2909 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.6762 - regression_loss: 1.3858 - classification_loss: 0.2905 181/500 [=========>....................] - ETA: 1:20 - loss: 1.6752 - regression_loss: 1.3849 - classification_loss: 0.2904 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6758 - regression_loss: 1.3853 - classification_loss: 0.2906 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6770 - regression_loss: 1.3865 - classification_loss: 0.2906 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6746 - regression_loss: 1.3847 - classification_loss: 0.2899 185/500 [==========>...................] - ETA: 1:19 - loss: 1.6744 - regression_loss: 1.3847 - classification_loss: 0.2897 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6757 - regression_loss: 1.3856 - classification_loss: 0.2901 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6759 - regression_loss: 1.3859 - classification_loss: 0.2900 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6802 - regression_loss: 1.3893 - classification_loss: 0.2909 189/500 [==========>...................] - ETA: 1:18 - loss: 1.6811 - regression_loss: 1.3903 - classification_loss: 0.2908 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6818 - regression_loss: 1.3908 - classification_loss: 0.2910 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6822 - regression_loss: 1.3910 - classification_loss: 0.2912 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6835 - regression_loss: 1.3921 - classification_loss: 0.2914 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6855 - regression_loss: 1.3941 - classification_loss: 0.2914 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6850 - regression_loss: 1.3938 - classification_loss: 0.2912 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6867 - regression_loss: 1.3947 - classification_loss: 0.2920 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.6881 - regression_loss: 1.3961 - classification_loss: 0.2920 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6879 - regression_loss: 1.3961 - classification_loss: 0.2918 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6907 - regression_loss: 1.3982 - classification_loss: 0.2924 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6918 - regression_loss: 1.3992 - classification_loss: 0.2925 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6930 - regression_loss: 1.4002 - classification_loss: 0.2928 201/500 [===========>..................] - ETA: 1:15 - loss: 1.6901 - regression_loss: 1.3972 - classification_loss: 0.2929 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6907 - regression_loss: 1.3979 - classification_loss: 0.2928 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6917 - regression_loss: 1.3989 - classification_loss: 0.2928 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6904 - regression_loss: 1.3977 - classification_loss: 0.2927 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6929 - regression_loss: 1.3999 - classification_loss: 0.2929 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6907 - regression_loss: 1.3985 - classification_loss: 0.2922 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6919 - regression_loss: 1.3993 - classification_loss: 0.2926 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6962 - regression_loss: 1.4027 - classification_loss: 0.2935 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6973 - regression_loss: 1.4035 - classification_loss: 0.2937 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6987 - regression_loss: 1.4045 - classification_loss: 0.2942 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6990 - regression_loss: 1.4050 - classification_loss: 0.2940 212/500 [===========>..................] 
500/500 [==============================] - 126s 251ms/step - loss: 1.6921 - regression_loss: 1.4001 - classification_loss: 0.2920
1172 instances of class plum with average precision: 0.5947
mAP: 0.5947
Epoch 00072: saving model to ./training/snapshots/resnet50_pascal_72.h5
Epoch 73/150
- ETA: 1:53 - loss: 1.7792 - regression_loss: 1.4710 - classification_loss: 0.3081 47/500 [=>............................] - ETA: 1:52 - loss: 1.7761 - regression_loss: 1.4698 - classification_loss: 0.3063 48/500 [=>............................] - ETA: 1:52 - loss: 1.7816 - regression_loss: 1.4733 - classification_loss: 0.3083 49/500 [=>............................] - ETA: 1:52 - loss: 1.7818 - regression_loss: 1.4738 - classification_loss: 0.3080 50/500 [==>...........................] - ETA: 1:52 - loss: 1.7863 - regression_loss: 1.4759 - classification_loss: 0.3104 51/500 [==>...........................] - ETA: 1:52 - loss: 1.7812 - regression_loss: 1.4730 - classification_loss: 0.3082 52/500 [==>...........................] - ETA: 1:51 - loss: 1.7830 - regression_loss: 1.4735 - classification_loss: 0.3095 53/500 [==>...........................] - ETA: 1:51 - loss: 1.7969 - regression_loss: 1.4844 - classification_loss: 0.3125 54/500 [==>...........................] - ETA: 1:51 - loss: 1.7856 - regression_loss: 1.4761 - classification_loss: 0.3095 55/500 [==>...........................] - ETA: 1:50 - loss: 1.7852 - regression_loss: 1.4761 - classification_loss: 0.3090 56/500 [==>...........................] - ETA: 1:50 - loss: 1.7718 - regression_loss: 1.4653 - classification_loss: 0.3065 57/500 [==>...........................] - ETA: 1:50 - loss: 1.7740 - regression_loss: 1.4683 - classification_loss: 0.3057 58/500 [==>...........................] - ETA: 1:50 - loss: 1.7650 - regression_loss: 1.4623 - classification_loss: 0.3028 59/500 [==>...........................] - ETA: 1:50 - loss: 1.7603 - regression_loss: 1.4586 - classification_loss: 0.3017 60/500 [==>...........................] - ETA: 1:49 - loss: 1.7620 - regression_loss: 1.4607 - classification_loss: 0.3013 61/500 [==>...........................] - ETA: 1:49 - loss: 1.7724 - regression_loss: 1.4683 - classification_loss: 0.3041 62/500 [==>...........................] 
- ETA: 1:49 - loss: 1.7766 - regression_loss: 1.4700 - classification_loss: 0.3066 63/500 [==>...........................] - ETA: 1:49 - loss: 1.7806 - regression_loss: 1.4724 - classification_loss: 0.3081 64/500 [==>...........................] - ETA: 1:48 - loss: 1.7818 - regression_loss: 1.4729 - classification_loss: 0.3090 65/500 [==>...........................] - ETA: 1:48 - loss: 1.7684 - regression_loss: 1.4627 - classification_loss: 0.3057 66/500 [==>...........................] - ETA: 1:48 - loss: 1.7681 - regression_loss: 1.4629 - classification_loss: 0.3053 67/500 [===>..........................] - ETA: 1:48 - loss: 1.7644 - regression_loss: 1.4611 - classification_loss: 0.3033 68/500 [===>..........................] - ETA: 1:47 - loss: 1.7601 - regression_loss: 1.4584 - classification_loss: 0.3017 69/500 [===>..........................] - ETA: 1:47 - loss: 1.7599 - regression_loss: 1.4568 - classification_loss: 0.3031 70/500 [===>..........................] - ETA: 1:47 - loss: 1.7604 - regression_loss: 1.4578 - classification_loss: 0.3026 71/500 [===>..........................] - ETA: 1:47 - loss: 1.7682 - regression_loss: 1.4634 - classification_loss: 0.3048 72/500 [===>..........................] - ETA: 1:46 - loss: 1.7662 - regression_loss: 1.4619 - classification_loss: 0.3043 73/500 [===>..........................] - ETA: 1:46 - loss: 1.7633 - regression_loss: 1.4594 - classification_loss: 0.3039 74/500 [===>..........................] - ETA: 1:46 - loss: 1.7647 - regression_loss: 1.4608 - classification_loss: 0.3039 75/500 [===>..........................] - ETA: 1:46 - loss: 1.7643 - regression_loss: 1.4607 - classification_loss: 0.3036 76/500 [===>..........................] - ETA: 1:45 - loss: 1.7686 - regression_loss: 1.4643 - classification_loss: 0.3042 77/500 [===>..........................] - ETA: 1:45 - loss: 1.7692 - regression_loss: 1.4649 - classification_loss: 0.3042 78/500 [===>..........................] 
- ETA: 1:45 - loss: 1.7678 - regression_loss: 1.4647 - classification_loss: 0.3031 79/500 [===>..........................] - ETA: 1:45 - loss: 1.7699 - regression_loss: 1.4670 - classification_loss: 0.3029 80/500 [===>..........................] - ETA: 1:44 - loss: 1.7667 - regression_loss: 1.4647 - classification_loss: 0.3019 81/500 [===>..........................] - ETA: 1:44 - loss: 1.7710 - regression_loss: 1.4684 - classification_loss: 0.3027 82/500 [===>..........................] - ETA: 1:44 - loss: 1.7777 - regression_loss: 1.4732 - classification_loss: 0.3046 83/500 [===>..........................] - ETA: 1:43 - loss: 1.7759 - regression_loss: 1.4720 - classification_loss: 0.3039 84/500 [====>.........................] - ETA: 1:43 - loss: 1.7684 - regression_loss: 1.4657 - classification_loss: 0.3027 85/500 [====>.........................] - ETA: 1:43 - loss: 1.7669 - regression_loss: 1.4635 - classification_loss: 0.3034 86/500 [====>.........................] - ETA: 1:42 - loss: 1.7653 - regression_loss: 1.4623 - classification_loss: 0.3030 87/500 [====>.........................] - ETA: 1:42 - loss: 1.7685 - regression_loss: 1.4651 - classification_loss: 0.3034 88/500 [====>.........................] - ETA: 1:42 - loss: 1.7701 - regression_loss: 1.4662 - classification_loss: 0.3039 89/500 [====>.........................] - ETA: 1:42 - loss: 1.7616 - regression_loss: 1.4595 - classification_loss: 0.3021 90/500 [====>.........................] - ETA: 1:42 - loss: 1.7631 - regression_loss: 1.4614 - classification_loss: 0.3017 91/500 [====>.........................] - ETA: 1:41 - loss: 1.7576 - regression_loss: 1.4563 - classification_loss: 0.3014 92/500 [====>.........................] - ETA: 1:41 - loss: 1.7616 - regression_loss: 1.4597 - classification_loss: 0.3020 93/500 [====>.........................] - ETA: 1:41 - loss: 1.7495 - regression_loss: 1.4500 - classification_loss: 0.2995 94/500 [====>.........................] 
- ETA: 1:41 - loss: 1.7491 - regression_loss: 1.4499 - classification_loss: 0.2991 95/500 [====>.........................] - ETA: 1:40 - loss: 1.7513 - regression_loss: 1.4517 - classification_loss: 0.2997 96/500 [====>.........................] - ETA: 1:40 - loss: 1.7439 - regression_loss: 1.4456 - classification_loss: 0.2983 97/500 [====>.........................] - ETA: 1:40 - loss: 1.7484 - regression_loss: 1.4496 - classification_loss: 0.2988 98/500 [====>.........................] - ETA: 1:40 - loss: 1.7566 - regression_loss: 1.4575 - classification_loss: 0.2991 99/500 [====>.........................] - ETA: 1:39 - loss: 1.7549 - regression_loss: 1.4561 - classification_loss: 0.2988 100/500 [=====>........................] - ETA: 1:39 - loss: 1.7523 - regression_loss: 1.4542 - classification_loss: 0.2981 101/500 [=====>........................] - ETA: 1:39 - loss: 1.7472 - regression_loss: 1.4504 - classification_loss: 0.2968 102/500 [=====>........................] - ETA: 1:39 - loss: 1.7473 - regression_loss: 1.4502 - classification_loss: 0.2970 103/500 [=====>........................] - ETA: 1:39 - loss: 1.7574 - regression_loss: 1.4599 - classification_loss: 0.2975 104/500 [=====>........................] - ETA: 1:38 - loss: 1.7586 - regression_loss: 1.4610 - classification_loss: 0.2976 105/500 [=====>........................] - ETA: 1:38 - loss: 1.7594 - regression_loss: 1.4611 - classification_loss: 0.2983 106/500 [=====>........................] - ETA: 1:38 - loss: 1.7559 - regression_loss: 1.4585 - classification_loss: 0.2974 107/500 [=====>........................] - ETA: 1:38 - loss: 1.7535 - regression_loss: 1.4564 - classification_loss: 0.2972 108/500 [=====>........................] - ETA: 1:37 - loss: 1.7547 - regression_loss: 1.4574 - classification_loss: 0.2973 109/500 [=====>........................] - ETA: 1:37 - loss: 1.7560 - regression_loss: 1.4587 - classification_loss: 0.2973 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.7535 - regression_loss: 1.4567 - classification_loss: 0.2968 111/500 [=====>........................] - ETA: 1:37 - loss: 1.7503 - regression_loss: 1.4542 - classification_loss: 0.2961 112/500 [=====>........................] - ETA: 1:36 - loss: 1.7430 - regression_loss: 1.4478 - classification_loss: 0.2952 113/500 [=====>........................] - ETA: 1:36 - loss: 1.7405 - regression_loss: 1.4459 - classification_loss: 0.2947 114/500 [=====>........................] - ETA: 1:36 - loss: 1.7407 - regression_loss: 1.4461 - classification_loss: 0.2946 115/500 [=====>........................] - ETA: 1:36 - loss: 1.7405 - regression_loss: 1.4462 - classification_loss: 0.2943 116/500 [=====>........................] - ETA: 1:35 - loss: 1.7365 - regression_loss: 1.4430 - classification_loss: 0.2935 117/500 [======>.......................] - ETA: 1:35 - loss: 1.7312 - regression_loss: 1.4385 - classification_loss: 0.2927 118/500 [======>.......................] - ETA: 1:35 - loss: 1.7307 - regression_loss: 1.4381 - classification_loss: 0.2927 119/500 [======>.......................] - ETA: 1:35 - loss: 1.7256 - regression_loss: 1.4339 - classification_loss: 0.2917 120/500 [======>.......................] - ETA: 1:35 - loss: 1.7222 - regression_loss: 1.4303 - classification_loss: 0.2919 121/500 [======>.......................] - ETA: 1:34 - loss: 1.7264 - regression_loss: 1.4335 - classification_loss: 0.2928 122/500 [======>.......................] - ETA: 1:34 - loss: 1.7271 - regression_loss: 1.4345 - classification_loss: 0.2926 123/500 [======>.......................] - ETA: 1:34 - loss: 1.7266 - regression_loss: 1.4347 - classification_loss: 0.2919 124/500 [======>.......................] - ETA: 1:34 - loss: 1.7241 - regression_loss: 1.4331 - classification_loss: 0.2911 125/500 [======>.......................] - ETA: 1:33 - loss: 1.7222 - regression_loss: 1.4319 - classification_loss: 0.2903 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.7204 - regression_loss: 1.4308 - classification_loss: 0.2896 127/500 [======>.......................] - ETA: 1:33 - loss: 1.7189 - regression_loss: 1.4296 - classification_loss: 0.2893 128/500 [======>.......................] - ETA: 1:33 - loss: 1.7182 - regression_loss: 1.4285 - classification_loss: 0.2897 129/500 [======>.......................] - ETA: 1:32 - loss: 1.7196 - regression_loss: 1.4296 - classification_loss: 0.2900 130/500 [======>.......................] - ETA: 1:32 - loss: 1.7208 - regression_loss: 1.4299 - classification_loss: 0.2908 131/500 [======>.......................] - ETA: 1:32 - loss: 1.7182 - regression_loss: 1.4274 - classification_loss: 0.2908 132/500 [======>.......................] - ETA: 1:32 - loss: 1.7204 - regression_loss: 1.4293 - classification_loss: 0.2911 133/500 [======>.......................] - ETA: 1:31 - loss: 1.7217 - regression_loss: 1.4304 - classification_loss: 0.2912 134/500 [=======>......................] - ETA: 1:31 - loss: 1.7193 - regression_loss: 1.4285 - classification_loss: 0.2908 135/500 [=======>......................] - ETA: 1:31 - loss: 1.7238 - regression_loss: 1.4319 - classification_loss: 0.2920 136/500 [=======>......................] - ETA: 1:31 - loss: 1.7248 - regression_loss: 1.4332 - classification_loss: 0.2915 137/500 [=======>......................] - ETA: 1:30 - loss: 1.7272 - regression_loss: 1.4350 - classification_loss: 0.2922 138/500 [=======>......................] - ETA: 1:30 - loss: 1.7271 - regression_loss: 1.4349 - classification_loss: 0.2922 139/500 [=======>......................] - ETA: 1:30 - loss: 1.7232 - regression_loss: 1.4313 - classification_loss: 0.2919 140/500 [=======>......................] - ETA: 1:30 - loss: 1.7255 - regression_loss: 1.4336 - classification_loss: 0.2919 141/500 [=======>......................] - ETA: 1:29 - loss: 1.7315 - regression_loss: 1.4385 - classification_loss: 0.2930 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.7311 - regression_loss: 1.4379 - classification_loss: 0.2932 143/500 [=======>......................] - ETA: 1:29 - loss: 1.7284 - regression_loss: 1.4358 - classification_loss: 0.2926 144/500 [=======>......................] - ETA: 1:29 - loss: 1.7306 - regression_loss: 1.4380 - classification_loss: 0.2927 145/500 [=======>......................] - ETA: 1:28 - loss: 1.7338 - regression_loss: 1.4405 - classification_loss: 0.2932 146/500 [=======>......................] - ETA: 1:28 - loss: 1.7341 - regression_loss: 1.4409 - classification_loss: 0.2932 147/500 [=======>......................] - ETA: 1:28 - loss: 1.7322 - regression_loss: 1.4395 - classification_loss: 0.2927 148/500 [=======>......................] - ETA: 1:28 - loss: 1.7300 - regression_loss: 1.4380 - classification_loss: 0.2920 149/500 [=======>......................] - ETA: 1:27 - loss: 1.7335 - regression_loss: 1.4408 - classification_loss: 0.2927 150/500 [========>.....................] - ETA: 1:27 - loss: 1.7340 - regression_loss: 1.4415 - classification_loss: 0.2924 151/500 [========>.....................] - ETA: 1:27 - loss: 1.7338 - regression_loss: 1.4412 - classification_loss: 0.2926 152/500 [========>.....................] - ETA: 1:27 - loss: 1.7352 - regression_loss: 1.4422 - classification_loss: 0.2930 153/500 [========>.....................] - ETA: 1:26 - loss: 1.7309 - regression_loss: 1.4386 - classification_loss: 0.2923 154/500 [========>.....................] - ETA: 1:26 - loss: 1.7255 - regression_loss: 1.4343 - classification_loss: 0.2912 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7243 - regression_loss: 1.4334 - classification_loss: 0.2908 156/500 [========>.....................] - ETA: 1:26 - loss: 1.7250 - regression_loss: 1.4338 - classification_loss: 0.2912 157/500 [========>.....................] - ETA: 1:25 - loss: 1.7243 - regression_loss: 1.4333 - classification_loss: 0.2910 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.7229 - regression_loss: 1.4316 - classification_loss: 0.2914 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7196 - regression_loss: 1.4285 - classification_loss: 0.2911 160/500 [========>.....................] - ETA: 1:25 - loss: 1.7190 - regression_loss: 1.4277 - classification_loss: 0.2914 161/500 [========>.....................] - ETA: 1:24 - loss: 1.7171 - regression_loss: 1.4261 - classification_loss: 0.2910 162/500 [========>.....................] - ETA: 1:24 - loss: 1.7099 - regression_loss: 1.4202 - classification_loss: 0.2898 163/500 [========>.....................] - ETA: 1:24 - loss: 1.7104 - regression_loss: 1.4201 - classification_loss: 0.2904 164/500 [========>.....................] - ETA: 1:24 - loss: 1.7105 - regression_loss: 1.4203 - classification_loss: 0.2902 165/500 [========>.....................] - ETA: 1:23 - loss: 1.7068 - regression_loss: 1.4173 - classification_loss: 0.2895 166/500 [========>.....................] - ETA: 1:23 - loss: 1.7062 - regression_loss: 1.4172 - classification_loss: 0.2890 167/500 [=========>....................] - ETA: 1:23 - loss: 1.7063 - regression_loss: 1.4173 - classification_loss: 0.2890 168/500 [=========>....................] - ETA: 1:23 - loss: 1.7064 - regression_loss: 1.4171 - classification_loss: 0.2892 169/500 [=========>....................] - ETA: 1:22 - loss: 1.7031 - regression_loss: 1.4138 - classification_loss: 0.2893 170/500 [=========>....................] - ETA: 1:22 - loss: 1.7028 - regression_loss: 1.4135 - classification_loss: 0.2892 171/500 [=========>....................] - ETA: 1:22 - loss: 1.7060 - regression_loss: 1.4162 - classification_loss: 0.2897 172/500 [=========>....................] - ETA: 1:22 - loss: 1.7069 - regression_loss: 1.4168 - classification_loss: 0.2901 173/500 [=========>....................] - ETA: 1:21 - loss: 1.7063 - regression_loss: 1.4161 - classification_loss: 0.2902 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.7006 - regression_loss: 1.4115 - classification_loss: 0.2891 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7025 - regression_loss: 1.4133 - classification_loss: 0.2892 176/500 [=========>....................] - ETA: 1:21 - loss: 1.7062 - regression_loss: 1.4164 - classification_loss: 0.2898 177/500 [=========>....................] - ETA: 1:20 - loss: 1.7075 - regression_loss: 1.4180 - classification_loss: 0.2894 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7078 - regression_loss: 1.4184 - classification_loss: 0.2894 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7086 - regression_loss: 1.4191 - classification_loss: 0.2895 180/500 [=========>....................] - ETA: 1:20 - loss: 1.7093 - regression_loss: 1.4197 - classification_loss: 0.2896 181/500 [=========>....................] - ETA: 1:19 - loss: 1.7089 - regression_loss: 1.4195 - classification_loss: 0.2894 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7111 - regression_loss: 1.4206 - classification_loss: 0.2905 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7114 - regression_loss: 1.4210 - classification_loss: 0.2904 184/500 [==========>...................] - ETA: 1:19 - loss: 1.7126 - regression_loss: 1.4222 - classification_loss: 0.2903 185/500 [==========>...................] - ETA: 1:18 - loss: 1.7072 - regression_loss: 1.4178 - classification_loss: 0.2894 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7093 - regression_loss: 1.4199 - classification_loss: 0.2894 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7103 - regression_loss: 1.4209 - classification_loss: 0.2894 188/500 [==========>...................] - ETA: 1:18 - loss: 1.7101 - regression_loss: 1.4210 - classification_loss: 0.2891 189/500 [==========>...................] - ETA: 1:17 - loss: 1.7165 - regression_loss: 1.4270 - classification_loss: 0.2895 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.7212 - regression_loss: 1.4272 - classification_loss: 0.2940 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7168 - regression_loss: 1.4236 - classification_loss: 0.2932 192/500 [==========>...................] - ETA: 1:17 - loss: 1.7177 - regression_loss: 1.4247 - classification_loss: 0.2930 193/500 [==========>...................] - ETA: 1:16 - loss: 1.7181 - regression_loss: 1.4251 - classification_loss: 0.2930 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7125 - regression_loss: 1.4205 - classification_loss: 0.2920 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7169 - regression_loss: 1.4242 - classification_loss: 0.2926 196/500 [==========>...................] - ETA: 1:16 - loss: 1.7184 - regression_loss: 1.4255 - classification_loss: 0.2929 197/500 [==========>...................] - ETA: 1:15 - loss: 1.7191 - regression_loss: 1.4259 - classification_loss: 0.2932 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7215 - regression_loss: 1.4285 - classification_loss: 0.2930 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7230 - regression_loss: 1.4297 - classification_loss: 0.2934 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7238 - regression_loss: 1.4308 - classification_loss: 0.2931 201/500 [===========>..................] - ETA: 1:14 - loss: 1.7227 - regression_loss: 1.4279 - classification_loss: 0.2948 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7213 - regression_loss: 1.4269 - classification_loss: 0.2945 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7202 - regression_loss: 1.4261 - classification_loss: 0.2941 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7178 - regression_loss: 1.4240 - classification_loss: 0.2938 205/500 [===========>..................] - ETA: 1:13 - loss: 1.7196 - regression_loss: 1.4258 - classification_loss: 0.2938 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.7202 - regression_loss: 1.4263 - classification_loss: 0.2939 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7252 - regression_loss: 1.4303 - classification_loss: 0.2949 208/500 [===========>..................] - ETA: 1:13 - loss: 1.7280 - regression_loss: 1.4325 - classification_loss: 0.2955 209/500 [===========>..................] - ETA: 1:13 - loss: 1.7272 - regression_loss: 1.4319 - classification_loss: 0.2953 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7298 - regression_loss: 1.4337 - classification_loss: 0.2961 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7304 - regression_loss: 1.4342 - classification_loss: 0.2962 212/500 [===========>..................] - ETA: 1:12 - loss: 1.7316 - regression_loss: 1.4351 - classification_loss: 0.2965 213/500 [===========>..................] - ETA: 1:12 - loss: 1.7327 - regression_loss: 1.4358 - classification_loss: 0.2969 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7360 - regression_loss: 1.4384 - classification_loss: 0.2976 215/500 [===========>..................] - ETA: 1:11 - loss: 1.7353 - regression_loss: 1.4379 - classification_loss: 0.2974 216/500 [===========>..................] - ETA: 1:11 - loss: 1.7355 - regression_loss: 1.4381 - classification_loss: 0.2973 217/500 [============>.................] - ETA: 1:11 - loss: 1.7351 - regression_loss: 1.4378 - classification_loss: 0.2973 218/500 [============>.................] - ETA: 1:10 - loss: 1.7345 - regression_loss: 1.4371 - classification_loss: 0.2975 219/500 [============>.................] - ETA: 1:10 - loss: 1.7319 - regression_loss: 1.4346 - classification_loss: 0.2973 220/500 [============>.................] - ETA: 1:10 - loss: 1.7330 - regression_loss: 1.4357 - classification_loss: 0.2973 221/500 [============>.................] - ETA: 1:10 - loss: 1.7326 - regression_loss: 1.4356 - classification_loss: 0.2969 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.7330 - regression_loss: 1.4361 - classification_loss: 0.2969 223/500 [============>.................] - ETA: 1:09 - loss: 1.7313 - regression_loss: 1.4346 - classification_loss: 0.2967 224/500 [============>.................] - ETA: 1:09 - loss: 1.7296 - regression_loss: 1.4335 - classification_loss: 0.2962 225/500 [============>.................] - ETA: 1:09 - loss: 1.7296 - regression_loss: 1.4336 - classification_loss: 0.2960 226/500 [============>.................] - ETA: 1:08 - loss: 1.7302 - regression_loss: 1.4343 - classification_loss: 0.2959 227/500 [============>.................] - ETA: 1:08 - loss: 1.7291 - regression_loss: 1.4336 - classification_loss: 0.2954 228/500 [============>.................] - ETA: 1:08 - loss: 1.7302 - regression_loss: 1.4346 - classification_loss: 0.2956 229/500 [============>.................] - ETA: 1:08 - loss: 1.7314 - regression_loss: 1.4358 - classification_loss: 0.2956 230/500 [============>.................] - ETA: 1:07 - loss: 1.7292 - regression_loss: 1.4334 - classification_loss: 0.2958 231/500 [============>.................] - ETA: 1:07 - loss: 1.7296 - regression_loss: 1.4339 - classification_loss: 0.2957 232/500 [============>.................] - ETA: 1:07 - loss: 1.7297 - regression_loss: 1.4339 - classification_loss: 0.2958 233/500 [============>.................] - ETA: 1:06 - loss: 1.7299 - regression_loss: 1.4341 - classification_loss: 0.2958 234/500 [=============>................] - ETA: 1:06 - loss: 1.7301 - regression_loss: 1.4338 - classification_loss: 0.2963 235/500 [=============>................] - ETA: 1:06 - loss: 1.7342 - regression_loss: 1.4371 - classification_loss: 0.2970 236/500 [=============>................] - ETA: 1:06 - loss: 1.7351 - regression_loss: 1.4381 - classification_loss: 0.2970 237/500 [=============>................] - ETA: 1:05 - loss: 1.7350 - regression_loss: 1.4381 - classification_loss: 0.2969 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.7358 - regression_loss: 1.4389 - classification_loss: 0.2968 239/500 [=============>................] - ETA: 1:05 - loss: 1.7322 - regression_loss: 1.4362 - classification_loss: 0.2960 240/500 [=============>................] - ETA: 1:05 - loss: 1.7328 - regression_loss: 1.4366 - classification_loss: 0.2961 241/500 [=============>................] - ETA: 1:04 - loss: 1.7330 - regression_loss: 1.4368 - classification_loss: 0.2963 242/500 [=============>................] - ETA: 1:04 - loss: 1.7334 - regression_loss: 1.4369 - classification_loss: 0.2964 243/500 [=============>................] - ETA: 1:04 - loss: 1.7294 - regression_loss: 1.4337 - classification_loss: 0.2958 244/500 [=============>................] - ETA: 1:04 - loss: 1.7294 - regression_loss: 1.4338 - classification_loss: 0.2956 245/500 [=============>................] - ETA: 1:03 - loss: 1.7250 - regression_loss: 1.4303 - classification_loss: 0.2947 246/500 [=============>................] - ETA: 1:03 - loss: 1.7254 - regression_loss: 1.4304 - classification_loss: 0.2950 247/500 [=============>................] - ETA: 1:03 - loss: 1.7261 - regression_loss: 1.4309 - classification_loss: 0.2951 248/500 [=============>................] - ETA: 1:03 - loss: 1.7262 - regression_loss: 1.4310 - classification_loss: 0.2952 249/500 [=============>................] - ETA: 1:03 - loss: 1.7260 - regression_loss: 1.4310 - classification_loss: 0.2950 250/500 [==============>...............] - ETA: 1:02 - loss: 1.7269 - regression_loss: 1.4319 - classification_loss: 0.2950 251/500 [==============>...............] - ETA: 1:02 - loss: 1.7247 - regression_loss: 1.4299 - classification_loss: 0.2948 252/500 [==============>...............] - ETA: 1:02 - loss: 1.7228 - regression_loss: 1.4283 - classification_loss: 0.2945 253/500 [==============>...............] - ETA: 1:02 - loss: 1.7240 - regression_loss: 1.4294 - classification_loss: 0.2946 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.7229 - regression_loss: 1.4286 - classification_loss: 0.2944 [... per-step progress-bar output for steps 255–493 of epoch 73 omitted: loss drifted from 1.7203 down to 1.6976 (regression_loss ≈ 1.41, classification_loss ≈ 0.29 throughout) ...] 494/500 [============================>.]
- ETA: 1s - loss: 1.6974 - regression_loss: 1.4079 - classification_loss: 0.2895 [... per-step progress-bar output for steps 495–499 of epoch 73 omitted ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.6968 - regression_loss: 1.4074 - classification_loss: 0.2894
1172 instances of class plum with average precision: 0.6307
mAP: 0.6307
Epoch 00073: saving model to ./training/snapshots/resnet50_pascal_73.h5
Epoch 74/150
[... per-step progress-bar output for steps 1–8 of epoch 74 omitted: loss fell from 2.3068 to 1.9132 as the running average accumulated ...] 9/500 [..............................]
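The epoch summary above reports three quantities; in keras-retinanet the total `loss` is the sum of the box-regression and classification components. A minimal sanity check against the epoch-73 figures copied from this log (note that Keras prints each running average rounded to four decimals, so the sum can occasionally differ from the printed total in the last digit):

```python
# Values copied from the "500/500" epoch-73 summary line above.
regression_loss = 1.4074
classification_loss = 0.2894

# The total reported by the progress bar is the sum of the two components.
total = round(regression_loss + classification_loss, 4)
print(total)  # 1.6968, matching the printed "loss"
```

The same check can be applied to any per-step line; small last-digit discrepancies are display rounding, not a bug.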
- ETA: 2:04 - loss: 1.8816 - regression_loss: 1.5552 - classification_loss: 0.3264 [... per-step progress-bar output for steps 10–88 of epoch 74 omitted: loss declined from 1.8819 to 1.7011 (regression_loss 1.5549 → 1.4146, classification_loss 0.3270 → 0.2865) ...] 89/500 [====>.........................]
- ETA: 1:44 - loss: 1.7027 - regression_loss: 1.4159 - classification_loss: 0.2868 90/500 [====>.........................] - ETA: 1:43 - loss: 1.7039 - regression_loss: 1.4156 - classification_loss: 0.2883 91/500 [====>.........................] - ETA: 1:43 - loss: 1.7020 - regression_loss: 1.4143 - classification_loss: 0.2876 92/500 [====>.........................] - ETA: 1:43 - loss: 1.7103 - regression_loss: 1.4221 - classification_loss: 0.2882 93/500 [====>.........................] - ETA: 1:43 - loss: 1.7025 - regression_loss: 1.4157 - classification_loss: 0.2867 94/500 [====>.........................] - ETA: 1:42 - loss: 1.7096 - regression_loss: 1.4202 - classification_loss: 0.2894 95/500 [====>.........................] - ETA: 1:42 - loss: 1.7179 - regression_loss: 1.4266 - classification_loss: 0.2913 96/500 [====>.........................] - ETA: 1:42 - loss: 1.7236 - regression_loss: 1.4303 - classification_loss: 0.2933 97/500 [====>.........................] - ETA: 1:42 - loss: 1.7188 - regression_loss: 1.4265 - classification_loss: 0.2923 98/500 [====>.........................] - ETA: 1:41 - loss: 1.7138 - regression_loss: 1.4227 - classification_loss: 0.2910 99/500 [====>.........................] - ETA: 1:41 - loss: 1.7162 - regression_loss: 1.4241 - classification_loss: 0.2922 100/500 [=====>........................] - ETA: 1:41 - loss: 1.7171 - regression_loss: 1.4244 - classification_loss: 0.2927 101/500 [=====>........................] - ETA: 1:41 - loss: 1.7145 - regression_loss: 1.4221 - classification_loss: 0.2925 102/500 [=====>........................] - ETA: 1:40 - loss: 1.7179 - regression_loss: 1.4259 - classification_loss: 0.2920 103/500 [=====>........................] - ETA: 1:40 - loss: 1.7202 - regression_loss: 1.4246 - classification_loss: 0.2956 104/500 [=====>........................] - ETA: 1:40 - loss: 1.7214 - regression_loss: 1.4258 - classification_loss: 0.2955 105/500 [=====>........................] 
- ETA: 1:40 - loss: 1.7236 - regression_loss: 1.4276 - classification_loss: 0.2960 106/500 [=====>........................] - ETA: 1:39 - loss: 1.7225 - regression_loss: 1.4269 - classification_loss: 0.2955 107/500 [=====>........................] - ETA: 1:39 - loss: 1.7237 - regression_loss: 1.4267 - classification_loss: 0.2969 108/500 [=====>........................] - ETA: 1:39 - loss: 1.7277 - regression_loss: 1.4307 - classification_loss: 0.2970 109/500 [=====>........................] - ETA: 1:38 - loss: 1.7315 - regression_loss: 1.4339 - classification_loss: 0.2976 110/500 [=====>........................] - ETA: 1:38 - loss: 1.7370 - regression_loss: 1.4393 - classification_loss: 0.2977 111/500 [=====>........................] - ETA: 1:38 - loss: 1.7353 - regression_loss: 1.4382 - classification_loss: 0.2971 112/500 [=====>........................] - ETA: 1:37 - loss: 1.7365 - regression_loss: 1.4398 - classification_loss: 0.2968 113/500 [=====>........................] - ETA: 1:37 - loss: 1.7330 - regression_loss: 1.4371 - classification_loss: 0.2959 114/500 [=====>........................] - ETA: 1:37 - loss: 1.7302 - regression_loss: 1.4352 - classification_loss: 0.2951 115/500 [=====>........................] - ETA: 1:37 - loss: 1.7307 - regression_loss: 1.4359 - classification_loss: 0.2948 116/500 [=====>........................] - ETA: 1:36 - loss: 1.7265 - regression_loss: 1.4329 - classification_loss: 0.2936 117/500 [======>.......................] - ETA: 1:36 - loss: 1.7205 - regression_loss: 1.4286 - classification_loss: 0.2919 118/500 [======>.......................] - ETA: 1:36 - loss: 1.7225 - regression_loss: 1.4302 - classification_loss: 0.2924 119/500 [======>.......................] - ETA: 1:36 - loss: 1.7239 - regression_loss: 1.4312 - classification_loss: 0.2928 120/500 [======>.......................] - ETA: 1:35 - loss: 1.7266 - regression_loss: 1.4337 - classification_loss: 0.2929 121/500 [======>.......................] 
- ETA: 1:35 - loss: 1.7204 - regression_loss: 1.4274 - classification_loss: 0.2930 122/500 [======>.......................] - ETA: 1:35 - loss: 1.7127 - regression_loss: 1.4214 - classification_loss: 0.2914 123/500 [======>.......................] - ETA: 1:35 - loss: 1.7074 - regression_loss: 1.4177 - classification_loss: 0.2898 124/500 [======>.......................] - ETA: 1:34 - loss: 1.7080 - regression_loss: 1.4182 - classification_loss: 0.2897 125/500 [======>.......................] - ETA: 1:34 - loss: 1.7057 - regression_loss: 1.4163 - classification_loss: 0.2893 126/500 [======>.......................] - ETA: 1:34 - loss: 1.7054 - regression_loss: 1.4164 - classification_loss: 0.2889 127/500 [======>.......................] - ETA: 1:34 - loss: 1.7062 - regression_loss: 1.4170 - classification_loss: 0.2892 128/500 [======>.......................] - ETA: 1:33 - loss: 1.6989 - regression_loss: 1.4109 - classification_loss: 0.2880 129/500 [======>.......................] - ETA: 1:33 - loss: 1.6971 - regression_loss: 1.4102 - classification_loss: 0.2869 130/500 [======>.......................] - ETA: 1:33 - loss: 1.6988 - regression_loss: 1.4119 - classification_loss: 0.2869 131/500 [======>.......................] - ETA: 1:33 - loss: 1.6997 - regression_loss: 1.4132 - classification_loss: 0.2865 132/500 [======>.......................] - ETA: 1:32 - loss: 1.7007 - regression_loss: 1.4146 - classification_loss: 0.2861 133/500 [======>.......................] - ETA: 1:32 - loss: 1.7000 - regression_loss: 1.4148 - classification_loss: 0.2852 134/500 [=======>......................] - ETA: 1:32 - loss: 1.7013 - regression_loss: 1.4159 - classification_loss: 0.2854 135/500 [=======>......................] - ETA: 1:31 - loss: 1.7093 - regression_loss: 1.4221 - classification_loss: 0.2872 136/500 [=======>......................] - ETA: 1:31 - loss: 1.7027 - regression_loss: 1.4167 - classification_loss: 0.2861 137/500 [=======>......................] 
- ETA: 1:31 - loss: 1.7034 - regression_loss: 1.4177 - classification_loss: 0.2857 138/500 [=======>......................] - ETA: 1:31 - loss: 1.7035 - regression_loss: 1.4180 - classification_loss: 0.2855 139/500 [=======>......................] - ETA: 1:30 - loss: 1.7035 - regression_loss: 1.4176 - classification_loss: 0.2859 140/500 [=======>......................] - ETA: 1:30 - loss: 1.7089 - regression_loss: 1.4228 - classification_loss: 0.2860 141/500 [=======>......................] - ETA: 1:30 - loss: 1.7070 - regression_loss: 1.4215 - classification_loss: 0.2856 142/500 [=======>......................] - ETA: 1:30 - loss: 1.7084 - regression_loss: 1.4230 - classification_loss: 0.2855 143/500 [=======>......................] - ETA: 1:29 - loss: 1.7073 - regression_loss: 1.4224 - classification_loss: 0.2849 144/500 [=======>......................] - ETA: 1:29 - loss: 1.7044 - regression_loss: 1.4199 - classification_loss: 0.2845 145/500 [=======>......................] - ETA: 1:29 - loss: 1.7032 - regression_loss: 1.4191 - classification_loss: 0.2841 146/500 [=======>......................] - ETA: 1:29 - loss: 1.7036 - regression_loss: 1.4196 - classification_loss: 0.2840 147/500 [=======>......................] - ETA: 1:28 - loss: 1.7049 - regression_loss: 1.4205 - classification_loss: 0.2844 148/500 [=======>......................] - ETA: 1:28 - loss: 1.7046 - regression_loss: 1.4205 - classification_loss: 0.2842 149/500 [=======>......................] - ETA: 1:28 - loss: 1.7059 - regression_loss: 1.4214 - classification_loss: 0.2845 150/500 [========>.....................] - ETA: 1:28 - loss: 1.7011 - regression_loss: 1.4176 - classification_loss: 0.2835 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6991 - regression_loss: 1.4159 - classification_loss: 0.2832 152/500 [========>.....................] - ETA: 1:27 - loss: 1.7010 - regression_loss: 1.4176 - classification_loss: 0.2834 153/500 [========>.....................] 
- ETA: 1:27 - loss: 1.7020 - regression_loss: 1.4190 - classification_loss: 0.2830 154/500 [========>.....................] - ETA: 1:27 - loss: 1.7006 - regression_loss: 1.4177 - classification_loss: 0.2829 155/500 [========>.....................] - ETA: 1:26 - loss: 1.7001 - regression_loss: 1.4175 - classification_loss: 0.2827 156/500 [========>.....................] - ETA: 1:26 - loss: 1.6992 - regression_loss: 1.4167 - classification_loss: 0.2825 157/500 [========>.....................] - ETA: 1:26 - loss: 1.7021 - regression_loss: 1.4188 - classification_loss: 0.2833 158/500 [========>.....................] - ETA: 1:25 - loss: 1.7045 - regression_loss: 1.4206 - classification_loss: 0.2839 159/500 [========>.....................] - ETA: 1:25 - loss: 1.7027 - regression_loss: 1.4188 - classification_loss: 0.2839 160/500 [========>.....................] - ETA: 1:25 - loss: 1.7036 - regression_loss: 1.4200 - classification_loss: 0.2836 161/500 [========>.....................] - ETA: 1:25 - loss: 1.6991 - regression_loss: 1.4162 - classification_loss: 0.2830 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6959 - regression_loss: 1.4135 - classification_loss: 0.2824 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6940 - regression_loss: 1.4121 - classification_loss: 0.2819 164/500 [========>.....................] - ETA: 1:24 - loss: 1.6975 - regression_loss: 1.4152 - classification_loss: 0.2823 165/500 [========>.....................] - ETA: 1:24 - loss: 1.7011 - regression_loss: 1.4185 - classification_loss: 0.2826 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6971 - regression_loss: 1.4150 - classification_loss: 0.2821 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6987 - regression_loss: 1.4165 - classification_loss: 0.2822 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6915 - regression_loss: 1.4107 - classification_loss: 0.2808 169/500 [=========>....................] 
- ETA: 1:23 - loss: 1.6953 - regression_loss: 1.4134 - classification_loss: 0.2819 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6972 - regression_loss: 1.4149 - classification_loss: 0.2824 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6996 - regression_loss: 1.4167 - classification_loss: 0.2829 172/500 [=========>....................] - ETA: 1:22 - loss: 1.7022 - regression_loss: 1.4189 - classification_loss: 0.2833 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6973 - regression_loss: 1.4142 - classification_loss: 0.2832 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6997 - regression_loss: 1.4159 - classification_loss: 0.2838 175/500 [=========>....................] - ETA: 1:21 - loss: 1.7002 - regression_loss: 1.4163 - classification_loss: 0.2839 176/500 [=========>....................] - ETA: 1:21 - loss: 1.7051 - regression_loss: 1.4200 - classification_loss: 0.2851 177/500 [=========>....................] - ETA: 1:21 - loss: 1.7088 - regression_loss: 1.4235 - classification_loss: 0.2853 178/500 [=========>....................] - ETA: 1:20 - loss: 1.7110 - regression_loss: 1.4258 - classification_loss: 0.2852 179/500 [=========>....................] - ETA: 1:20 - loss: 1.7091 - regression_loss: 1.4244 - classification_loss: 0.2846 180/500 [=========>....................] - ETA: 1:20 - loss: 1.7073 - regression_loss: 1.4231 - classification_loss: 0.2841 181/500 [=========>....................] - ETA: 1:20 - loss: 1.7073 - regression_loss: 1.4232 - classification_loss: 0.2841 182/500 [=========>....................] - ETA: 1:19 - loss: 1.7085 - regression_loss: 1.4246 - classification_loss: 0.2839 183/500 [=========>....................] - ETA: 1:19 - loss: 1.7051 - regression_loss: 1.4220 - classification_loss: 0.2831 184/500 [==========>...................] - ETA: 1:19 - loss: 1.7040 - regression_loss: 1.4211 - classification_loss: 0.2829 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.7063 - regression_loss: 1.4230 - classification_loss: 0.2833 186/500 [==========>...................] - ETA: 1:18 - loss: 1.7088 - regression_loss: 1.4250 - classification_loss: 0.2838 187/500 [==========>...................] - ETA: 1:18 - loss: 1.7111 - regression_loss: 1.4272 - classification_loss: 0.2839 188/500 [==========>...................] - ETA: 1:18 - loss: 1.7121 - regression_loss: 1.4281 - classification_loss: 0.2840 189/500 [==========>...................] - ETA: 1:18 - loss: 1.7131 - regression_loss: 1.4287 - classification_loss: 0.2844 190/500 [==========>...................] - ETA: 1:17 - loss: 1.7139 - regression_loss: 1.4295 - classification_loss: 0.2844 191/500 [==========>...................] - ETA: 1:17 - loss: 1.7102 - regression_loss: 1.4264 - classification_loss: 0.2838 192/500 [==========>...................] - ETA: 1:17 - loss: 1.7077 - regression_loss: 1.4239 - classification_loss: 0.2838 193/500 [==========>...................] - ETA: 1:17 - loss: 1.7079 - regression_loss: 1.4243 - classification_loss: 0.2835 194/500 [==========>...................] - ETA: 1:16 - loss: 1.7077 - regression_loss: 1.4243 - classification_loss: 0.2834 195/500 [==========>...................] - ETA: 1:16 - loss: 1.7089 - regression_loss: 1.4250 - classification_loss: 0.2839 196/500 [==========>...................] - ETA: 1:16 - loss: 1.7103 - regression_loss: 1.4259 - classification_loss: 0.2843 197/500 [==========>...................] - ETA: 1:16 - loss: 1.7114 - regression_loss: 1.4265 - classification_loss: 0.2848 198/500 [==========>...................] - ETA: 1:15 - loss: 1.7083 - regression_loss: 1.4242 - classification_loss: 0.2842 199/500 [==========>...................] - ETA: 1:15 - loss: 1.7065 - regression_loss: 1.4232 - classification_loss: 0.2833 200/500 [===========>..................] - ETA: 1:15 - loss: 1.7034 - regression_loss: 1.4206 - classification_loss: 0.2828 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.7055 - regression_loss: 1.4220 - classification_loss: 0.2836 202/500 [===========>..................] - ETA: 1:14 - loss: 1.7054 - regression_loss: 1.4219 - classification_loss: 0.2835 203/500 [===========>..................] - ETA: 1:14 - loss: 1.7050 - regression_loss: 1.4216 - classification_loss: 0.2835 204/500 [===========>..................] - ETA: 1:14 - loss: 1.7056 - regression_loss: 1.4220 - classification_loss: 0.2837 205/500 [===========>..................] - ETA: 1:14 - loss: 1.7035 - regression_loss: 1.4205 - classification_loss: 0.2829 206/500 [===========>..................] - ETA: 1:13 - loss: 1.7053 - regression_loss: 1.4219 - classification_loss: 0.2834 207/500 [===========>..................] - ETA: 1:13 - loss: 1.7003 - regression_loss: 1.4174 - classification_loss: 0.2829 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6999 - regression_loss: 1.4171 - classification_loss: 0.2827 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6991 - regression_loss: 1.4167 - classification_loss: 0.2823 210/500 [===========>..................] - ETA: 1:12 - loss: 1.7007 - regression_loss: 1.4177 - classification_loss: 0.2830 211/500 [===========>..................] - ETA: 1:12 - loss: 1.7005 - regression_loss: 1.4174 - classification_loss: 0.2831 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6982 - regression_loss: 1.4155 - classification_loss: 0.2827 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6982 - regression_loss: 1.4156 - classification_loss: 0.2826 214/500 [===========>..................] - ETA: 1:11 - loss: 1.7000 - regression_loss: 1.4167 - classification_loss: 0.2834 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6998 - regression_loss: 1.4166 - classification_loss: 0.2833 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6966 - regression_loss: 1.4138 - classification_loss: 0.2828 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.6972 - regression_loss: 1.4143 - classification_loss: 0.2829 218/500 [============>.................] - ETA: 1:10 - loss: 1.6991 - regression_loss: 1.4159 - classification_loss: 0.2832 219/500 [============>.................] - ETA: 1:10 - loss: 1.6950 - regression_loss: 1.4126 - classification_loss: 0.2824 220/500 [============>.................] - ETA: 1:10 - loss: 1.6930 - regression_loss: 1.4107 - classification_loss: 0.2822 221/500 [============>.................] - ETA: 1:10 - loss: 1.6936 - regression_loss: 1.4109 - classification_loss: 0.2827 222/500 [============>.................] - ETA: 1:09 - loss: 1.6920 - regression_loss: 1.4094 - classification_loss: 0.2826 223/500 [============>.................] - ETA: 1:09 - loss: 1.6914 - regression_loss: 1.4089 - classification_loss: 0.2826 224/500 [============>.................] - ETA: 1:09 - loss: 1.6938 - regression_loss: 1.4106 - classification_loss: 0.2832 225/500 [============>.................] - ETA: 1:09 - loss: 1.6915 - regression_loss: 1.4086 - classification_loss: 0.2829 226/500 [============>.................] - ETA: 1:08 - loss: 1.6894 - regression_loss: 1.4067 - classification_loss: 0.2827 227/500 [============>.................] - ETA: 1:08 - loss: 1.6905 - regression_loss: 1.4075 - classification_loss: 0.2829 228/500 [============>.................] - ETA: 1:08 - loss: 1.6881 - regression_loss: 1.4057 - classification_loss: 0.2824 229/500 [============>.................] - ETA: 1:08 - loss: 1.6866 - regression_loss: 1.4044 - classification_loss: 0.2823 230/500 [============>.................] - ETA: 1:07 - loss: 1.6856 - regression_loss: 1.4037 - classification_loss: 0.2819 231/500 [============>.................] - ETA: 1:07 - loss: 1.6868 - regression_loss: 1.4046 - classification_loss: 0.2822 232/500 [============>.................] - ETA: 1:07 - loss: 1.6873 - regression_loss: 1.4048 - classification_loss: 0.2826 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.6910 - regression_loss: 1.4079 - classification_loss: 0.2831 234/500 [=============>................] - ETA: 1:06 - loss: 1.6922 - regression_loss: 1.4088 - classification_loss: 0.2834 235/500 [=============>................] - ETA: 1:06 - loss: 1.6928 - regression_loss: 1.4097 - classification_loss: 0.2831 236/500 [=============>................] - ETA: 1:06 - loss: 1.6930 - regression_loss: 1.4097 - classification_loss: 0.2834 237/500 [=============>................] - ETA: 1:06 - loss: 1.6927 - regression_loss: 1.4095 - classification_loss: 0.2831 238/500 [=============>................] - ETA: 1:05 - loss: 1.6928 - regression_loss: 1.4097 - classification_loss: 0.2831 239/500 [=============>................] - ETA: 1:05 - loss: 1.6944 - regression_loss: 1.4111 - classification_loss: 0.2834 240/500 [=============>................] - ETA: 1:05 - loss: 1.6940 - regression_loss: 1.4107 - classification_loss: 0.2833 241/500 [=============>................] - ETA: 1:05 - loss: 1.6953 - regression_loss: 1.4117 - classification_loss: 0.2836 242/500 [=============>................] - ETA: 1:04 - loss: 1.6977 - regression_loss: 1.4142 - classification_loss: 0.2835 243/500 [=============>................] - ETA: 1:04 - loss: 1.6990 - regression_loss: 1.4152 - classification_loss: 0.2837 244/500 [=============>................] - ETA: 1:04 - loss: 1.6954 - regression_loss: 1.4121 - classification_loss: 0.2832 245/500 [=============>................] - ETA: 1:04 - loss: 1.6964 - regression_loss: 1.4130 - classification_loss: 0.2834 246/500 [=============>................] - ETA: 1:03 - loss: 1.6972 - regression_loss: 1.4139 - classification_loss: 0.2833 247/500 [=============>................] - ETA: 1:03 - loss: 1.6953 - regression_loss: 1.4115 - classification_loss: 0.2838 248/500 [=============>................] - ETA: 1:03 - loss: 1.6969 - regression_loss: 1.4128 - classification_loss: 0.2842 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.6951 - regression_loss: 1.4112 - classification_loss: 0.2839 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6965 - regression_loss: 1.4124 - classification_loss: 0.2841 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6980 - regression_loss: 1.4124 - classification_loss: 0.2856 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6984 - regression_loss: 1.4130 - classification_loss: 0.2854 253/500 [==============>...............] - ETA: 1:02 - loss: 1.6984 - regression_loss: 1.4130 - classification_loss: 0.2855 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6997 - regression_loss: 1.4141 - classification_loss: 0.2856 255/500 [==============>...............] - ETA: 1:01 - loss: 1.7007 - regression_loss: 1.4152 - classification_loss: 0.2854 256/500 [==============>...............] - ETA: 1:01 - loss: 1.7007 - regression_loss: 1.4153 - classification_loss: 0.2853 257/500 [==============>...............] - ETA: 1:01 - loss: 1.7021 - regression_loss: 1.4163 - classification_loss: 0.2857 258/500 [==============>...............] - ETA: 1:00 - loss: 1.7024 - regression_loss: 1.4168 - classification_loss: 0.2857 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6996 - regression_loss: 1.4146 - classification_loss: 0.2850 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6997 - regression_loss: 1.4147 - classification_loss: 0.2850 261/500 [==============>...............] - ETA: 1:00 - loss: 1.7002 - regression_loss: 1.4153 - classification_loss: 0.2850 262/500 [==============>...............] - ETA: 59s - loss: 1.6994 - regression_loss: 1.4147 - classification_loss: 0.2847  263/500 [==============>...............] - ETA: 59s - loss: 1.6994 - regression_loss: 1.4147 - classification_loss: 0.2847 264/500 [==============>...............] - ETA: 59s - loss: 1.6998 - regression_loss: 1.4151 - classification_loss: 0.2848 265/500 [==============>...............] 
- ETA: 59s - loss: 1.6986 - regression_loss: 1.4140 - classification_loss: 0.2846 266/500 [==============>...............] - ETA: 58s - loss: 1.7001 - regression_loss: 1.4151 - classification_loss: 0.2850 267/500 [===============>..............] - ETA: 58s - loss: 1.6988 - regression_loss: 1.4143 - classification_loss: 0.2845 268/500 [===============>..............] - ETA: 58s - loss: 1.6950 - regression_loss: 1.4112 - classification_loss: 0.2838 269/500 [===============>..............] - ETA: 58s - loss: 1.6966 - regression_loss: 1.4121 - classification_loss: 0.2845 270/500 [===============>..............] - ETA: 57s - loss: 1.6951 - regression_loss: 1.4106 - classification_loss: 0.2845 271/500 [===============>..............] - ETA: 57s - loss: 1.6952 - regression_loss: 1.4106 - classification_loss: 0.2846 272/500 [===============>..............] - ETA: 57s - loss: 1.6944 - regression_loss: 1.4102 - classification_loss: 0.2842 273/500 [===============>..............] - ETA: 57s - loss: 1.6946 - regression_loss: 1.4105 - classification_loss: 0.2841 274/500 [===============>..............] - ETA: 56s - loss: 1.6913 - regression_loss: 1.4079 - classification_loss: 0.2834 275/500 [===============>..............] - ETA: 56s - loss: 1.6912 - regression_loss: 1.4080 - classification_loss: 0.2832 276/500 [===============>..............] - ETA: 56s - loss: 1.6913 - regression_loss: 1.4076 - classification_loss: 0.2837 277/500 [===============>..............] - ETA: 56s - loss: 1.6899 - regression_loss: 1.4067 - classification_loss: 0.2832 278/500 [===============>..............] - ETA: 55s - loss: 1.6901 - regression_loss: 1.4068 - classification_loss: 0.2833 279/500 [===============>..............] - ETA: 55s - loss: 1.6929 - regression_loss: 1.4093 - classification_loss: 0.2836 280/500 [===============>..............] - ETA: 55s - loss: 1.6933 - regression_loss: 1.4097 - classification_loss: 0.2836 281/500 [===============>..............] 
- ETA: 55s - loss: 1.6911 - regression_loss: 1.4077 - classification_loss: 0.2833 282/500 [===============>..............] - ETA: 54s - loss: 1.6895 - regression_loss: 1.4064 - classification_loss: 0.2831 283/500 [===============>..............] - ETA: 54s - loss: 1.6889 - regression_loss: 1.4060 - classification_loss: 0.2829 284/500 [================>.............] - ETA: 54s - loss: 1.6889 - regression_loss: 1.4060 - classification_loss: 0.2829 285/500 [================>.............] - ETA: 54s - loss: 1.6870 - regression_loss: 1.4046 - classification_loss: 0.2825 286/500 [================>.............] - ETA: 53s - loss: 1.6884 - regression_loss: 1.4057 - classification_loss: 0.2827 287/500 [================>.............] - ETA: 53s - loss: 1.6908 - regression_loss: 1.4076 - classification_loss: 0.2832 288/500 [================>.............] - ETA: 53s - loss: 1.6889 - regression_loss: 1.4059 - classification_loss: 0.2830 289/500 [================>.............] - ETA: 52s - loss: 1.6901 - regression_loss: 1.4069 - classification_loss: 0.2832 290/500 [================>.............] - ETA: 52s - loss: 1.6888 - regression_loss: 1.4058 - classification_loss: 0.2830 291/500 [================>.............] - ETA: 52s - loss: 1.6907 - regression_loss: 1.4075 - classification_loss: 0.2831 292/500 [================>.............] - ETA: 52s - loss: 1.6922 - regression_loss: 1.4088 - classification_loss: 0.2834 293/500 [================>.............] - ETA: 51s - loss: 1.6917 - regression_loss: 1.4084 - classification_loss: 0.2834 294/500 [================>.............] - ETA: 51s - loss: 1.6913 - regression_loss: 1.4080 - classification_loss: 0.2833 295/500 [================>.............] - ETA: 51s - loss: 1.6956 - regression_loss: 1.4111 - classification_loss: 0.2844 296/500 [================>.............] - ETA: 51s - loss: 1.6926 - regression_loss: 1.4086 - classification_loss: 0.2839 297/500 [================>.............] 
[per-step progress updates for epoch 74, steps 298-499, elided; loss hovered around 1.68 (regression_loss ~1.40, classification_loss ~0.28) for the remainder of the epoch]
500/500 [==============================] - 126s 251ms/step - loss: 1.6798 - regression_loss: 1.3969 - classification_loss: 0.2829
1172 instances of class plum with average precision: 0.6277
mAP: 0.6277
Epoch 00074: saving model to ./training/snapshots/resnet50_pascal_74.h5
Epoch 75/150
[per-step progress updates for epoch 75, steps 1-131, elided; after early fluctuation the running loss settled near 1.63 (regression_loss ~1.35, classification_loss ~0.28)]
- ETA: 1:32 - loss: 1.6273 - regression_loss: 1.3527 - classification_loss: 0.2746 133/500 [======>.......................] - ETA: 1:32 - loss: 1.6235 - regression_loss: 1.3498 - classification_loss: 0.2737 134/500 [=======>......................] - ETA: 1:31 - loss: 1.6278 - regression_loss: 1.3530 - classification_loss: 0.2748 135/500 [=======>......................] - ETA: 1:31 - loss: 1.6376 - regression_loss: 1.3598 - classification_loss: 0.2778 136/500 [=======>......................] - ETA: 1:31 - loss: 1.6382 - regression_loss: 1.3602 - classification_loss: 0.2779 137/500 [=======>......................] - ETA: 1:30 - loss: 1.6379 - regression_loss: 1.3603 - classification_loss: 0.2777 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6435 - regression_loss: 1.3641 - classification_loss: 0.2794 139/500 [=======>......................] - ETA: 1:30 - loss: 1.6457 - regression_loss: 1.3663 - classification_loss: 0.2794 140/500 [=======>......................] - ETA: 1:30 - loss: 1.6516 - regression_loss: 1.3716 - classification_loss: 0.2800 141/500 [=======>......................] - ETA: 1:29 - loss: 1.6538 - regression_loss: 1.3739 - classification_loss: 0.2800 142/500 [=======>......................] - ETA: 1:29 - loss: 1.6543 - regression_loss: 1.3741 - classification_loss: 0.2801 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6571 - regression_loss: 1.3761 - classification_loss: 0.2810 144/500 [=======>......................] - ETA: 1:29 - loss: 1.6546 - regression_loss: 1.3741 - classification_loss: 0.2805 145/500 [=======>......................] - ETA: 1:28 - loss: 1.6524 - regression_loss: 1.3726 - classification_loss: 0.2798 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6505 - regression_loss: 1.3702 - classification_loss: 0.2803 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6523 - regression_loss: 1.3716 - classification_loss: 0.2806 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.6494 - regression_loss: 1.3695 - classification_loss: 0.2800 149/500 [=======>......................] - ETA: 1:27 - loss: 1.6516 - regression_loss: 1.3717 - classification_loss: 0.2799 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6539 - regression_loss: 1.3734 - classification_loss: 0.2804 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6448 - regression_loss: 1.3658 - classification_loss: 0.2790 152/500 [========>.....................] - ETA: 1:27 - loss: 1.6466 - regression_loss: 1.3671 - classification_loss: 0.2795 153/500 [========>.....................] - ETA: 1:26 - loss: 1.6508 - regression_loss: 1.3705 - classification_loss: 0.2803 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6553 - regression_loss: 1.3739 - classification_loss: 0.2814 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6577 - regression_loss: 1.3757 - classification_loss: 0.2820 156/500 [========>.....................] - ETA: 1:26 - loss: 1.6555 - regression_loss: 1.3740 - classification_loss: 0.2815 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6586 - regression_loss: 1.3754 - classification_loss: 0.2832 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6594 - regression_loss: 1.3763 - classification_loss: 0.2832 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6615 - regression_loss: 1.3781 - classification_loss: 0.2835 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6584 - regression_loss: 1.3758 - classification_loss: 0.2826 161/500 [========>.....................] - ETA: 1:25 - loss: 1.6604 - regression_loss: 1.3774 - classification_loss: 0.2829 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6619 - regression_loss: 1.3781 - classification_loss: 0.2838 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6614 - regression_loss: 1.3779 - classification_loss: 0.2836 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.6623 - regression_loss: 1.3786 - classification_loss: 0.2837 165/500 [========>.....................] - ETA: 1:24 - loss: 1.6585 - regression_loss: 1.3756 - classification_loss: 0.2829 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6546 - regression_loss: 1.3727 - classification_loss: 0.2819 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6515 - regression_loss: 1.3701 - classification_loss: 0.2813 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6516 - regression_loss: 1.3697 - classification_loss: 0.2820 169/500 [=========>....................] - ETA: 1:23 - loss: 1.6499 - regression_loss: 1.3682 - classification_loss: 0.2817 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6451 - regression_loss: 1.3641 - classification_loss: 0.2810 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6463 - regression_loss: 1.3654 - classification_loss: 0.2809 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6471 - regression_loss: 1.3661 - classification_loss: 0.2809 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6494 - regression_loss: 1.3678 - classification_loss: 0.2816 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6498 - regression_loss: 1.3689 - classification_loss: 0.2809 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6502 - regression_loss: 1.3693 - classification_loss: 0.2810 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6528 - regression_loss: 1.3710 - classification_loss: 0.2818 177/500 [=========>....................] - ETA: 1:21 - loss: 1.6509 - regression_loss: 1.3696 - classification_loss: 0.2813 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6494 - regression_loss: 1.3686 - classification_loss: 0.2808 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6432 - regression_loss: 1.3637 - classification_loss: 0.2795 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.6445 - regression_loss: 1.3646 - classification_loss: 0.2798 181/500 [=========>....................] - ETA: 1:20 - loss: 1.6410 - regression_loss: 1.3617 - classification_loss: 0.2793 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6425 - regression_loss: 1.3629 - classification_loss: 0.2795 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6433 - regression_loss: 1.3636 - classification_loss: 0.2797 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6420 - regression_loss: 1.3624 - classification_loss: 0.2797 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6425 - regression_loss: 1.3631 - classification_loss: 0.2794 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6431 - regression_loss: 1.3638 - classification_loss: 0.2793 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6448 - regression_loss: 1.3650 - classification_loss: 0.2797 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6439 - regression_loss: 1.3646 - classification_loss: 0.2793 189/500 [==========>...................] - ETA: 1:18 - loss: 1.6495 - regression_loss: 1.3668 - classification_loss: 0.2827 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6446 - regression_loss: 1.3629 - classification_loss: 0.2816 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6474 - regression_loss: 1.3652 - classification_loss: 0.2822 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6500 - regression_loss: 1.3676 - classification_loss: 0.2824 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6525 - regression_loss: 1.3698 - classification_loss: 0.2828 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6550 - regression_loss: 1.3717 - classification_loss: 0.2833 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6511 - regression_loss: 1.3689 - classification_loss: 0.2823 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.6520 - regression_loss: 1.3696 - classification_loss: 0.2823 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6510 - regression_loss: 1.3687 - classification_loss: 0.2823 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6532 - regression_loss: 1.3709 - classification_loss: 0.2824 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6562 - regression_loss: 1.3727 - classification_loss: 0.2835 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6578 - regression_loss: 1.3740 - classification_loss: 0.2838 201/500 [===========>..................] - ETA: 1:15 - loss: 1.6584 - regression_loss: 1.3745 - classification_loss: 0.2839 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6583 - regression_loss: 1.3747 - classification_loss: 0.2837 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6542 - regression_loss: 1.3713 - classification_loss: 0.2829 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6570 - regression_loss: 1.3738 - classification_loss: 0.2832 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6581 - regression_loss: 1.3749 - classification_loss: 0.2832 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6558 - regression_loss: 1.3734 - classification_loss: 0.2824 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6517 - regression_loss: 1.3700 - classification_loss: 0.2817 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6536 - regression_loss: 1.3713 - classification_loss: 0.2823 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6510 - regression_loss: 1.3689 - classification_loss: 0.2821 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6500 - regression_loss: 1.3679 - classification_loss: 0.2821 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6506 - regression_loss: 1.3687 - classification_loss: 0.2819 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.6505 - regression_loss: 1.3687 - classification_loss: 0.2819 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6511 - regression_loss: 1.3692 - classification_loss: 0.2819 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6506 - regression_loss: 1.3689 - classification_loss: 0.2817 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6473 - regression_loss: 1.3663 - classification_loss: 0.2810 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6488 - regression_loss: 1.3672 - classification_loss: 0.2816 217/500 [============>.................] - ETA: 1:11 - loss: 1.6444 - regression_loss: 1.3636 - classification_loss: 0.2808 218/500 [============>.................] - ETA: 1:10 - loss: 1.6440 - regression_loss: 1.3632 - classification_loss: 0.2808 219/500 [============>.................] - ETA: 1:10 - loss: 1.6470 - regression_loss: 1.3653 - classification_loss: 0.2817 220/500 [============>.................] - ETA: 1:10 - loss: 1.6498 - regression_loss: 1.3664 - classification_loss: 0.2834 221/500 [============>.................] - ETA: 1:10 - loss: 1.6493 - regression_loss: 1.3653 - classification_loss: 0.2841 222/500 [============>.................] - ETA: 1:09 - loss: 1.6504 - regression_loss: 1.3660 - classification_loss: 0.2844 223/500 [============>.................] - ETA: 1:09 - loss: 1.6495 - regression_loss: 1.3654 - classification_loss: 0.2841 224/500 [============>.................] - ETA: 1:09 - loss: 1.6459 - regression_loss: 1.3625 - classification_loss: 0.2834 225/500 [============>.................] - ETA: 1:09 - loss: 1.6457 - regression_loss: 1.3624 - classification_loss: 0.2833 226/500 [============>.................] - ETA: 1:08 - loss: 1.6460 - regression_loss: 1.3628 - classification_loss: 0.2833 227/500 [============>.................] - ETA: 1:08 - loss: 1.6435 - regression_loss: 1.3608 - classification_loss: 0.2827 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.6454 - regression_loss: 1.3623 - classification_loss: 0.2831 229/500 [============>.................] - ETA: 1:08 - loss: 1.6453 - regression_loss: 1.3625 - classification_loss: 0.2827 230/500 [============>.................] - ETA: 1:07 - loss: 1.6431 - regression_loss: 1.3605 - classification_loss: 0.2826 231/500 [============>.................] - ETA: 1:07 - loss: 1.6428 - regression_loss: 1.3600 - classification_loss: 0.2828 232/500 [============>.................] - ETA: 1:07 - loss: 1.6502 - regression_loss: 1.3660 - classification_loss: 0.2842 233/500 [============>.................] - ETA: 1:07 - loss: 1.6493 - regression_loss: 1.3652 - classification_loss: 0.2841 234/500 [=============>................] - ETA: 1:06 - loss: 1.6472 - regression_loss: 1.3636 - classification_loss: 0.2836 235/500 [=============>................] - ETA: 1:06 - loss: 1.6477 - regression_loss: 1.3639 - classification_loss: 0.2838 236/500 [=============>................] - ETA: 1:06 - loss: 1.6461 - regression_loss: 1.3631 - classification_loss: 0.2831 237/500 [=============>................] - ETA: 1:06 - loss: 1.6463 - regression_loss: 1.3634 - classification_loss: 0.2829 238/500 [=============>................] - ETA: 1:05 - loss: 1.6455 - regression_loss: 1.3628 - classification_loss: 0.2827 239/500 [=============>................] - ETA: 1:05 - loss: 1.6468 - regression_loss: 1.3624 - classification_loss: 0.2844 240/500 [=============>................] - ETA: 1:05 - loss: 1.6477 - regression_loss: 1.3629 - classification_loss: 0.2849 241/500 [=============>................] - ETA: 1:05 - loss: 1.6478 - regression_loss: 1.3631 - classification_loss: 0.2846 242/500 [=============>................] - ETA: 1:04 - loss: 1.6484 - regression_loss: 1.3636 - classification_loss: 0.2847 243/500 [=============>................] - ETA: 1:04 - loss: 1.6484 - regression_loss: 1.3637 - classification_loss: 0.2846 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.6497 - regression_loss: 1.3650 - classification_loss: 0.2848 245/500 [=============>................] - ETA: 1:04 - loss: 1.6495 - regression_loss: 1.3646 - classification_loss: 0.2849 246/500 [=============>................] - ETA: 1:03 - loss: 1.6510 - regression_loss: 1.3651 - classification_loss: 0.2859 247/500 [=============>................] - ETA: 1:03 - loss: 1.6522 - regression_loss: 1.3662 - classification_loss: 0.2860 248/500 [=============>................] - ETA: 1:03 - loss: 1.6525 - regression_loss: 1.3668 - classification_loss: 0.2857 249/500 [=============>................] - ETA: 1:03 - loss: 1.6502 - regression_loss: 1.3651 - classification_loss: 0.2852 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6507 - regression_loss: 1.3657 - classification_loss: 0.2850 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6513 - regression_loss: 1.3661 - classification_loss: 0.2852 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6487 - regression_loss: 1.3641 - classification_loss: 0.2846 253/500 [==============>...............] - ETA: 1:02 - loss: 1.6507 - regression_loss: 1.3653 - classification_loss: 0.2854 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6522 - regression_loss: 1.3664 - classification_loss: 0.2858 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6513 - regression_loss: 1.3659 - classification_loss: 0.2854 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6520 - regression_loss: 1.3665 - classification_loss: 0.2856 257/500 [==============>...............] - ETA: 1:01 - loss: 1.6527 - regression_loss: 1.3672 - classification_loss: 0.2855 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6543 - regression_loss: 1.3682 - classification_loss: 0.2861 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6531 - regression_loss: 1.3674 - classification_loss: 0.2857 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.6544 - regression_loss: 1.3687 - classification_loss: 0.2857 261/500 [==============>...............] - ETA: 1:00 - loss: 1.6518 - regression_loss: 1.3668 - classification_loss: 0.2850 262/500 [==============>...............] - ETA: 59s - loss: 1.6531 - regression_loss: 1.3678 - classification_loss: 0.2854  263/500 [==============>...............] - ETA: 59s - loss: 1.6499 - regression_loss: 1.3652 - classification_loss: 0.2846 264/500 [==============>...............] - ETA: 59s - loss: 1.6500 - regression_loss: 1.3651 - classification_loss: 0.2849 265/500 [==============>...............] - ETA: 59s - loss: 1.6506 - regression_loss: 1.3657 - classification_loss: 0.2849 266/500 [==============>...............] - ETA: 58s - loss: 1.6500 - regression_loss: 1.3654 - classification_loss: 0.2846 267/500 [===============>..............] - ETA: 58s - loss: 1.6458 - regression_loss: 1.3620 - classification_loss: 0.2838 268/500 [===============>..............] - ETA: 58s - loss: 1.6469 - regression_loss: 1.3630 - classification_loss: 0.2838 269/500 [===============>..............] - ETA: 58s - loss: 1.6483 - regression_loss: 1.3641 - classification_loss: 0.2842 270/500 [===============>..............] - ETA: 57s - loss: 1.6482 - regression_loss: 1.3638 - classification_loss: 0.2844 271/500 [===============>..............] - ETA: 57s - loss: 1.6473 - regression_loss: 1.3632 - classification_loss: 0.2841 272/500 [===============>..............] - ETA: 57s - loss: 1.6475 - regression_loss: 1.3634 - classification_loss: 0.2841 273/500 [===============>..............] - ETA: 57s - loss: 1.6492 - regression_loss: 1.3651 - classification_loss: 0.2840 274/500 [===============>..............] - ETA: 56s - loss: 1.6489 - regression_loss: 1.3650 - classification_loss: 0.2839 275/500 [===============>..............] - ETA: 56s - loss: 1.6518 - regression_loss: 1.3673 - classification_loss: 0.2845 276/500 [===============>..............] 
- ETA: 56s - loss: 1.6522 - regression_loss: 1.3676 - classification_loss: 0.2846 277/500 [===============>..............] - ETA: 56s - loss: 1.6495 - regression_loss: 1.3653 - classification_loss: 0.2841 278/500 [===============>..............] - ETA: 55s - loss: 1.6504 - regression_loss: 1.3658 - classification_loss: 0.2847 279/500 [===============>..............] - ETA: 55s - loss: 1.6491 - regression_loss: 1.3647 - classification_loss: 0.2844 280/500 [===============>..............] - ETA: 55s - loss: 1.6513 - regression_loss: 1.3662 - classification_loss: 0.2851 281/500 [===============>..............] - ETA: 55s - loss: 1.6515 - regression_loss: 1.3664 - classification_loss: 0.2851 282/500 [===============>..............] - ETA: 54s - loss: 1.6498 - regression_loss: 1.3650 - classification_loss: 0.2848 283/500 [===============>..............] - ETA: 54s - loss: 1.6510 - regression_loss: 1.3660 - classification_loss: 0.2850 284/500 [================>.............] - ETA: 54s - loss: 1.6531 - regression_loss: 1.3680 - classification_loss: 0.2851 285/500 [================>.............] - ETA: 54s - loss: 1.6549 - regression_loss: 1.3692 - classification_loss: 0.2857 286/500 [================>.............] - ETA: 53s - loss: 1.6542 - regression_loss: 1.3685 - classification_loss: 0.2857 287/500 [================>.............] - ETA: 53s - loss: 1.6542 - regression_loss: 1.3686 - classification_loss: 0.2856 288/500 [================>.............] - ETA: 53s - loss: 1.6569 - regression_loss: 1.3710 - classification_loss: 0.2859 289/500 [================>.............] - ETA: 53s - loss: 1.6595 - regression_loss: 1.3731 - classification_loss: 0.2864 290/500 [================>.............] - ETA: 52s - loss: 1.6604 - regression_loss: 1.3737 - classification_loss: 0.2867 291/500 [================>.............] - ETA: 52s - loss: 1.6594 - regression_loss: 1.3730 - classification_loss: 0.2864 292/500 [================>.............] 
- ETA: 52s - loss: 1.6607 - regression_loss: 1.3744 - classification_loss: 0.2863 293/500 [================>.............] - ETA: 52s - loss: 1.6614 - regression_loss: 1.3751 - classification_loss: 0.2863 294/500 [================>.............] - ETA: 51s - loss: 1.6588 - regression_loss: 1.3730 - classification_loss: 0.2859 295/500 [================>.............] - ETA: 51s - loss: 1.6620 - regression_loss: 1.3753 - classification_loss: 0.2867 296/500 [================>.............] - ETA: 51s - loss: 1.6607 - regression_loss: 1.3741 - classification_loss: 0.2866 297/500 [================>.............] - ETA: 51s - loss: 1.6605 - regression_loss: 1.3739 - classification_loss: 0.2866 298/500 [================>.............] - ETA: 50s - loss: 1.6592 - regression_loss: 1.3727 - classification_loss: 0.2865 299/500 [================>.............] - ETA: 50s - loss: 1.6604 - regression_loss: 1.3738 - classification_loss: 0.2866 300/500 [=================>............] - ETA: 50s - loss: 1.6600 - regression_loss: 1.3736 - classification_loss: 0.2864 301/500 [=================>............] - ETA: 49s - loss: 1.6613 - regression_loss: 1.3748 - classification_loss: 0.2864 302/500 [=================>............] - ETA: 49s - loss: 1.6623 - regression_loss: 1.3758 - classification_loss: 0.2865 303/500 [=================>............] - ETA: 49s - loss: 1.6640 - regression_loss: 1.3773 - classification_loss: 0.2867 304/500 [=================>............] - ETA: 49s - loss: 1.6636 - regression_loss: 1.3771 - classification_loss: 0.2865 305/500 [=================>............] - ETA: 48s - loss: 1.6645 - regression_loss: 1.3780 - classification_loss: 0.2865 306/500 [=================>............] - ETA: 48s - loss: 1.6665 - regression_loss: 1.3791 - classification_loss: 0.2874 307/500 [=================>............] - ETA: 48s - loss: 1.6685 - regression_loss: 1.3807 - classification_loss: 0.2878 308/500 [=================>............] 
- ETA: 48s - loss: 1.6679 - regression_loss: 1.3804 - classification_loss: 0.2875 309/500 [=================>............] - ETA: 47s - loss: 1.6679 - regression_loss: 1.3805 - classification_loss: 0.2874 310/500 [=================>............] - ETA: 47s - loss: 1.6674 - regression_loss: 1.3803 - classification_loss: 0.2872 311/500 [=================>............] - ETA: 47s - loss: 1.6668 - regression_loss: 1.3798 - classification_loss: 0.2870 312/500 [=================>............] - ETA: 47s - loss: 1.6676 - regression_loss: 1.3803 - classification_loss: 0.2873 313/500 [=================>............] - ETA: 46s - loss: 1.6658 - regression_loss: 1.3788 - classification_loss: 0.2870 314/500 [=================>............] - ETA: 46s - loss: 1.6661 - regression_loss: 1.3791 - classification_loss: 0.2870 315/500 [=================>............] - ETA: 46s - loss: 1.6688 - regression_loss: 1.3812 - classification_loss: 0.2876 316/500 [=================>............] - ETA: 46s - loss: 1.6681 - regression_loss: 1.3808 - classification_loss: 0.2874 317/500 [==================>...........] - ETA: 45s - loss: 1.6679 - regression_loss: 1.3804 - classification_loss: 0.2875 318/500 [==================>...........] - ETA: 45s - loss: 1.6655 - regression_loss: 1.3786 - classification_loss: 0.2869 319/500 [==================>...........] - ETA: 45s - loss: 1.6651 - regression_loss: 1.3784 - classification_loss: 0.2867 320/500 [==================>...........] - ETA: 45s - loss: 1.6640 - regression_loss: 1.3775 - classification_loss: 0.2865 321/500 [==================>...........] - ETA: 44s - loss: 1.6648 - regression_loss: 1.3781 - classification_loss: 0.2866 322/500 [==================>...........] - ETA: 44s - loss: 1.6654 - regression_loss: 1.3787 - classification_loss: 0.2867 323/500 [==================>...........] - ETA: 44s - loss: 1.6679 - regression_loss: 1.3807 - classification_loss: 0.2872 324/500 [==================>...........] 
- ETA: 44s - loss: 1.6691 - regression_loss: 1.3817 - classification_loss: 0.2874 325/500 [==================>...........] - ETA: 43s - loss: 1.6669 - regression_loss: 1.3798 - classification_loss: 0.2871 326/500 [==================>...........] - ETA: 43s - loss: 1.6679 - regression_loss: 1.3807 - classification_loss: 0.2872 327/500 [==================>...........] - ETA: 43s - loss: 1.6673 - regression_loss: 1.3802 - classification_loss: 0.2870 328/500 [==================>...........] - ETA: 43s - loss: 1.6685 - regression_loss: 1.3812 - classification_loss: 0.2873 329/500 [==================>...........] - ETA: 42s - loss: 1.6707 - regression_loss: 1.3831 - classification_loss: 0.2876 330/500 [==================>...........] - ETA: 42s - loss: 1.6709 - regression_loss: 1.3832 - classification_loss: 0.2877 331/500 [==================>...........] - ETA: 42s - loss: 1.6693 - regression_loss: 1.3821 - classification_loss: 0.2872 332/500 [==================>...........] - ETA: 42s - loss: 1.6703 - regression_loss: 1.3827 - classification_loss: 0.2876 333/500 [==================>...........] - ETA: 41s - loss: 1.6693 - regression_loss: 1.3819 - classification_loss: 0.2873 334/500 [===================>..........] - ETA: 41s - loss: 1.6714 - regression_loss: 1.3836 - classification_loss: 0.2878 335/500 [===================>..........] - ETA: 41s - loss: 1.6717 - regression_loss: 1.3834 - classification_loss: 0.2883 336/500 [===================>..........] - ETA: 41s - loss: 1.6722 - regression_loss: 1.3839 - classification_loss: 0.2883 337/500 [===================>..........] - ETA: 40s - loss: 1.6732 - regression_loss: 1.3849 - classification_loss: 0.2884 338/500 [===================>..........] - ETA: 40s - loss: 1.6730 - regression_loss: 1.3847 - classification_loss: 0.2883 339/500 [===================>..........] - ETA: 40s - loss: 1.6726 - regression_loss: 1.3843 - classification_loss: 0.2883 340/500 [===================>..........] 
[per-step progress output for Epoch 75 (steps 341-500) trimmed]
500/500 [==============================] - 125s 251ms/step - loss: 1.6723 - regression_loss: 1.3858 - classification_loss: 0.2864
1172 instances of class plum with average precision: 0.5988
mAP: 0.5988
Epoch 00075: saving model to ./training/snapshots/resnet50_pascal_75.h5
Epoch 76/150
[per-step progress output for Epoch 76 (steps 1-174, truncated) trimmed]
- ETA: 1:26 - loss: 1.6688 - regression_loss: 1.3775 - classification_loss: 0.2913 159/500 [========>.....................] - ETA: 1:26 - loss: 1.6722 - regression_loss: 1.3797 - classification_loss: 0.2925 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6737 - regression_loss: 1.3816 - classification_loss: 0.2921 161/500 [========>.....................] - ETA: 1:25 - loss: 1.6756 - regression_loss: 1.3840 - classification_loss: 0.2915 162/500 [========>.....................] - ETA: 1:25 - loss: 1.6802 - regression_loss: 1.3875 - classification_loss: 0.2927 163/500 [========>.....................] - ETA: 1:25 - loss: 1.6781 - regression_loss: 1.3859 - classification_loss: 0.2922 164/500 [========>.....................] - ETA: 1:24 - loss: 1.6786 - regression_loss: 1.3864 - classification_loss: 0.2922 165/500 [========>.....................] - ETA: 1:24 - loss: 1.6786 - regression_loss: 1.3866 - classification_loss: 0.2919 166/500 [========>.....................] - ETA: 1:24 - loss: 1.6785 - regression_loss: 1.3867 - classification_loss: 0.2918 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6761 - regression_loss: 1.3850 - classification_loss: 0.2912 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6705 - regression_loss: 1.3804 - classification_loss: 0.2901 169/500 [=========>....................] - ETA: 1:23 - loss: 1.6728 - regression_loss: 1.3822 - classification_loss: 0.2906 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6760 - regression_loss: 1.3851 - classification_loss: 0.2909 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6750 - regression_loss: 1.3845 - classification_loss: 0.2905 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6774 - regression_loss: 1.3872 - classification_loss: 0.2902 173/500 [=========>....................] - ETA: 1:22 - loss: 1.6724 - regression_loss: 1.3824 - classification_loss: 0.2900 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.6705 - regression_loss: 1.3810 - classification_loss: 0.2895 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6711 - regression_loss: 1.3813 - classification_loss: 0.2897 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6655 - regression_loss: 1.3770 - classification_loss: 0.2885 177/500 [=========>....................] - ETA: 1:21 - loss: 1.6656 - regression_loss: 1.3774 - classification_loss: 0.2882 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6681 - regression_loss: 1.3796 - classification_loss: 0.2885 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6671 - regression_loss: 1.3789 - classification_loss: 0.2882 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6680 - regression_loss: 1.3797 - classification_loss: 0.2882 181/500 [=========>....................] - ETA: 1:20 - loss: 1.6655 - regression_loss: 1.3779 - classification_loss: 0.2876 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6649 - regression_loss: 1.3777 - classification_loss: 0.2872 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6632 - regression_loss: 1.3765 - classification_loss: 0.2867 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6663 - regression_loss: 1.3792 - classification_loss: 0.2872 185/500 [==========>...................] - ETA: 1:19 - loss: 1.6634 - regression_loss: 1.3770 - classification_loss: 0.2864 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6606 - regression_loss: 1.3749 - classification_loss: 0.2858 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6606 - regression_loss: 1.3744 - classification_loss: 0.2862 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6623 - regression_loss: 1.3757 - classification_loss: 0.2866 189/500 [==========>...................] - ETA: 1:18 - loss: 1.6642 - regression_loss: 1.3776 - classification_loss: 0.2866 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.6625 - regression_loss: 1.3764 - classification_loss: 0.2860 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6612 - regression_loss: 1.3756 - classification_loss: 0.2857 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6602 - regression_loss: 1.3750 - classification_loss: 0.2852 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6612 - regression_loss: 1.3760 - classification_loss: 0.2852 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6629 - regression_loss: 1.3773 - classification_loss: 0.2855 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6641 - regression_loss: 1.3787 - classification_loss: 0.2854 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6701 - regression_loss: 1.3836 - classification_loss: 0.2865 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6698 - regression_loss: 1.3834 - classification_loss: 0.2864 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6701 - regression_loss: 1.3837 - classification_loss: 0.2864 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6657 - regression_loss: 1.3801 - classification_loss: 0.2856 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6674 - regression_loss: 1.3817 - classification_loss: 0.2857 201/500 [===========>..................] - ETA: 1:15 - loss: 1.6661 - regression_loss: 1.3807 - classification_loss: 0.2854 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6666 - regression_loss: 1.3809 - classification_loss: 0.2857 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6683 - regression_loss: 1.3821 - classification_loss: 0.2861 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6632 - regression_loss: 1.3782 - classification_loss: 0.2851 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6646 - regression_loss: 1.3792 - classification_loss: 0.2853 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.6658 - regression_loss: 1.3807 - classification_loss: 0.2851 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6678 - regression_loss: 1.3823 - classification_loss: 0.2855 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6694 - regression_loss: 1.3837 - classification_loss: 0.2858 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6658 - regression_loss: 1.3809 - classification_loss: 0.2849 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6653 - regression_loss: 1.3807 - classification_loss: 0.2847 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6659 - regression_loss: 1.3812 - classification_loss: 0.2847 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6694 - regression_loss: 1.3841 - classification_loss: 0.2854 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6647 - regression_loss: 1.3803 - classification_loss: 0.2843 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6658 - regression_loss: 1.3811 - classification_loss: 0.2847 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6652 - regression_loss: 1.3806 - classification_loss: 0.2846 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6656 - regression_loss: 1.3810 - classification_loss: 0.2846 217/500 [============>.................] - ETA: 1:11 - loss: 1.6653 - regression_loss: 1.3806 - classification_loss: 0.2847 218/500 [============>.................] - ETA: 1:10 - loss: 1.6661 - regression_loss: 1.3813 - classification_loss: 0.2848 219/500 [============>.................] - ETA: 1:10 - loss: 1.6673 - regression_loss: 1.3822 - classification_loss: 0.2851 220/500 [============>.................] - ETA: 1:10 - loss: 1.6667 - regression_loss: 1.3818 - classification_loss: 0.2849 221/500 [============>.................] - ETA: 1:10 - loss: 1.6705 - regression_loss: 1.3847 - classification_loss: 0.2858 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.6688 - regression_loss: 1.3835 - classification_loss: 0.2853 223/500 [============>.................] - ETA: 1:09 - loss: 1.6683 - regression_loss: 1.3832 - classification_loss: 0.2851 224/500 [============>.................] - ETA: 1:09 - loss: 1.6688 - regression_loss: 1.3836 - classification_loss: 0.2851 225/500 [============>.................] - ETA: 1:09 - loss: 1.6644 - regression_loss: 1.3801 - classification_loss: 0.2843 226/500 [============>.................] - ETA: 1:08 - loss: 1.6660 - regression_loss: 1.3816 - classification_loss: 0.2845 227/500 [============>.................] - ETA: 1:08 - loss: 1.6620 - regression_loss: 1.3778 - classification_loss: 0.2842 228/500 [============>.................] - ETA: 1:08 - loss: 1.6618 - regression_loss: 1.3783 - classification_loss: 0.2835 229/500 [============>.................] - ETA: 1:08 - loss: 1.6624 - regression_loss: 1.3790 - classification_loss: 0.2834 230/500 [============>.................] - ETA: 1:07 - loss: 1.6644 - regression_loss: 1.3810 - classification_loss: 0.2835 231/500 [============>.................] - ETA: 1:07 - loss: 1.6648 - regression_loss: 1.3812 - classification_loss: 0.2835 232/500 [============>.................] - ETA: 1:07 - loss: 1.6670 - regression_loss: 1.3835 - classification_loss: 0.2835 233/500 [============>.................] - ETA: 1:07 - loss: 1.6690 - regression_loss: 1.3851 - classification_loss: 0.2839 234/500 [=============>................] - ETA: 1:06 - loss: 1.6705 - regression_loss: 1.3863 - classification_loss: 0.2841 235/500 [=============>................] - ETA: 1:06 - loss: 1.6688 - regression_loss: 1.3852 - classification_loss: 0.2836 236/500 [=============>................] - ETA: 1:06 - loss: 1.6688 - regression_loss: 1.3854 - classification_loss: 0.2834 237/500 [=============>................] - ETA: 1:06 - loss: 1.6674 - regression_loss: 1.3845 - classification_loss: 0.2829 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.6674 - regression_loss: 1.3844 - classification_loss: 0.2830 239/500 [=============>................] - ETA: 1:05 - loss: 1.6677 - regression_loss: 1.3840 - classification_loss: 0.2837 240/500 [=============>................] - ETA: 1:05 - loss: 1.6703 - regression_loss: 1.3859 - classification_loss: 0.2844 241/500 [=============>................] - ETA: 1:05 - loss: 1.6696 - regression_loss: 1.3856 - classification_loss: 0.2840 242/500 [=============>................] - ETA: 1:04 - loss: 1.6706 - regression_loss: 1.3866 - classification_loss: 0.2840 243/500 [=============>................] - ETA: 1:04 - loss: 1.6720 - regression_loss: 1.3876 - classification_loss: 0.2844 244/500 [=============>................] - ETA: 1:04 - loss: 1.6741 - regression_loss: 1.3887 - classification_loss: 0.2854 245/500 [=============>................] - ETA: 1:04 - loss: 1.6710 - regression_loss: 1.3863 - classification_loss: 0.2847 246/500 [=============>................] - ETA: 1:03 - loss: 1.6677 - regression_loss: 1.3838 - classification_loss: 0.2839 247/500 [=============>................] - ETA: 1:03 - loss: 1.6693 - regression_loss: 1.3857 - classification_loss: 0.2837 248/500 [=============>................] - ETA: 1:03 - loss: 1.6703 - regression_loss: 1.3867 - classification_loss: 0.2836 249/500 [=============>................] - ETA: 1:03 - loss: 1.6719 - regression_loss: 1.3883 - classification_loss: 0.2837 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6714 - regression_loss: 1.3880 - classification_loss: 0.2834 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6722 - regression_loss: 1.3887 - classification_loss: 0.2835 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6731 - regression_loss: 1.3895 - classification_loss: 0.2836 253/500 [==============>...............] - ETA: 1:02 - loss: 1.6733 - regression_loss: 1.3893 - classification_loss: 0.2840 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.6726 - regression_loss: 1.3888 - classification_loss: 0.2838 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6717 - regression_loss: 1.3881 - classification_loss: 0.2836 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6712 - regression_loss: 1.3879 - classification_loss: 0.2833 257/500 [==============>...............] - ETA: 1:01 - loss: 1.6698 - regression_loss: 1.3869 - classification_loss: 0.2829 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6711 - regression_loss: 1.3879 - classification_loss: 0.2833 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6697 - regression_loss: 1.3866 - classification_loss: 0.2831 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6700 - regression_loss: 1.3869 - classification_loss: 0.2831 261/500 [==============>...............] - ETA: 1:00 - loss: 1.6692 - regression_loss: 1.3858 - classification_loss: 0.2834 262/500 [==============>...............] - ETA: 59s - loss: 1.6704 - regression_loss: 1.3866 - classification_loss: 0.2838  263/500 [==============>...............] - ETA: 59s - loss: 1.6704 - regression_loss: 1.3867 - classification_loss: 0.2837 264/500 [==============>...............] - ETA: 59s - loss: 1.6710 - regression_loss: 1.3872 - classification_loss: 0.2838 265/500 [==============>...............] - ETA: 59s - loss: 1.6707 - regression_loss: 1.3866 - classification_loss: 0.2841 266/500 [==============>...............] - ETA: 58s - loss: 1.6727 - regression_loss: 1.3884 - classification_loss: 0.2843 267/500 [===============>..............] - ETA: 58s - loss: 1.6701 - regression_loss: 1.3858 - classification_loss: 0.2843 268/500 [===============>..............] - ETA: 58s - loss: 1.6708 - regression_loss: 1.3864 - classification_loss: 0.2844 269/500 [===============>..............] - ETA: 58s - loss: 1.6693 - regression_loss: 1.3855 - classification_loss: 0.2838 270/500 [===============>..............] 
- ETA: 57s - loss: 1.6683 - regression_loss: 1.3848 - classification_loss: 0.2835 271/500 [===============>..............] - ETA: 57s - loss: 1.6650 - regression_loss: 1.3823 - classification_loss: 0.2827 272/500 [===============>..............] - ETA: 57s - loss: 1.6632 - regression_loss: 1.3808 - classification_loss: 0.2823 273/500 [===============>..............] - ETA: 57s - loss: 1.6616 - regression_loss: 1.3796 - classification_loss: 0.2820 274/500 [===============>..............] - ETA: 56s - loss: 1.6624 - regression_loss: 1.3802 - classification_loss: 0.2822 275/500 [===============>..............] - ETA: 56s - loss: 1.6618 - regression_loss: 1.3798 - classification_loss: 0.2820 276/500 [===============>..............] - ETA: 56s - loss: 1.6603 - regression_loss: 1.3788 - classification_loss: 0.2815 277/500 [===============>..............] - ETA: 56s - loss: 1.6602 - regression_loss: 1.3783 - classification_loss: 0.2819 278/500 [===============>..............] - ETA: 55s - loss: 1.6592 - regression_loss: 1.3776 - classification_loss: 0.2816 279/500 [===============>..............] - ETA: 55s - loss: 1.6596 - regression_loss: 1.3782 - classification_loss: 0.2814 280/500 [===============>..............] - ETA: 55s - loss: 1.6608 - regression_loss: 1.3794 - classification_loss: 0.2815 281/500 [===============>..............] - ETA: 55s - loss: 1.6619 - regression_loss: 1.3804 - classification_loss: 0.2815 282/500 [===============>..............] - ETA: 54s - loss: 1.6609 - regression_loss: 1.3796 - classification_loss: 0.2814 283/500 [===============>..............] - ETA: 54s - loss: 1.6614 - regression_loss: 1.3800 - classification_loss: 0.2814 284/500 [================>.............] - ETA: 54s - loss: 1.6627 - regression_loss: 1.3812 - classification_loss: 0.2815 285/500 [================>.............] - ETA: 54s - loss: 1.6646 - regression_loss: 1.3830 - classification_loss: 0.2816 286/500 [================>.............] 
- ETA: 53s - loss: 1.6648 - regression_loss: 1.3835 - classification_loss: 0.2813 287/500 [================>.............] - ETA: 53s - loss: 1.6659 - regression_loss: 1.3846 - classification_loss: 0.2813 288/500 [================>.............] - ETA: 53s - loss: 1.6655 - regression_loss: 1.3844 - classification_loss: 0.2811 289/500 [================>.............] - ETA: 53s - loss: 1.6657 - regression_loss: 1.3846 - classification_loss: 0.2812 290/500 [================>.............] - ETA: 52s - loss: 1.6657 - regression_loss: 1.3845 - classification_loss: 0.2812 291/500 [================>.............] - ETA: 52s - loss: 1.6681 - regression_loss: 1.3864 - classification_loss: 0.2817 292/500 [================>.............] - ETA: 52s - loss: 1.6678 - regression_loss: 1.3862 - classification_loss: 0.2816 293/500 [================>.............] - ETA: 52s - loss: 1.6682 - regression_loss: 1.3866 - classification_loss: 0.2816 294/500 [================>.............] - ETA: 51s - loss: 1.6691 - regression_loss: 1.3874 - classification_loss: 0.2816 295/500 [================>.............] - ETA: 51s - loss: 1.6665 - regression_loss: 1.3854 - classification_loss: 0.2811 296/500 [================>.............] - ETA: 51s - loss: 1.6669 - regression_loss: 1.3857 - classification_loss: 0.2812 297/500 [================>.............] - ETA: 51s - loss: 1.6669 - regression_loss: 1.3859 - classification_loss: 0.2811 298/500 [================>.............] - ETA: 50s - loss: 1.6679 - regression_loss: 1.3866 - classification_loss: 0.2813 299/500 [================>.............] - ETA: 50s - loss: 1.6677 - regression_loss: 1.3864 - classification_loss: 0.2812 300/500 [=================>............] - ETA: 50s - loss: 1.6709 - regression_loss: 1.3883 - classification_loss: 0.2826 301/500 [=================>............] - ETA: 50s - loss: 1.6720 - regression_loss: 1.3891 - classification_loss: 0.2828 302/500 [=================>............] 
- ETA: 49s - loss: 1.6724 - regression_loss: 1.3894 - classification_loss: 0.2830 303/500 [=================>............] - ETA: 49s - loss: 1.6708 - regression_loss: 1.3882 - classification_loss: 0.2826 304/500 [=================>............] - ETA: 49s - loss: 1.6717 - regression_loss: 1.3886 - classification_loss: 0.2831 305/500 [=================>............] - ETA: 49s - loss: 1.6718 - regression_loss: 1.3886 - classification_loss: 0.2832 306/500 [=================>............] - ETA: 48s - loss: 1.6718 - regression_loss: 1.3886 - classification_loss: 0.2832 307/500 [=================>............] - ETA: 48s - loss: 1.6717 - regression_loss: 1.3892 - classification_loss: 0.2825 308/500 [=================>............] - ETA: 48s - loss: 1.6722 - regression_loss: 1.3899 - classification_loss: 0.2823 309/500 [=================>............] - ETA: 48s - loss: 1.6738 - regression_loss: 1.3911 - classification_loss: 0.2827 310/500 [=================>............] - ETA: 47s - loss: 1.6747 - regression_loss: 1.3919 - classification_loss: 0.2828 311/500 [=================>............] - ETA: 47s - loss: 1.6730 - regression_loss: 1.3906 - classification_loss: 0.2824 312/500 [=================>............] - ETA: 47s - loss: 1.6740 - regression_loss: 1.3908 - classification_loss: 0.2832 313/500 [=================>............] - ETA: 47s - loss: 1.6751 - regression_loss: 1.3917 - classification_loss: 0.2834 314/500 [=================>............] - ETA: 46s - loss: 1.6749 - regression_loss: 1.3918 - classification_loss: 0.2831 315/500 [=================>............] - ETA: 46s - loss: 1.6750 - regression_loss: 1.3917 - classification_loss: 0.2833 316/500 [=================>............] - ETA: 46s - loss: 1.6770 - regression_loss: 1.3931 - classification_loss: 0.2840 317/500 [==================>...........] - ETA: 46s - loss: 1.6759 - regression_loss: 1.3923 - classification_loss: 0.2837 318/500 [==================>...........] 
- ETA: 45s - loss: 1.6768 - regression_loss: 1.3930 - classification_loss: 0.2838 319/500 [==================>...........] - ETA: 45s - loss: 1.6774 - regression_loss: 1.3934 - classification_loss: 0.2840 320/500 [==================>...........] - ETA: 45s - loss: 1.6759 - regression_loss: 1.3919 - classification_loss: 0.2839 321/500 [==================>...........] - ETA: 45s - loss: 1.6757 - regression_loss: 1.3917 - classification_loss: 0.2841 322/500 [==================>...........] - ETA: 44s - loss: 1.6755 - regression_loss: 1.3912 - classification_loss: 0.2843 323/500 [==================>...........] - ETA: 44s - loss: 1.6758 - regression_loss: 1.3913 - classification_loss: 0.2845 324/500 [==================>...........] - ETA: 44s - loss: 1.6763 - regression_loss: 1.3917 - classification_loss: 0.2846 325/500 [==================>...........] - ETA: 43s - loss: 1.6754 - regression_loss: 1.3910 - classification_loss: 0.2845 326/500 [==================>...........] - ETA: 43s - loss: 1.6755 - regression_loss: 1.3912 - classification_loss: 0.2843 327/500 [==================>...........] - ETA: 43s - loss: 1.6752 - regression_loss: 1.3911 - classification_loss: 0.2841 328/500 [==================>...........] - ETA: 43s - loss: 1.6758 - regression_loss: 1.3917 - classification_loss: 0.2842 329/500 [==================>...........] - ETA: 42s - loss: 1.6736 - regression_loss: 1.3899 - classification_loss: 0.2837 330/500 [==================>...........] - ETA: 42s - loss: 1.6713 - regression_loss: 1.3881 - classification_loss: 0.2832 331/500 [==================>...........] - ETA: 42s - loss: 1.6718 - regression_loss: 1.3886 - classification_loss: 0.2832 332/500 [==================>...........] - ETA: 42s - loss: 1.6691 - regression_loss: 1.3864 - classification_loss: 0.2827 333/500 [==================>...........] - ETA: 41s - loss: 1.6690 - regression_loss: 1.3864 - classification_loss: 0.2827 334/500 [===================>..........] 
- ETA: 41s - loss: 1.6690 - regression_loss: 1.3863 - classification_loss: 0.2827 335/500 [===================>..........] - ETA: 41s - loss: 1.6702 - regression_loss: 1.3870 - classification_loss: 0.2832 336/500 [===================>..........] - ETA: 41s - loss: 1.6701 - regression_loss: 1.3870 - classification_loss: 0.2831 337/500 [===================>..........] - ETA: 40s - loss: 1.6707 - regression_loss: 1.3875 - classification_loss: 0.2832 338/500 [===================>..........] - ETA: 40s - loss: 1.6710 - regression_loss: 1.3878 - classification_loss: 0.2832 339/500 [===================>..........] - ETA: 40s - loss: 1.6729 - regression_loss: 1.3895 - classification_loss: 0.2834 340/500 [===================>..........] - ETA: 40s - loss: 1.6735 - regression_loss: 1.3902 - classification_loss: 0.2833 341/500 [===================>..........] - ETA: 39s - loss: 1.6714 - regression_loss: 1.3884 - classification_loss: 0.2830 342/500 [===================>..........] - ETA: 39s - loss: 1.6715 - regression_loss: 1.3885 - classification_loss: 0.2831 343/500 [===================>..........] - ETA: 39s - loss: 1.6722 - regression_loss: 1.3890 - classification_loss: 0.2831 344/500 [===================>..........] - ETA: 39s - loss: 1.6715 - regression_loss: 1.3886 - classification_loss: 0.2829 345/500 [===================>..........] - ETA: 38s - loss: 1.6717 - regression_loss: 1.3889 - classification_loss: 0.2828 346/500 [===================>..........] - ETA: 38s - loss: 1.6717 - regression_loss: 1.3887 - classification_loss: 0.2830 347/500 [===================>..........] - ETA: 38s - loss: 1.6716 - regression_loss: 1.3887 - classification_loss: 0.2829 348/500 [===================>..........] - ETA: 38s - loss: 1.6723 - regression_loss: 1.3893 - classification_loss: 0.2830 349/500 [===================>..........] - ETA: 37s - loss: 1.6719 - regression_loss: 1.3889 - classification_loss: 0.2829 350/500 [====================>.........] 
- ETA: 37s - loss: 1.6730 - regression_loss: 1.3899 - classification_loss: 0.2831 351/500 [====================>.........] - ETA: 37s - loss: 1.6709 - regression_loss: 1.3880 - classification_loss: 0.2829 352/500 [====================>.........] - ETA: 37s - loss: 1.6708 - regression_loss: 1.3879 - classification_loss: 0.2830 353/500 [====================>.........] - ETA: 36s - loss: 1.6709 - regression_loss: 1.3879 - classification_loss: 0.2830 354/500 [====================>.........] - ETA: 36s - loss: 1.6714 - regression_loss: 1.3882 - classification_loss: 0.2832 355/500 [====================>.........] - ETA: 36s - loss: 1.6697 - regression_loss: 1.3870 - classification_loss: 0.2827 356/500 [====================>.........] - ETA: 36s - loss: 1.6687 - regression_loss: 1.3862 - classification_loss: 0.2825 357/500 [====================>.........] - ETA: 35s - loss: 1.6695 - regression_loss: 1.3869 - classification_loss: 0.2825 358/500 [====================>.........] - ETA: 35s - loss: 1.6703 - regression_loss: 1.3875 - classification_loss: 0.2828 359/500 [====================>.........] - ETA: 35s - loss: 1.6708 - regression_loss: 1.3878 - classification_loss: 0.2830 360/500 [====================>.........] - ETA: 35s - loss: 1.6719 - regression_loss: 1.3887 - classification_loss: 0.2832 361/500 [====================>.........] - ETA: 34s - loss: 1.6719 - regression_loss: 1.3890 - classification_loss: 0.2829 362/500 [====================>.........] - ETA: 34s - loss: 1.6716 - regression_loss: 1.3888 - classification_loss: 0.2827 363/500 [====================>.........] - ETA: 34s - loss: 1.6706 - regression_loss: 1.3881 - classification_loss: 0.2825 364/500 [====================>.........] - ETA: 34s - loss: 1.6705 - regression_loss: 1.3881 - classification_loss: 0.2825 365/500 [====================>.........] - ETA: 33s - loss: 1.6699 - regression_loss: 1.3877 - classification_loss: 0.2822 366/500 [====================>.........] 
- ETA: 33s - loss: 1.6719 - regression_loss: 1.3895 - classification_loss: 0.2824 367/500 [=====================>........] - ETA: 33s - loss: 1.6724 - regression_loss: 1.3897 - classification_loss: 0.2828 368/500 [=====================>........] - ETA: 33s - loss: 1.6711 - regression_loss: 1.3886 - classification_loss: 0.2825 369/500 [=====================>........] - ETA: 32s - loss: 1.6689 - regression_loss: 1.3869 - classification_loss: 0.2821 370/500 [=====================>........] - ETA: 32s - loss: 1.6698 - regression_loss: 1.3876 - classification_loss: 0.2822 371/500 [=====================>........] - ETA: 32s - loss: 1.6699 - regression_loss: 1.3875 - classification_loss: 0.2824 372/500 [=====================>........] - ETA: 32s - loss: 1.6701 - regression_loss: 1.3876 - classification_loss: 0.2824 373/500 [=====================>........] - ETA: 31s - loss: 1.6689 - regression_loss: 1.3867 - classification_loss: 0.2822 374/500 [=====================>........] - ETA: 31s - loss: 1.6686 - regression_loss: 1.3867 - classification_loss: 0.2819 375/500 [=====================>........] - ETA: 31s - loss: 1.6672 - regression_loss: 1.3855 - classification_loss: 0.2817 376/500 [=====================>........] - ETA: 31s - loss: 1.6676 - regression_loss: 1.3858 - classification_loss: 0.2818 377/500 [=====================>........] - ETA: 30s - loss: 1.6692 - regression_loss: 1.3869 - classification_loss: 0.2823 378/500 [=====================>........] - ETA: 30s - loss: 1.6686 - regression_loss: 1.3865 - classification_loss: 0.2821 379/500 [=====================>........] - ETA: 30s - loss: 1.6690 - regression_loss: 1.3869 - classification_loss: 0.2821 380/500 [=====================>........] - ETA: 30s - loss: 1.6687 - regression_loss: 1.3869 - classification_loss: 0.2818 381/500 [=====================>........] - ETA: 29s - loss: 1.6698 - regression_loss: 1.3878 - classification_loss: 0.2820 382/500 [=====================>........] 
- ETA: 29s - loss: 1.6710 - regression_loss: 1.3887 - classification_loss: 0.2823
...
500/500 [==============================] - 126s 251ms/step - loss: 1.6666 - regression_loss: 1.3852 - classification_loss: 0.2813
1172 instances of class plum with average precision: 0.6393
mAP: 0.6393
Epoch 00076: saving model to ./training/snapshots/resnet50_pascal_76.h5
Epoch 77/150
1/500 [..............................] - ETA: 1:56 - loss: 2.1576 - regression_loss: 1.7516 - classification_loss: 0.4060
...
216/500 [===========>..................] - ETA: 1:10 - loss: 1.6555 - regression_loss: 1.3776 - classification_loss: 0.2778 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.6611 - regression_loss: 1.3826 - classification_loss: 0.2785 218/500 [============>.................] - ETA: 1:10 - loss: 1.6605 - regression_loss: 1.3823 - classification_loss: 0.2783 219/500 [============>.................] - ETA: 1:10 - loss: 1.6618 - regression_loss: 1.3831 - classification_loss: 0.2787 220/500 [============>.................] - ETA: 1:09 - loss: 1.6646 - regression_loss: 1.3852 - classification_loss: 0.2794 221/500 [============>.................] - ETA: 1:09 - loss: 1.6661 - regression_loss: 1.3865 - classification_loss: 0.2796 222/500 [============>.................] - ETA: 1:09 - loss: 1.6670 - regression_loss: 1.3874 - classification_loss: 0.2795 223/500 [============>.................] - ETA: 1:09 - loss: 1.6659 - regression_loss: 1.3866 - classification_loss: 0.2793 224/500 [============>.................] - ETA: 1:08 - loss: 1.6672 - regression_loss: 1.3876 - classification_loss: 0.2797 225/500 [============>.................] - ETA: 1:08 - loss: 1.6666 - regression_loss: 1.3874 - classification_loss: 0.2793 226/500 [============>.................] - ETA: 1:08 - loss: 1.6627 - regression_loss: 1.3843 - classification_loss: 0.2784 227/500 [============>.................] - ETA: 1:08 - loss: 1.6656 - regression_loss: 1.3862 - classification_loss: 0.2794 228/500 [============>.................] - ETA: 1:07 - loss: 1.6632 - regression_loss: 1.3840 - classification_loss: 0.2792 229/500 [============>.................] - ETA: 1:07 - loss: 1.6653 - regression_loss: 1.3860 - classification_loss: 0.2793 230/500 [============>.................] - ETA: 1:07 - loss: 1.6656 - regression_loss: 1.3863 - classification_loss: 0.2793 231/500 [============>.................] - ETA: 1:07 - loss: 1.6641 - regression_loss: 1.3852 - classification_loss: 0.2789 232/500 [============>.................] - ETA: 1:06 - loss: 1.6646 - regression_loss: 1.3856 - classification_loss: 0.2790 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.6645 - regression_loss: 1.3855 - classification_loss: 0.2790 234/500 [=============>................] - ETA: 1:06 - loss: 1.6652 - regression_loss: 1.3863 - classification_loss: 0.2788 235/500 [=============>................] - ETA: 1:06 - loss: 1.6644 - regression_loss: 1.3858 - classification_loss: 0.2786 236/500 [=============>................] - ETA: 1:05 - loss: 1.6660 - regression_loss: 1.3873 - classification_loss: 0.2787 237/500 [=============>................] - ETA: 1:05 - loss: 1.6631 - regression_loss: 1.3845 - classification_loss: 0.2785 238/500 [=============>................] - ETA: 1:05 - loss: 1.6647 - regression_loss: 1.3859 - classification_loss: 0.2788 239/500 [=============>................] - ETA: 1:05 - loss: 1.6628 - regression_loss: 1.3844 - classification_loss: 0.2784 240/500 [=============>................] - ETA: 1:04 - loss: 1.6608 - regression_loss: 1.3823 - classification_loss: 0.2785 241/500 [=============>................] - ETA: 1:04 - loss: 1.6612 - regression_loss: 1.3830 - classification_loss: 0.2782 242/500 [=============>................] - ETA: 1:04 - loss: 1.6612 - regression_loss: 1.3830 - classification_loss: 0.2782 243/500 [=============>................] - ETA: 1:04 - loss: 1.6615 - regression_loss: 1.3829 - classification_loss: 0.2786 244/500 [=============>................] - ETA: 1:03 - loss: 1.6601 - regression_loss: 1.3818 - classification_loss: 0.2782 245/500 [=============>................] - ETA: 1:03 - loss: 1.6588 - regression_loss: 1.3808 - classification_loss: 0.2780 246/500 [=============>................] - ETA: 1:03 - loss: 1.6626 - regression_loss: 1.3837 - classification_loss: 0.2789 247/500 [=============>................] - ETA: 1:03 - loss: 1.6593 - regression_loss: 1.3810 - classification_loss: 0.2783 248/500 [=============>................] - ETA: 1:02 - loss: 1.6572 - regression_loss: 1.3794 - classification_loss: 0.2777 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.6547 - regression_loss: 1.3774 - classification_loss: 0.2773 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6541 - regression_loss: 1.3766 - classification_loss: 0.2774 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6542 - regression_loss: 1.3764 - classification_loss: 0.2778 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6518 - regression_loss: 1.3750 - classification_loss: 0.2768 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6516 - regression_loss: 1.3748 - classification_loss: 0.2768 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6528 - regression_loss: 1.3759 - classification_loss: 0.2768 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6531 - regression_loss: 1.3763 - classification_loss: 0.2768 256/500 [==============>...............] - ETA: 1:00 - loss: 1.6509 - regression_loss: 1.3745 - classification_loss: 0.2764 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6527 - regression_loss: 1.3761 - classification_loss: 0.2766 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6525 - regression_loss: 1.3759 - classification_loss: 0.2766 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6491 - regression_loss: 1.3729 - classification_loss: 0.2763 260/500 [==============>...............] - ETA: 59s - loss: 1.6506 - regression_loss: 1.3742 - classification_loss: 0.2764  261/500 [==============>...............] - ETA: 59s - loss: 1.6520 - regression_loss: 1.3748 - classification_loss: 0.2773 262/500 [==============>...............] - ETA: 59s - loss: 1.6502 - regression_loss: 1.3734 - classification_loss: 0.2768 263/500 [==============>...............] - ETA: 59s - loss: 1.6526 - regression_loss: 1.3747 - classification_loss: 0.2779 264/500 [==============>...............] - ETA: 58s - loss: 1.6547 - regression_loss: 1.3765 - classification_loss: 0.2782 265/500 [==============>...............] 
- ETA: 58s - loss: 1.6587 - regression_loss: 1.3794 - classification_loss: 0.2793 266/500 [==============>...............] - ETA: 58s - loss: 1.6599 - regression_loss: 1.3803 - classification_loss: 0.2796 267/500 [===============>..............] - ETA: 58s - loss: 1.6612 - regression_loss: 1.3814 - classification_loss: 0.2798 268/500 [===============>..............] - ETA: 57s - loss: 1.6611 - regression_loss: 1.3814 - classification_loss: 0.2797 269/500 [===============>..............] - ETA: 57s - loss: 1.6619 - regression_loss: 1.3822 - classification_loss: 0.2798 270/500 [===============>..............] - ETA: 57s - loss: 1.6622 - regression_loss: 1.3811 - classification_loss: 0.2812 271/500 [===============>..............] - ETA: 57s - loss: 1.6610 - regression_loss: 1.3801 - classification_loss: 0.2809 272/500 [===============>..............] - ETA: 56s - loss: 1.6610 - regression_loss: 1.3801 - classification_loss: 0.2810 273/500 [===============>..............] - ETA: 56s - loss: 1.6627 - regression_loss: 1.3813 - classification_loss: 0.2814 274/500 [===============>..............] - ETA: 56s - loss: 1.6630 - regression_loss: 1.3815 - classification_loss: 0.2815 275/500 [===============>..............] - ETA: 56s - loss: 1.6608 - regression_loss: 1.3799 - classification_loss: 0.2810 276/500 [===============>..............] - ETA: 55s - loss: 1.6616 - regression_loss: 1.3806 - classification_loss: 0.2810 277/500 [===============>..............] - ETA: 55s - loss: 1.6613 - regression_loss: 1.3804 - classification_loss: 0.2809 278/500 [===============>..............] - ETA: 55s - loss: 1.6622 - regression_loss: 1.3811 - classification_loss: 0.2810 279/500 [===============>..............] - ETA: 55s - loss: 1.6609 - regression_loss: 1.3804 - classification_loss: 0.2805 280/500 [===============>..............] - ETA: 54s - loss: 1.6592 - regression_loss: 1.3791 - classification_loss: 0.2800 281/500 [===============>..............] 
- ETA: 54s - loss: 1.6609 - regression_loss: 1.3799 - classification_loss: 0.2810 282/500 [===============>..............] - ETA: 54s - loss: 1.6614 - regression_loss: 1.3804 - classification_loss: 0.2810 283/500 [===============>..............] - ETA: 54s - loss: 1.6623 - regression_loss: 1.3812 - classification_loss: 0.2810 284/500 [================>.............] - ETA: 53s - loss: 1.6643 - regression_loss: 1.3829 - classification_loss: 0.2814 285/500 [================>.............] - ETA: 53s - loss: 1.6645 - regression_loss: 1.3832 - classification_loss: 0.2813 286/500 [================>.............] - ETA: 53s - loss: 1.6634 - regression_loss: 1.3823 - classification_loss: 0.2811 287/500 [================>.............] - ETA: 53s - loss: 1.6636 - regression_loss: 1.3825 - classification_loss: 0.2811 288/500 [================>.............] - ETA: 52s - loss: 1.6623 - regression_loss: 1.3816 - classification_loss: 0.2807 289/500 [================>.............] - ETA: 52s - loss: 1.6634 - regression_loss: 1.3827 - classification_loss: 0.2806 290/500 [================>.............] - ETA: 52s - loss: 1.6645 - regression_loss: 1.3837 - classification_loss: 0.2808 291/500 [================>.............] - ETA: 52s - loss: 1.6625 - regression_loss: 1.3820 - classification_loss: 0.2804 292/500 [================>.............] - ETA: 52s - loss: 1.6630 - regression_loss: 1.3825 - classification_loss: 0.2805 293/500 [================>.............] - ETA: 51s - loss: 1.6607 - regression_loss: 1.3806 - classification_loss: 0.2801 294/500 [================>.............] - ETA: 51s - loss: 1.6605 - regression_loss: 1.3805 - classification_loss: 0.2800 295/500 [================>.............] - ETA: 51s - loss: 1.6620 - regression_loss: 1.3817 - classification_loss: 0.2803 296/500 [================>.............] - ETA: 50s - loss: 1.6622 - regression_loss: 1.3818 - classification_loss: 0.2804 297/500 [================>.............] 
- ETA: 50s - loss: 1.6643 - regression_loss: 1.3834 - classification_loss: 0.2808 298/500 [================>.............] - ETA: 50s - loss: 1.6648 - regression_loss: 1.3839 - classification_loss: 0.2809 299/500 [================>.............] - ETA: 50s - loss: 1.6637 - regression_loss: 1.3831 - classification_loss: 0.2806 300/500 [=================>............] - ETA: 49s - loss: 1.6611 - regression_loss: 1.3808 - classification_loss: 0.2803 301/500 [=================>............] - ETA: 49s - loss: 1.6606 - regression_loss: 1.3805 - classification_loss: 0.2800 302/500 [=================>............] - ETA: 49s - loss: 1.6605 - regression_loss: 1.3803 - classification_loss: 0.2802 303/500 [=================>............] - ETA: 49s - loss: 1.6623 - regression_loss: 1.3817 - classification_loss: 0.2806 304/500 [=================>............] - ETA: 49s - loss: 1.6622 - regression_loss: 1.3816 - classification_loss: 0.2806 305/500 [=================>............] - ETA: 48s - loss: 1.6632 - regression_loss: 1.3824 - classification_loss: 0.2808 306/500 [=================>............] - ETA: 48s - loss: 1.6642 - regression_loss: 1.3832 - classification_loss: 0.2809 307/500 [=================>............] - ETA: 48s - loss: 1.6621 - regression_loss: 1.3816 - classification_loss: 0.2805 308/500 [=================>............] - ETA: 48s - loss: 1.6599 - regression_loss: 1.3797 - classification_loss: 0.2801 309/500 [=================>............] - ETA: 47s - loss: 1.6607 - regression_loss: 1.3806 - classification_loss: 0.2801 310/500 [=================>............] - ETA: 47s - loss: 1.6616 - regression_loss: 1.3814 - classification_loss: 0.2802 311/500 [=================>............] - ETA: 47s - loss: 1.6626 - regression_loss: 1.3823 - classification_loss: 0.2804 312/500 [=================>............] - ETA: 47s - loss: 1.6638 - regression_loss: 1.3833 - classification_loss: 0.2805 313/500 [=================>............] 
- ETA: 46s - loss: 1.6631 - regression_loss: 1.3828 - classification_loss: 0.2803 314/500 [=================>............] - ETA: 46s - loss: 1.6663 - regression_loss: 1.3850 - classification_loss: 0.2813 315/500 [=================>............] - ETA: 46s - loss: 1.6673 - regression_loss: 1.3857 - classification_loss: 0.2816 316/500 [=================>............] - ETA: 46s - loss: 1.6680 - regression_loss: 1.3864 - classification_loss: 0.2816 317/500 [==================>...........] - ETA: 45s - loss: 1.6679 - regression_loss: 1.3864 - classification_loss: 0.2815 318/500 [==================>...........] - ETA: 45s - loss: 1.6694 - regression_loss: 1.3879 - classification_loss: 0.2815 319/500 [==================>...........] - ETA: 45s - loss: 1.6711 - regression_loss: 1.3890 - classification_loss: 0.2821 320/500 [==================>...........] - ETA: 45s - loss: 1.6707 - regression_loss: 1.3887 - classification_loss: 0.2820 321/500 [==================>...........] - ETA: 44s - loss: 1.6718 - regression_loss: 1.3896 - classification_loss: 0.2821 322/500 [==================>...........] - ETA: 44s - loss: 1.6679 - regression_loss: 1.3863 - classification_loss: 0.2816 323/500 [==================>...........] - ETA: 44s - loss: 1.6718 - regression_loss: 1.3897 - classification_loss: 0.2822 324/500 [==================>...........] - ETA: 44s - loss: 1.6722 - regression_loss: 1.3899 - classification_loss: 0.2823 325/500 [==================>...........] - ETA: 43s - loss: 1.6716 - regression_loss: 1.3897 - classification_loss: 0.2819 326/500 [==================>...........] - ETA: 43s - loss: 1.6703 - regression_loss: 1.3886 - classification_loss: 0.2817 327/500 [==================>...........] - ETA: 43s - loss: 1.6709 - regression_loss: 1.3891 - classification_loss: 0.2818 328/500 [==================>...........] - ETA: 43s - loss: 1.6686 - regression_loss: 1.3872 - classification_loss: 0.2813 329/500 [==================>...........] 
- ETA: 42s - loss: 1.6676 - regression_loss: 1.3864 - classification_loss: 0.2812 330/500 [==================>...........] - ETA: 42s - loss: 1.6672 - regression_loss: 1.3861 - classification_loss: 0.2811 331/500 [==================>...........] - ETA: 42s - loss: 1.6654 - regression_loss: 1.3843 - classification_loss: 0.2811 332/500 [==================>...........] - ETA: 42s - loss: 1.6693 - regression_loss: 1.3873 - classification_loss: 0.2820 333/500 [==================>...........] - ETA: 41s - loss: 1.6675 - regression_loss: 1.3861 - classification_loss: 0.2814 334/500 [===================>..........] - ETA: 41s - loss: 1.6672 - regression_loss: 1.3857 - classification_loss: 0.2815 335/500 [===================>..........] - ETA: 41s - loss: 1.6686 - regression_loss: 1.3868 - classification_loss: 0.2818 336/500 [===================>..........] - ETA: 41s - loss: 1.6667 - regression_loss: 1.3853 - classification_loss: 0.2813 337/500 [===================>..........] - ETA: 40s - loss: 1.6680 - regression_loss: 1.3863 - classification_loss: 0.2816 338/500 [===================>..........] - ETA: 40s - loss: 1.6678 - regression_loss: 1.3861 - classification_loss: 0.2817 339/500 [===================>..........] - ETA: 40s - loss: 1.6651 - regression_loss: 1.3841 - classification_loss: 0.2810 340/500 [===================>..........] - ETA: 40s - loss: 1.6658 - regression_loss: 1.3847 - classification_loss: 0.2811 341/500 [===================>..........] - ETA: 39s - loss: 1.6676 - regression_loss: 1.3863 - classification_loss: 0.2814 342/500 [===================>..........] - ETA: 39s - loss: 1.6666 - regression_loss: 1.3854 - classification_loss: 0.2811 343/500 [===================>..........] - ETA: 39s - loss: 1.6664 - regression_loss: 1.3851 - classification_loss: 0.2812 344/500 [===================>..........] - ETA: 39s - loss: 1.6648 - regression_loss: 1.3839 - classification_loss: 0.2809 345/500 [===================>..........] 
- ETA: 38s - loss: 1.6650 - regression_loss: 1.3841 - classification_loss: 0.2809 346/500 [===================>..........] - ETA: 38s - loss: 1.6728 - regression_loss: 1.3871 - classification_loss: 0.2858 347/500 [===================>..........] - ETA: 38s - loss: 1.6740 - regression_loss: 1.3882 - classification_loss: 0.2858 348/500 [===================>..........] - ETA: 38s - loss: 1.6753 - regression_loss: 1.3892 - classification_loss: 0.2861 349/500 [===================>..........] - ETA: 37s - loss: 1.6755 - regression_loss: 1.3894 - classification_loss: 0.2860 350/500 [====================>.........] - ETA: 37s - loss: 1.6759 - regression_loss: 1.3898 - classification_loss: 0.2861 351/500 [====================>.........] - ETA: 37s - loss: 1.6744 - regression_loss: 1.3885 - classification_loss: 0.2859 352/500 [====================>.........] - ETA: 37s - loss: 1.6744 - regression_loss: 1.3885 - classification_loss: 0.2859 353/500 [====================>.........] - ETA: 36s - loss: 1.6746 - regression_loss: 1.3888 - classification_loss: 0.2859 354/500 [====================>.........] - ETA: 36s - loss: 1.6741 - regression_loss: 1.3884 - classification_loss: 0.2857 355/500 [====================>.........] - ETA: 36s - loss: 1.6747 - regression_loss: 1.3890 - classification_loss: 0.2857 356/500 [====================>.........] - ETA: 36s - loss: 1.6747 - regression_loss: 1.3890 - classification_loss: 0.2857 357/500 [====================>.........] - ETA: 35s - loss: 1.6751 - regression_loss: 1.3895 - classification_loss: 0.2856 358/500 [====================>.........] - ETA: 35s - loss: 1.6766 - regression_loss: 1.3908 - classification_loss: 0.2859 359/500 [====================>.........] - ETA: 35s - loss: 1.6758 - regression_loss: 1.3895 - classification_loss: 0.2863 360/500 [====================>.........] - ETA: 35s - loss: 1.6738 - regression_loss: 1.3876 - classification_loss: 0.2862 361/500 [====================>.........] 
- ETA: 34s - loss: 1.6736 - regression_loss: 1.3875 - classification_loss: 0.2861 362/500 [====================>.........] - ETA: 34s - loss: 1.6734 - regression_loss: 1.3875 - classification_loss: 0.2860 363/500 [====================>.........] - ETA: 34s - loss: 1.6730 - regression_loss: 1.3872 - classification_loss: 0.2857 364/500 [====================>.........] - ETA: 34s - loss: 1.6738 - regression_loss: 1.3881 - classification_loss: 0.2857 365/500 [====================>.........] - ETA: 33s - loss: 1.6752 - regression_loss: 1.3891 - classification_loss: 0.2861 366/500 [====================>.........] - ETA: 33s - loss: 1.6756 - regression_loss: 1.3896 - classification_loss: 0.2861 367/500 [=====================>........] - ETA: 33s - loss: 1.6723 - regression_loss: 1.3868 - classification_loss: 0.2855 368/500 [=====================>........] - ETA: 33s - loss: 1.6728 - regression_loss: 1.3871 - classification_loss: 0.2857 369/500 [=====================>........] - ETA: 32s - loss: 1.6739 - regression_loss: 1.3879 - classification_loss: 0.2860 370/500 [=====================>........] - ETA: 32s - loss: 1.6735 - regression_loss: 1.3877 - classification_loss: 0.2858 371/500 [=====================>........] - ETA: 32s - loss: 1.6744 - regression_loss: 1.3887 - classification_loss: 0.2857 372/500 [=====================>........] - ETA: 32s - loss: 1.6742 - regression_loss: 1.3888 - classification_loss: 0.2854 373/500 [=====================>........] - ETA: 31s - loss: 1.6745 - regression_loss: 1.3891 - classification_loss: 0.2854 374/500 [=====================>........] - ETA: 31s - loss: 1.6741 - regression_loss: 1.3889 - classification_loss: 0.2852 375/500 [=====================>........] - ETA: 31s - loss: 1.6764 - regression_loss: 1.3911 - classification_loss: 0.2853 376/500 [=====================>........] - ETA: 30s - loss: 1.6746 - regression_loss: 1.3897 - classification_loss: 0.2849 377/500 [=====================>........] 
- ETA: 30s - loss: 1.6756 - regression_loss: 1.3906 - classification_loss: 0.2850 378/500 [=====================>........] - ETA: 30s - loss: 1.6754 - regression_loss: 1.3902 - classification_loss: 0.2852 379/500 [=====================>........] - ETA: 30s - loss: 1.6771 - regression_loss: 1.3915 - classification_loss: 0.2857 380/500 [=====================>........] - ETA: 29s - loss: 1.6746 - regression_loss: 1.3895 - classification_loss: 0.2851 381/500 [=====================>........] - ETA: 29s - loss: 1.6753 - regression_loss: 1.3901 - classification_loss: 0.2852 382/500 [=====================>........] - ETA: 29s - loss: 1.6758 - regression_loss: 1.3908 - classification_loss: 0.2850 383/500 [=====================>........] - ETA: 29s - loss: 1.6745 - regression_loss: 1.3897 - classification_loss: 0.2848 384/500 [======================>.......] - ETA: 28s - loss: 1.6740 - regression_loss: 1.3893 - classification_loss: 0.2847 385/500 [======================>.......] - ETA: 28s - loss: 1.6743 - regression_loss: 1.3899 - classification_loss: 0.2845 386/500 [======================>.......] - ETA: 28s - loss: 1.6747 - regression_loss: 1.3901 - classification_loss: 0.2846 387/500 [======================>.......] - ETA: 28s - loss: 1.6735 - regression_loss: 1.3891 - classification_loss: 0.2844 388/500 [======================>.......] - ETA: 27s - loss: 1.6713 - regression_loss: 1.3875 - classification_loss: 0.2838 389/500 [======================>.......] - ETA: 27s - loss: 1.6718 - regression_loss: 1.3879 - classification_loss: 0.2839 390/500 [======================>.......] - ETA: 27s - loss: 1.6721 - regression_loss: 1.3879 - classification_loss: 0.2842 391/500 [======================>.......] - ETA: 27s - loss: 1.6702 - regression_loss: 1.3864 - classification_loss: 0.2838 392/500 [======================>.......] - ETA: 26s - loss: 1.6696 - regression_loss: 1.3857 - classification_loss: 0.2838 393/500 [======================>.......] 
- ETA: 26s - loss: 1.6669 - regression_loss: 1.3835 - classification_loss: 0.2833 394/500 [======================>.......] - ETA: 26s - loss: 1.6652 - regression_loss: 1.3821 - classification_loss: 0.2831 395/500 [======================>.......] - ETA: 26s - loss: 1.6654 - regression_loss: 1.3824 - classification_loss: 0.2830 396/500 [======================>.......] - ETA: 25s - loss: 1.6672 - regression_loss: 1.3838 - classification_loss: 0.2834 397/500 [======================>.......] - ETA: 25s - loss: 1.6666 - regression_loss: 1.3834 - classification_loss: 0.2833 398/500 [======================>.......] - ETA: 25s - loss: 1.6649 - regression_loss: 1.3822 - classification_loss: 0.2828 399/500 [======================>.......] - ETA: 25s - loss: 1.6654 - regression_loss: 1.3826 - classification_loss: 0.2828 400/500 [=======================>......] - ETA: 25s - loss: 1.6652 - regression_loss: 1.3824 - classification_loss: 0.2828 401/500 [=======================>......] - ETA: 24s - loss: 1.6639 - regression_loss: 1.3815 - classification_loss: 0.2825 402/500 [=======================>......] - ETA: 24s - loss: 1.6647 - regression_loss: 1.3821 - classification_loss: 0.2826 403/500 [=======================>......] - ETA: 24s - loss: 1.6641 - regression_loss: 1.3815 - classification_loss: 0.2826 404/500 [=======================>......] - ETA: 24s - loss: 1.6639 - regression_loss: 1.3817 - classification_loss: 0.2823 405/500 [=======================>......] - ETA: 23s - loss: 1.6652 - regression_loss: 1.3824 - classification_loss: 0.2828 406/500 [=======================>......] - ETA: 23s - loss: 1.6637 - regression_loss: 1.3811 - classification_loss: 0.2825 407/500 [=======================>......] - ETA: 23s - loss: 1.6626 - regression_loss: 1.3802 - classification_loss: 0.2824 408/500 [=======================>......] - ETA: 23s - loss: 1.6639 - regression_loss: 1.3813 - classification_loss: 0.2826 409/500 [=======================>......] 
- ETA: 22s - loss: 1.6655 - regression_loss: 1.3828 - classification_loss: 0.2827 410/500 [=======================>......] - ETA: 22s - loss: 1.6657 - regression_loss: 1.3829 - classification_loss: 0.2828 411/500 [=======================>......] - ETA: 22s - loss: 1.6655 - regression_loss: 1.3827 - classification_loss: 0.2829 412/500 [=======================>......] - ETA: 22s - loss: 1.6660 - regression_loss: 1.3831 - classification_loss: 0.2829 413/500 [=======================>......] - ETA: 21s - loss: 1.6661 - regression_loss: 1.3831 - classification_loss: 0.2830 414/500 [=======================>......] - ETA: 21s - loss: 1.6658 - regression_loss: 1.3829 - classification_loss: 0.2828 415/500 [=======================>......] - ETA: 21s - loss: 1.6663 - regression_loss: 1.3834 - classification_loss: 0.2829 416/500 [=======================>......] - ETA: 21s - loss: 1.6645 - regression_loss: 1.3820 - classification_loss: 0.2826 417/500 [========================>.....] - ETA: 20s - loss: 1.6657 - regression_loss: 1.3829 - classification_loss: 0.2828 418/500 [========================>.....] - ETA: 20s - loss: 1.6642 - regression_loss: 1.3819 - classification_loss: 0.2824 419/500 [========================>.....] - ETA: 20s - loss: 1.6659 - regression_loss: 1.3832 - classification_loss: 0.2827 420/500 [========================>.....] - ETA: 20s - loss: 1.6651 - regression_loss: 1.3824 - classification_loss: 0.2827 421/500 [========================>.....] - ETA: 19s - loss: 1.6652 - regression_loss: 1.3825 - classification_loss: 0.2827 422/500 [========================>.....] - ETA: 19s - loss: 1.6639 - regression_loss: 1.3814 - classification_loss: 0.2825 423/500 [========================>.....] - ETA: 19s - loss: 1.6640 - regression_loss: 1.3816 - classification_loss: 0.2823 424/500 [========================>.....] - ETA: 19s - loss: 1.6634 - regression_loss: 1.3813 - classification_loss: 0.2822 425/500 [========================>.....] 
[Epoch 77 per-batch progress (steps 426–499) elided; terminal carriage-return residue]
500/500 [==============================] - 125s 250ms/step - loss: 1.6610 - regression_loss: 1.3797 - classification_loss: 0.2814
1172 instances of class plum with average precision: 0.6398
mAP: 0.6398
Epoch 00077: saving model to ./training/snapshots/resnet50_pascal_77.h5
Epoch 78/150
[Epoch 78 per-batch progress (steps 1–260) elided; terminal carriage-return residue]
- ETA: 1:00 - loss: 1.6747 - regression_loss: 1.3884 - classification_loss: 0.2862 261/500 [==============>...............] - ETA: 1:00 - loss: 1.6733 - regression_loss: 1.3874 - classification_loss: 0.2859 262/500 [==============>...............] - ETA: 1:00 - loss: 1.6750 - regression_loss: 1.3887 - classification_loss: 0.2863 263/500 [==============>...............] - ETA: 1:00 - loss: 1.6735 - regression_loss: 1.3873 - classification_loss: 0.2862 264/500 [==============>...............] - ETA: 59s - loss: 1.6749 - regression_loss: 1.3884 - classification_loss: 0.2865  265/500 [==============>...............] - ETA: 59s - loss: 1.6728 - regression_loss: 1.3866 - classification_loss: 0.2862 266/500 [==============>...............] - ETA: 59s - loss: 1.6738 - regression_loss: 1.3875 - classification_loss: 0.2863 267/500 [===============>..............] - ETA: 59s - loss: 1.6733 - regression_loss: 1.3872 - classification_loss: 0.2861 268/500 [===============>..............] - ETA: 58s - loss: 1.6726 - regression_loss: 1.3867 - classification_loss: 0.2859 269/500 [===============>..............] - ETA: 58s - loss: 1.6698 - regression_loss: 1.3843 - classification_loss: 0.2855 270/500 [===============>..............] - ETA: 58s - loss: 1.6684 - regression_loss: 1.3833 - classification_loss: 0.2851 271/500 [===============>..............] - ETA: 58s - loss: 1.6659 - regression_loss: 1.3814 - classification_loss: 0.2845 272/500 [===============>..............] - ETA: 57s - loss: 1.6663 - regression_loss: 1.3817 - classification_loss: 0.2846 273/500 [===============>..............] - ETA: 57s - loss: 1.6656 - regression_loss: 1.3813 - classification_loss: 0.2843 274/500 [===============>..............] - ETA: 57s - loss: 1.6663 - regression_loss: 1.3814 - classification_loss: 0.2850 275/500 [===============>..............] - ETA: 57s - loss: 1.6683 - regression_loss: 1.3831 - classification_loss: 0.2853 276/500 [===============>..............] 
- ETA: 56s - loss: 1.6689 - regression_loss: 1.3836 - classification_loss: 0.2853 277/500 [===============>..............] - ETA: 56s - loss: 1.6693 - regression_loss: 1.3839 - classification_loss: 0.2853 278/500 [===============>..............] - ETA: 56s - loss: 1.6703 - regression_loss: 1.3848 - classification_loss: 0.2855 279/500 [===============>..............] - ETA: 56s - loss: 1.6711 - regression_loss: 1.3855 - classification_loss: 0.2856 280/500 [===============>..............] - ETA: 55s - loss: 1.6689 - regression_loss: 1.3837 - classification_loss: 0.2852 281/500 [===============>..............] - ETA: 55s - loss: 1.6662 - regression_loss: 1.3812 - classification_loss: 0.2850 282/500 [===============>..............] - ETA: 55s - loss: 1.6655 - regression_loss: 1.3807 - classification_loss: 0.2848 283/500 [===============>..............] - ETA: 54s - loss: 1.6658 - regression_loss: 1.3809 - classification_loss: 0.2850 284/500 [================>.............] - ETA: 54s - loss: 1.6639 - regression_loss: 1.3793 - classification_loss: 0.2846 285/500 [================>.............] - ETA: 54s - loss: 1.6648 - regression_loss: 1.3802 - classification_loss: 0.2846 286/500 [================>.............] - ETA: 54s - loss: 1.6645 - regression_loss: 1.3801 - classification_loss: 0.2844 287/500 [================>.............] - ETA: 53s - loss: 1.6650 - regression_loss: 1.3802 - classification_loss: 0.2848 288/500 [================>.............] - ETA: 53s - loss: 1.6644 - regression_loss: 1.3798 - classification_loss: 0.2846 289/500 [================>.............] - ETA: 53s - loss: 1.6621 - regression_loss: 1.3781 - classification_loss: 0.2840 290/500 [================>.............] - ETA: 53s - loss: 1.6630 - regression_loss: 1.3787 - classification_loss: 0.2843 291/500 [================>.............] - ETA: 52s - loss: 1.6648 - regression_loss: 1.3802 - classification_loss: 0.2846 292/500 [================>.............] 
- ETA: 52s - loss: 1.6616 - regression_loss: 1.3774 - classification_loss: 0.2842 293/500 [================>.............] - ETA: 52s - loss: 1.6655 - regression_loss: 1.3802 - classification_loss: 0.2852 294/500 [================>.............] - ETA: 52s - loss: 1.6677 - regression_loss: 1.3820 - classification_loss: 0.2857 295/500 [================>.............] - ETA: 51s - loss: 1.6694 - regression_loss: 1.3834 - classification_loss: 0.2860 296/500 [================>.............] - ETA: 51s - loss: 1.6700 - regression_loss: 1.3841 - classification_loss: 0.2859 297/500 [================>.............] - ETA: 51s - loss: 1.6696 - regression_loss: 1.3840 - classification_loss: 0.2856 298/500 [================>.............] - ETA: 51s - loss: 1.6708 - regression_loss: 1.3848 - classification_loss: 0.2859 299/500 [================>.............] - ETA: 50s - loss: 1.6703 - regression_loss: 1.3845 - classification_loss: 0.2858 300/500 [=================>............] - ETA: 50s - loss: 1.6705 - regression_loss: 1.3848 - classification_loss: 0.2857 301/500 [=================>............] - ETA: 50s - loss: 1.6702 - regression_loss: 1.3845 - classification_loss: 0.2856 302/500 [=================>............] - ETA: 50s - loss: 1.6700 - regression_loss: 1.3845 - classification_loss: 0.2855 303/500 [=================>............] - ETA: 49s - loss: 1.6691 - regression_loss: 1.3836 - classification_loss: 0.2855 304/500 [=================>............] - ETA: 49s - loss: 1.6670 - regression_loss: 1.3816 - classification_loss: 0.2854 305/500 [=================>............] - ETA: 49s - loss: 1.6711 - regression_loss: 1.3853 - classification_loss: 0.2859 306/500 [=================>............] - ETA: 49s - loss: 1.6714 - regression_loss: 1.3855 - classification_loss: 0.2859 307/500 [=================>............] - ETA: 48s - loss: 1.6743 - regression_loss: 1.3880 - classification_loss: 0.2863 308/500 [=================>............] 
- ETA: 48s - loss: 1.6712 - regression_loss: 1.3855 - classification_loss: 0.2857 309/500 [=================>............] - ETA: 48s - loss: 1.6691 - regression_loss: 1.3839 - classification_loss: 0.2853 310/500 [=================>............] - ETA: 48s - loss: 1.6683 - regression_loss: 1.3828 - classification_loss: 0.2855 311/500 [=================>............] - ETA: 47s - loss: 1.6673 - regression_loss: 1.3821 - classification_loss: 0.2852 312/500 [=================>............] - ETA: 47s - loss: 1.6686 - regression_loss: 1.3834 - classification_loss: 0.2852 313/500 [=================>............] - ETA: 47s - loss: 1.6691 - regression_loss: 1.3838 - classification_loss: 0.2853 314/500 [=================>............] - ETA: 47s - loss: 1.6691 - regression_loss: 1.3836 - classification_loss: 0.2855 315/500 [=================>............] - ETA: 46s - loss: 1.6704 - regression_loss: 1.3848 - classification_loss: 0.2856 316/500 [=================>............] - ETA: 46s - loss: 1.6726 - regression_loss: 1.3865 - classification_loss: 0.2861 317/500 [==================>...........] - ETA: 46s - loss: 1.6720 - regression_loss: 1.3861 - classification_loss: 0.2859 318/500 [==================>...........] - ETA: 46s - loss: 1.6684 - regression_loss: 1.3831 - classification_loss: 0.2853 319/500 [==================>...........] - ETA: 45s - loss: 1.6688 - regression_loss: 1.3836 - classification_loss: 0.2852 320/500 [==================>...........] - ETA: 45s - loss: 1.6683 - regression_loss: 1.3833 - classification_loss: 0.2850 321/500 [==================>...........] - ETA: 45s - loss: 1.6651 - regression_loss: 1.3807 - classification_loss: 0.2844 322/500 [==================>...........] - ETA: 45s - loss: 1.6653 - regression_loss: 1.3811 - classification_loss: 0.2843 323/500 [==================>...........] - ETA: 44s - loss: 1.6653 - regression_loss: 1.3810 - classification_loss: 0.2843 324/500 [==================>...........] 
- ETA: 44s - loss: 1.6655 - regression_loss: 1.3812 - classification_loss: 0.2844 325/500 [==================>...........] - ETA: 44s - loss: 1.6649 - regression_loss: 1.3806 - classification_loss: 0.2843 326/500 [==================>...........] - ETA: 44s - loss: 1.6630 - regression_loss: 1.3791 - classification_loss: 0.2839 327/500 [==================>...........] - ETA: 43s - loss: 1.6630 - regression_loss: 1.3790 - classification_loss: 0.2840 328/500 [==================>...........] - ETA: 43s - loss: 1.6625 - regression_loss: 1.3787 - classification_loss: 0.2838 329/500 [==================>...........] - ETA: 43s - loss: 1.6605 - regression_loss: 1.3770 - classification_loss: 0.2835 330/500 [==================>...........] - ETA: 43s - loss: 1.6607 - regression_loss: 1.3772 - classification_loss: 0.2835 331/500 [==================>...........] - ETA: 42s - loss: 1.6601 - regression_loss: 1.3768 - classification_loss: 0.2833 332/500 [==================>...........] - ETA: 42s - loss: 1.6577 - regression_loss: 1.3749 - classification_loss: 0.2829 333/500 [==================>...........] - ETA: 42s - loss: 1.6597 - regression_loss: 1.3761 - classification_loss: 0.2836 334/500 [===================>..........] - ETA: 42s - loss: 1.6582 - regression_loss: 1.3750 - classification_loss: 0.2832 335/500 [===================>..........] - ETA: 41s - loss: 1.6600 - regression_loss: 1.3766 - classification_loss: 0.2834 336/500 [===================>..........] - ETA: 41s - loss: 1.6606 - regression_loss: 1.3771 - classification_loss: 0.2835 337/500 [===================>..........] - ETA: 41s - loss: 1.6587 - regression_loss: 1.3756 - classification_loss: 0.2831 338/500 [===================>..........] - ETA: 40s - loss: 1.6593 - regression_loss: 1.3757 - classification_loss: 0.2836 339/500 [===================>..........] - ETA: 40s - loss: 1.6585 - regression_loss: 1.3751 - classification_loss: 0.2834 340/500 [===================>..........] 
- ETA: 40s - loss: 1.6589 - regression_loss: 1.3757 - classification_loss: 0.2832 341/500 [===================>..........] - ETA: 40s - loss: 1.6589 - regression_loss: 1.3758 - classification_loss: 0.2832 342/500 [===================>..........] - ETA: 39s - loss: 1.6583 - regression_loss: 1.3753 - classification_loss: 0.2830 343/500 [===================>..........] - ETA: 39s - loss: 1.6579 - regression_loss: 1.3752 - classification_loss: 0.2826 344/500 [===================>..........] - ETA: 39s - loss: 1.6579 - regression_loss: 1.3754 - classification_loss: 0.2825 345/500 [===================>..........] - ETA: 39s - loss: 1.6578 - regression_loss: 1.3755 - classification_loss: 0.2824 346/500 [===================>..........] - ETA: 38s - loss: 1.6586 - regression_loss: 1.3764 - classification_loss: 0.2821 347/500 [===================>..........] - ETA: 38s - loss: 1.6594 - regression_loss: 1.3770 - classification_loss: 0.2824 348/500 [===================>..........] - ETA: 38s - loss: 1.6598 - regression_loss: 1.3774 - classification_loss: 0.2824 349/500 [===================>..........] - ETA: 38s - loss: 1.6612 - regression_loss: 1.3785 - classification_loss: 0.2826 350/500 [====================>.........] - ETA: 37s - loss: 1.6613 - regression_loss: 1.3788 - classification_loss: 0.2826 351/500 [====================>.........] - ETA: 37s - loss: 1.6611 - regression_loss: 1.3787 - classification_loss: 0.2825 352/500 [====================>.........] - ETA: 37s - loss: 1.6618 - regression_loss: 1.3792 - classification_loss: 0.2827 353/500 [====================>.........] - ETA: 37s - loss: 1.6618 - regression_loss: 1.3789 - classification_loss: 0.2829 354/500 [====================>.........] - ETA: 36s - loss: 1.6630 - regression_loss: 1.3799 - classification_loss: 0.2832 355/500 [====================>.........] - ETA: 36s - loss: 1.6642 - regression_loss: 1.3808 - classification_loss: 0.2834 356/500 [====================>.........] 
- ETA: 36s - loss: 1.6620 - regression_loss: 1.3790 - classification_loss: 0.2830 357/500 [====================>.........] - ETA: 36s - loss: 1.6622 - regression_loss: 1.3792 - classification_loss: 0.2830 358/500 [====================>.........] - ETA: 35s - loss: 1.6590 - regression_loss: 1.3764 - classification_loss: 0.2825 359/500 [====================>.........] - ETA: 35s - loss: 1.6585 - regression_loss: 1.3762 - classification_loss: 0.2824 360/500 [====================>.........] - ETA: 35s - loss: 1.6575 - regression_loss: 1.3751 - classification_loss: 0.2824 361/500 [====================>.........] - ETA: 35s - loss: 1.6569 - regression_loss: 1.3747 - classification_loss: 0.2822 362/500 [====================>.........] - ETA: 34s - loss: 1.6576 - regression_loss: 1.3753 - classification_loss: 0.2823 363/500 [====================>.........] - ETA: 34s - loss: 1.6552 - regression_loss: 1.3734 - classification_loss: 0.2818 364/500 [====================>.........] - ETA: 34s - loss: 1.6540 - regression_loss: 1.3726 - classification_loss: 0.2814 365/500 [====================>.........] - ETA: 34s - loss: 1.6525 - regression_loss: 1.3712 - classification_loss: 0.2812 366/500 [====================>.........] - ETA: 33s - loss: 1.6507 - regression_loss: 1.3699 - classification_loss: 0.2808 367/500 [=====================>........] - ETA: 33s - loss: 1.6515 - regression_loss: 1.3706 - classification_loss: 0.2809 368/500 [=====================>........] - ETA: 33s - loss: 1.6532 - regression_loss: 1.3722 - classification_loss: 0.2810 369/500 [=====================>........] - ETA: 33s - loss: 1.6529 - regression_loss: 1.3719 - classification_loss: 0.2810 370/500 [=====================>........] - ETA: 32s - loss: 1.6529 - regression_loss: 1.3717 - classification_loss: 0.2812 371/500 [=====================>........] - ETA: 32s - loss: 1.6531 - regression_loss: 1.3716 - classification_loss: 0.2815 372/500 [=====================>........] 
- ETA: 32s - loss: 1.6527 - regression_loss: 1.3714 - classification_loss: 0.2813 373/500 [=====================>........] - ETA: 32s - loss: 1.6497 - regression_loss: 1.3689 - classification_loss: 0.2807 374/500 [=====================>........] - ETA: 31s - loss: 1.6474 - regression_loss: 1.3671 - classification_loss: 0.2803 375/500 [=====================>........] - ETA: 31s - loss: 1.6464 - regression_loss: 1.3663 - classification_loss: 0.2801 376/500 [=====================>........] - ETA: 31s - loss: 1.6475 - regression_loss: 1.3673 - classification_loss: 0.2802 377/500 [=====================>........] - ETA: 31s - loss: 1.6484 - regression_loss: 1.3679 - classification_loss: 0.2806 378/500 [=====================>........] - ETA: 30s - loss: 1.6483 - regression_loss: 1.3678 - classification_loss: 0.2806 379/500 [=====================>........] - ETA: 30s - loss: 1.6485 - regression_loss: 1.3680 - classification_loss: 0.2806 380/500 [=====================>........] - ETA: 30s - loss: 1.6483 - regression_loss: 1.3674 - classification_loss: 0.2809 381/500 [=====================>........] - ETA: 30s - loss: 1.6479 - regression_loss: 1.3670 - classification_loss: 0.2809 382/500 [=====================>........] - ETA: 29s - loss: 1.6483 - regression_loss: 1.3674 - classification_loss: 0.2809 383/500 [=====================>........] - ETA: 29s - loss: 1.6490 - regression_loss: 1.3681 - classification_loss: 0.2809 384/500 [======================>.......] - ETA: 29s - loss: 1.6475 - regression_loss: 1.3669 - classification_loss: 0.2806 385/500 [======================>.......] - ETA: 29s - loss: 1.6480 - regression_loss: 1.3675 - classification_loss: 0.2805 386/500 [======================>.......] - ETA: 28s - loss: 1.6489 - regression_loss: 1.3683 - classification_loss: 0.2806 387/500 [======================>.......] - ETA: 28s - loss: 1.6499 - regression_loss: 1.3691 - classification_loss: 0.2808 388/500 [======================>.......] 
- ETA: 28s - loss: 1.6494 - regression_loss: 1.3688 - classification_loss: 0.2806 389/500 [======================>.......] - ETA: 28s - loss: 1.6516 - regression_loss: 1.3702 - classification_loss: 0.2815 390/500 [======================>.......] - ETA: 27s - loss: 1.6511 - regression_loss: 1.3699 - classification_loss: 0.2813 391/500 [======================>.......] - ETA: 27s - loss: 1.6516 - regression_loss: 1.3703 - classification_loss: 0.2813 392/500 [======================>.......] - ETA: 27s - loss: 1.6527 - regression_loss: 1.3711 - classification_loss: 0.2816 393/500 [======================>.......] - ETA: 27s - loss: 1.6525 - regression_loss: 1.3709 - classification_loss: 0.2816 394/500 [======================>.......] - ETA: 26s - loss: 1.6522 - regression_loss: 1.3705 - classification_loss: 0.2816 395/500 [======================>.......] - ETA: 26s - loss: 1.6526 - regression_loss: 1.3709 - classification_loss: 0.2817 396/500 [======================>.......] - ETA: 26s - loss: 1.6521 - regression_loss: 1.3704 - classification_loss: 0.2817 397/500 [======================>.......] - ETA: 26s - loss: 1.6524 - regression_loss: 1.3707 - classification_loss: 0.2817 398/500 [======================>.......] - ETA: 25s - loss: 1.6522 - regression_loss: 1.3707 - classification_loss: 0.2815 399/500 [======================>.......] - ETA: 25s - loss: 1.6515 - regression_loss: 1.3700 - classification_loss: 0.2815 400/500 [=======================>......] - ETA: 25s - loss: 1.6504 - regression_loss: 1.3692 - classification_loss: 0.2812 401/500 [=======================>......] - ETA: 24s - loss: 1.6496 - regression_loss: 1.3684 - classification_loss: 0.2811 402/500 [=======================>......] - ETA: 24s - loss: 1.6506 - regression_loss: 1.3690 - classification_loss: 0.2816 403/500 [=======================>......] - ETA: 24s - loss: 1.6498 - regression_loss: 1.3684 - classification_loss: 0.2814 404/500 [=======================>......] 
- ETA: 24s - loss: 1.6486 - regression_loss: 1.3675 - classification_loss: 0.2811 405/500 [=======================>......] - ETA: 23s - loss: 1.6478 - regression_loss: 1.3670 - classification_loss: 0.2808 406/500 [=======================>......] - ETA: 23s - loss: 1.6480 - regression_loss: 1.3672 - classification_loss: 0.2808 407/500 [=======================>......] - ETA: 23s - loss: 1.6491 - regression_loss: 1.3682 - classification_loss: 0.2810 408/500 [=======================>......] - ETA: 23s - loss: 1.6493 - regression_loss: 1.3682 - classification_loss: 0.2811 409/500 [=======================>......] - ETA: 22s - loss: 1.6495 - regression_loss: 1.3684 - classification_loss: 0.2811 410/500 [=======================>......] - ETA: 22s - loss: 1.6485 - regression_loss: 1.3676 - classification_loss: 0.2809 411/500 [=======================>......] - ETA: 22s - loss: 1.6477 - regression_loss: 1.3670 - classification_loss: 0.2807 412/500 [=======================>......] - ETA: 22s - loss: 1.6487 - regression_loss: 1.3677 - classification_loss: 0.2810 413/500 [=======================>......] - ETA: 21s - loss: 1.6484 - regression_loss: 1.3675 - classification_loss: 0.2809 414/500 [=======================>......] - ETA: 21s - loss: 1.6488 - regression_loss: 1.3679 - classification_loss: 0.2810 415/500 [=======================>......] - ETA: 21s - loss: 1.6473 - regression_loss: 1.3665 - classification_loss: 0.2808 416/500 [=======================>......] - ETA: 21s - loss: 1.6470 - regression_loss: 1.3663 - classification_loss: 0.2807 417/500 [========================>.....] - ETA: 20s - loss: 1.6479 - regression_loss: 1.3671 - classification_loss: 0.2807 418/500 [========================>.....] - ETA: 20s - loss: 1.6477 - regression_loss: 1.3668 - classification_loss: 0.2809 419/500 [========================>.....] - ETA: 20s - loss: 1.6485 - regression_loss: 1.3675 - classification_loss: 0.2810 420/500 [========================>.....] 
- ETA: 20s - loss: 1.6482 - regression_loss: 1.3673 - classification_loss: 0.2809 421/500 [========================>.....] - ETA: 19s - loss: 1.6489 - regression_loss: 1.3679 - classification_loss: 0.2810 422/500 [========================>.....] - ETA: 19s - loss: 1.6492 - regression_loss: 1.3683 - classification_loss: 0.2809 423/500 [========================>.....] - ETA: 19s - loss: 1.6490 - regression_loss: 1.3680 - classification_loss: 0.2810 424/500 [========================>.....] - ETA: 19s - loss: 1.6483 - regression_loss: 1.3675 - classification_loss: 0.2807 425/500 [========================>.....] - ETA: 18s - loss: 1.6488 - regression_loss: 1.3681 - classification_loss: 0.2807 426/500 [========================>.....] - ETA: 18s - loss: 1.6492 - regression_loss: 1.3684 - classification_loss: 0.2808 427/500 [========================>.....] - ETA: 18s - loss: 1.6561 - regression_loss: 1.3707 - classification_loss: 0.2854 428/500 [========================>.....] - ETA: 18s - loss: 1.6596 - regression_loss: 1.3730 - classification_loss: 0.2866 429/500 [========================>.....] - ETA: 17s - loss: 1.6602 - regression_loss: 1.3736 - classification_loss: 0.2866 430/500 [========================>.....] - ETA: 17s - loss: 1.6601 - regression_loss: 1.3736 - classification_loss: 0.2865 431/500 [========================>.....] - ETA: 17s - loss: 1.6610 - regression_loss: 1.3745 - classification_loss: 0.2866 432/500 [========================>.....] - ETA: 17s - loss: 1.6612 - regression_loss: 1.3746 - classification_loss: 0.2866 433/500 [========================>.....] - ETA: 16s - loss: 1.6611 - regression_loss: 1.3745 - classification_loss: 0.2865 434/500 [=========================>....] - ETA: 16s - loss: 1.6617 - regression_loss: 1.3750 - classification_loss: 0.2867 435/500 [=========================>....] - ETA: 16s - loss: 1.6611 - regression_loss: 1.3745 - classification_loss: 0.2866 436/500 [=========================>....] 
- ETA: 16s - loss: 1.6609 - regression_loss: 1.3743 - classification_loss: 0.2866 437/500 [=========================>....] - ETA: 15s - loss: 1.6623 - regression_loss: 1.3755 - classification_loss: 0.2868 438/500 [=========================>....] - ETA: 15s - loss: 1.6617 - regression_loss: 1.3750 - classification_loss: 0.2866 439/500 [=========================>....] - ETA: 15s - loss: 1.6597 - regression_loss: 1.3734 - classification_loss: 0.2864 440/500 [=========================>....] - ETA: 15s - loss: 1.6587 - regression_loss: 1.3724 - classification_loss: 0.2862 441/500 [=========================>....] - ETA: 14s - loss: 1.6600 - regression_loss: 1.3735 - classification_loss: 0.2865 442/500 [=========================>....] - ETA: 14s - loss: 1.6607 - regression_loss: 1.3740 - classification_loss: 0.2867 443/500 [=========================>....] - ETA: 14s - loss: 1.6591 - regression_loss: 1.3726 - classification_loss: 0.2864 444/500 [=========================>....] - ETA: 14s - loss: 1.6596 - regression_loss: 1.3730 - classification_loss: 0.2866 445/500 [=========================>....] - ETA: 13s - loss: 1.6625 - regression_loss: 1.3755 - classification_loss: 0.2870 446/500 [=========================>....] - ETA: 13s - loss: 1.6628 - regression_loss: 1.3758 - classification_loss: 0.2870 447/500 [=========================>....] - ETA: 13s - loss: 1.6630 - regression_loss: 1.3762 - classification_loss: 0.2868 448/500 [=========================>....] - ETA: 13s - loss: 1.6621 - regression_loss: 1.3756 - classification_loss: 0.2865 449/500 [=========================>....] - ETA: 12s - loss: 1.6615 - regression_loss: 1.3752 - classification_loss: 0.2863 450/500 [==========================>...] - ETA: 12s - loss: 1.6621 - regression_loss: 1.3757 - classification_loss: 0.2864 451/500 [==========================>...] - ETA: 12s - loss: 1.6622 - regression_loss: 1.3760 - classification_loss: 0.2862 452/500 [==========================>...] 
- ETA: 12s - loss: 1.6619 - regression_loss: 1.3759 - classification_loss: 0.2861 453/500 [==========================>...] - ETA: 11s - loss: 1.6615 - regression_loss: 1.3757 - classification_loss: 0.2858 454/500 [==========================>...] - ETA: 11s - loss: 1.6617 - regression_loss: 1.3759 - classification_loss: 0.2858 455/500 [==========================>...] - ETA: 11s - loss: 1.6619 - regression_loss: 1.3761 - classification_loss: 0.2858 456/500 [==========================>...] - ETA: 11s - loss: 1.6622 - regression_loss: 1.3765 - classification_loss: 0.2858 457/500 [==========================>...] - ETA: 10s - loss: 1.6621 - regression_loss: 1.3764 - classification_loss: 0.2857 458/500 [==========================>...] - ETA: 10s - loss: 1.6621 - regression_loss: 1.3764 - classification_loss: 0.2857 459/500 [==========================>...] - ETA: 10s - loss: 1.6618 - regression_loss: 1.3765 - classification_loss: 0.2854 460/500 [==========================>...] - ETA: 10s - loss: 1.6618 - regression_loss: 1.3765 - classification_loss: 0.2853 461/500 [==========================>...] - ETA: 9s - loss: 1.6624 - regression_loss: 1.3770 - classification_loss: 0.2854  462/500 [==========================>...] - ETA: 9s - loss: 1.6622 - regression_loss: 1.3769 - classification_loss: 0.2853 463/500 [==========================>...] - ETA: 9s - loss: 1.6624 - regression_loss: 1.3771 - classification_loss: 0.2853 464/500 [==========================>...] - ETA: 9s - loss: 1.6621 - regression_loss: 1.3770 - classification_loss: 0.2851 465/500 [==========================>...] - ETA: 8s - loss: 1.6621 - regression_loss: 1.3769 - classification_loss: 0.2851 466/500 [==========================>...] - ETA: 8s - loss: 1.6602 - regression_loss: 1.3754 - classification_loss: 0.2848 467/500 [===========================>..] - ETA: 8s - loss: 1.6597 - regression_loss: 1.3750 - classification_loss: 0.2847 468/500 [===========================>..] 
- ETA: 8s - loss: 1.6593 - regression_loss: 1.3741 - classification_loss: 0.2853
500/500 [==============================] - 126s 252ms/step - loss: 1.6599 - regression_loss: 1.3741 - classification_loss: 0.2859
1172 instances of class plum with average precision: 0.6429
mAP: 0.6429
Epoch 00078: saving model to ./training/snapshots/resnet50_pascal_78.h5
Epoch 79/150
301/500 [=================>............] - ETA: 49s - loss: 1.6594 - regression_loss: 1.3770 - classification_loss: 0.2823 302/500 [=================>............] 
- ETA: 49s - loss: 1.6599 - regression_loss: 1.3776 - classification_loss: 0.2823 303/500 [=================>............] - ETA: 49s - loss: 1.6584 - regression_loss: 1.3765 - classification_loss: 0.2818 304/500 [=================>............] - ETA: 49s - loss: 1.6563 - regression_loss: 1.3748 - classification_loss: 0.2815 305/500 [=================>............] - ETA: 48s - loss: 1.6580 - regression_loss: 1.3764 - classification_loss: 0.2817 306/500 [=================>............] - ETA: 48s - loss: 1.6573 - regression_loss: 1.3757 - classification_loss: 0.2816 307/500 [=================>............] - ETA: 48s - loss: 1.6585 - regression_loss: 1.3761 - classification_loss: 0.2824 308/500 [=================>............] - ETA: 48s - loss: 1.6590 - regression_loss: 1.3765 - classification_loss: 0.2825 309/500 [=================>............] - ETA: 47s - loss: 1.6580 - regression_loss: 1.3757 - classification_loss: 0.2822 310/500 [=================>............] - ETA: 47s - loss: 1.6577 - regression_loss: 1.3755 - classification_loss: 0.2821 311/500 [=================>............] - ETA: 47s - loss: 1.6585 - regression_loss: 1.3764 - classification_loss: 0.2821 312/500 [=================>............] - ETA: 47s - loss: 1.6557 - regression_loss: 1.3739 - classification_loss: 0.2818 313/500 [=================>............] - ETA: 46s - loss: 1.6556 - regression_loss: 1.3741 - classification_loss: 0.2815 314/500 [=================>............] - ETA: 46s - loss: 1.6530 - regression_loss: 1.3719 - classification_loss: 0.2811 315/500 [=================>............] - ETA: 46s - loss: 1.6537 - regression_loss: 1.3725 - classification_loss: 0.2811 316/500 [=================>............] - ETA: 46s - loss: 1.6516 - regression_loss: 1.3709 - classification_loss: 0.2807 317/500 [==================>...........] - ETA: 45s - loss: 1.6519 - regression_loss: 1.3711 - classification_loss: 0.2808 318/500 [==================>...........] 
- ETA: 45s - loss: 1.6522 - regression_loss: 1.3712 - classification_loss: 0.2810 319/500 [==================>...........] - ETA: 45s - loss: 1.6530 - regression_loss: 1.3719 - classification_loss: 0.2811 320/500 [==================>...........] - ETA: 45s - loss: 1.6521 - regression_loss: 1.3712 - classification_loss: 0.2809 321/500 [==================>...........] - ETA: 44s - loss: 1.6531 - regression_loss: 1.3721 - classification_loss: 0.2811 322/500 [==================>...........] - ETA: 44s - loss: 1.6529 - regression_loss: 1.3720 - classification_loss: 0.2810 323/500 [==================>...........] - ETA: 44s - loss: 1.6545 - regression_loss: 1.3732 - classification_loss: 0.2813 324/500 [==================>...........] - ETA: 44s - loss: 1.6537 - regression_loss: 1.3727 - classification_loss: 0.2810 325/500 [==================>...........] - ETA: 43s - loss: 1.6531 - regression_loss: 1.3723 - classification_loss: 0.2808 326/500 [==================>...........] - ETA: 43s - loss: 1.6530 - regression_loss: 1.3722 - classification_loss: 0.2808 327/500 [==================>...........] - ETA: 43s - loss: 1.6534 - regression_loss: 1.3723 - classification_loss: 0.2811 328/500 [==================>...........] - ETA: 43s - loss: 1.6526 - regression_loss: 1.3716 - classification_loss: 0.2810 329/500 [==================>...........] - ETA: 42s - loss: 1.6523 - regression_loss: 1.3714 - classification_loss: 0.2809 330/500 [==================>...........] - ETA: 42s - loss: 1.6523 - regression_loss: 1.3715 - classification_loss: 0.2808 331/500 [==================>...........] - ETA: 42s - loss: 1.6528 - regression_loss: 1.3719 - classification_loss: 0.2809 332/500 [==================>...........] - ETA: 42s - loss: 1.6518 - regression_loss: 1.3711 - classification_loss: 0.2807 333/500 [==================>...........] - ETA: 41s - loss: 1.6530 - regression_loss: 1.3723 - classification_loss: 0.2807 334/500 [===================>..........] 
- ETA: 41s - loss: 1.6530 - regression_loss: 1.3723 - classification_loss: 0.2806 335/500 [===================>..........] - ETA: 41s - loss: 1.6526 - regression_loss: 1.3720 - classification_loss: 0.2806 336/500 [===================>..........] - ETA: 41s - loss: 1.6544 - regression_loss: 1.3725 - classification_loss: 0.2819 337/500 [===================>..........] - ETA: 40s - loss: 1.6549 - regression_loss: 1.3731 - classification_loss: 0.2819 338/500 [===================>..........] - ETA: 40s - loss: 1.6529 - regression_loss: 1.3715 - classification_loss: 0.2813 339/500 [===================>..........] - ETA: 40s - loss: 1.6529 - regression_loss: 1.3714 - classification_loss: 0.2815 340/500 [===================>..........] - ETA: 40s - loss: 1.6502 - regression_loss: 1.3692 - classification_loss: 0.2811 341/500 [===================>..........] - ETA: 39s - loss: 1.6513 - regression_loss: 1.3701 - classification_loss: 0.2812 342/500 [===================>..........] - ETA: 39s - loss: 1.6517 - regression_loss: 1.3707 - classification_loss: 0.2811 343/500 [===================>..........] - ETA: 39s - loss: 1.6528 - regression_loss: 1.3718 - classification_loss: 0.2810 344/500 [===================>..........] - ETA: 39s - loss: 1.6526 - regression_loss: 1.3716 - classification_loss: 0.2810 345/500 [===================>..........] - ETA: 38s - loss: 1.6534 - regression_loss: 1.3724 - classification_loss: 0.2810 346/500 [===================>..........] - ETA: 38s - loss: 1.6536 - regression_loss: 1.3725 - classification_loss: 0.2810 347/500 [===================>..........] - ETA: 38s - loss: 1.6538 - regression_loss: 1.3729 - classification_loss: 0.2810 348/500 [===================>..........] - ETA: 38s - loss: 1.6552 - regression_loss: 1.3741 - classification_loss: 0.2811 349/500 [===================>..........] - ETA: 37s - loss: 1.6545 - regression_loss: 1.3735 - classification_loss: 0.2810 350/500 [====================>.........] 
- ETA: 37s - loss: 1.6551 - regression_loss: 1.3742 - classification_loss: 0.2809 351/500 [====================>.........] - ETA: 37s - loss: 1.6558 - regression_loss: 1.3750 - classification_loss: 0.2808 352/500 [====================>.........] - ETA: 37s - loss: 1.6563 - regression_loss: 1.3752 - classification_loss: 0.2811 353/500 [====================>.........] - ETA: 36s - loss: 1.6551 - regression_loss: 1.3743 - classification_loss: 0.2808 354/500 [====================>.........] - ETA: 36s - loss: 1.6563 - regression_loss: 1.3755 - classification_loss: 0.2808 355/500 [====================>.........] - ETA: 36s - loss: 1.6544 - regression_loss: 1.3741 - classification_loss: 0.2803 356/500 [====================>.........] - ETA: 36s - loss: 1.6529 - regression_loss: 1.3729 - classification_loss: 0.2800 357/500 [====================>.........] - ETA: 35s - loss: 1.6542 - regression_loss: 1.3739 - classification_loss: 0.2803 358/500 [====================>.........] - ETA: 35s - loss: 1.6542 - regression_loss: 1.3740 - classification_loss: 0.2802 359/500 [====================>.........] - ETA: 35s - loss: 1.6550 - regression_loss: 1.3748 - classification_loss: 0.2803 360/500 [====================>.........] - ETA: 35s - loss: 1.6554 - regression_loss: 1.3751 - classification_loss: 0.2803 361/500 [====================>.........] - ETA: 34s - loss: 1.6544 - regression_loss: 1.3738 - classification_loss: 0.2806 362/500 [====================>.........] - ETA: 34s - loss: 1.6545 - regression_loss: 1.3740 - classification_loss: 0.2805 363/500 [====================>.........] - ETA: 34s - loss: 1.6544 - regression_loss: 1.3739 - classification_loss: 0.2805 364/500 [====================>.........] - ETA: 34s - loss: 1.6537 - regression_loss: 1.3733 - classification_loss: 0.2804 365/500 [====================>.........] - ETA: 33s - loss: 1.6528 - regression_loss: 1.3727 - classification_loss: 0.2801 366/500 [====================>.........] 
- ETA: 33s - loss: 1.6518 - regression_loss: 1.3719 - classification_loss: 0.2798 367/500 [=====================>........] - ETA: 33s - loss: 1.6501 - regression_loss: 1.3708 - classification_loss: 0.2793 368/500 [=====================>........] - ETA: 33s - loss: 1.6504 - regression_loss: 1.3710 - classification_loss: 0.2794 369/500 [=====================>........] - ETA: 32s - loss: 1.6499 - regression_loss: 1.3706 - classification_loss: 0.2793 370/500 [=====================>........] - ETA: 32s - loss: 1.6501 - regression_loss: 1.3708 - classification_loss: 0.2793 371/500 [=====================>........] - ETA: 32s - loss: 1.6500 - regression_loss: 1.3709 - classification_loss: 0.2792 372/500 [=====================>........] - ETA: 32s - loss: 1.6498 - regression_loss: 1.3707 - classification_loss: 0.2792 373/500 [=====================>........] - ETA: 31s - loss: 1.6500 - regression_loss: 1.3709 - classification_loss: 0.2791 374/500 [=====================>........] - ETA: 31s - loss: 1.6519 - regression_loss: 1.3725 - classification_loss: 0.2794 375/500 [=====================>........] - ETA: 31s - loss: 1.6519 - regression_loss: 1.3726 - classification_loss: 0.2793 376/500 [=====================>........] - ETA: 31s - loss: 1.6514 - regression_loss: 1.3724 - classification_loss: 0.2790 377/500 [=====================>........] - ETA: 30s - loss: 1.6496 - regression_loss: 1.3710 - classification_loss: 0.2786 378/500 [=====================>........] - ETA: 30s - loss: 1.6485 - regression_loss: 1.3703 - classification_loss: 0.2782 379/500 [=====================>........] - ETA: 30s - loss: 1.6487 - regression_loss: 1.3706 - classification_loss: 0.2781 380/500 [=====================>........] - ETA: 30s - loss: 1.6493 - regression_loss: 1.3709 - classification_loss: 0.2784 381/500 [=====================>........] - ETA: 29s - loss: 1.6502 - regression_loss: 1.3715 - classification_loss: 0.2787 382/500 [=====================>........] 
- ETA: 29s - loss: 1.6477 - regression_loss: 1.3694 - classification_loss: 0.2783 383/500 [=====================>........] - ETA: 29s - loss: 1.6485 - regression_loss: 1.3701 - classification_loss: 0.2784 384/500 [======================>.......] - ETA: 29s - loss: 1.6481 - regression_loss: 1.3698 - classification_loss: 0.2782 385/500 [======================>.......] - ETA: 28s - loss: 1.6487 - regression_loss: 1.3704 - classification_loss: 0.2783 386/500 [======================>.......] - ETA: 28s - loss: 1.6478 - regression_loss: 1.3697 - classification_loss: 0.2781 387/500 [======================>.......] - ETA: 28s - loss: 1.6477 - regression_loss: 1.3697 - classification_loss: 0.2780 388/500 [======================>.......] - ETA: 28s - loss: 1.6490 - regression_loss: 1.3708 - classification_loss: 0.2782 389/500 [======================>.......] - ETA: 27s - loss: 1.6494 - regression_loss: 1.3712 - classification_loss: 0.2782 390/500 [======================>.......] - ETA: 27s - loss: 1.6491 - regression_loss: 1.3711 - classification_loss: 0.2780 391/500 [======================>.......] - ETA: 27s - loss: 1.6498 - regression_loss: 1.3717 - classification_loss: 0.2782 392/500 [======================>.......] - ETA: 27s - loss: 1.6498 - regression_loss: 1.3715 - classification_loss: 0.2782 393/500 [======================>.......] - ETA: 26s - loss: 1.6513 - regression_loss: 1.3731 - classification_loss: 0.2782 394/500 [======================>.......] - ETA: 26s - loss: 1.6513 - regression_loss: 1.3731 - classification_loss: 0.2782 395/500 [======================>.......] - ETA: 26s - loss: 1.6507 - regression_loss: 1.3727 - classification_loss: 0.2780 396/500 [======================>.......] - ETA: 26s - loss: 1.6492 - regression_loss: 1.3714 - classification_loss: 0.2778 397/500 [======================>.......] - ETA: 25s - loss: 1.6496 - regression_loss: 1.3719 - classification_loss: 0.2777 398/500 [======================>.......] 
- ETA: 25s - loss: 1.6472 - regression_loss: 1.3700 - classification_loss: 0.2773 399/500 [======================>.......] - ETA: 25s - loss: 1.6473 - regression_loss: 1.3701 - classification_loss: 0.2772 400/500 [=======================>......] - ETA: 25s - loss: 1.6477 - regression_loss: 1.3703 - classification_loss: 0.2774 401/500 [=======================>......] - ETA: 24s - loss: 1.6476 - regression_loss: 1.3706 - classification_loss: 0.2771 402/500 [=======================>......] - ETA: 24s - loss: 1.6475 - regression_loss: 1.3705 - classification_loss: 0.2770 403/500 [=======================>......] - ETA: 24s - loss: 1.6507 - regression_loss: 1.3727 - classification_loss: 0.2781 404/500 [=======================>......] - ETA: 24s - loss: 1.6510 - regression_loss: 1.3727 - classification_loss: 0.2784 405/500 [=======================>......] - ETA: 23s - loss: 1.6516 - regression_loss: 1.3732 - classification_loss: 0.2784 406/500 [=======================>......] - ETA: 23s - loss: 1.6528 - regression_loss: 1.3744 - classification_loss: 0.2784 407/500 [=======================>......] - ETA: 23s - loss: 1.6525 - regression_loss: 1.3742 - classification_loss: 0.2783 408/500 [=======================>......] - ETA: 22s - loss: 1.6522 - regression_loss: 1.3741 - classification_loss: 0.2781 409/500 [=======================>......] - ETA: 22s - loss: 1.6532 - regression_loss: 1.3749 - classification_loss: 0.2783 410/500 [=======================>......] - ETA: 22s - loss: 1.6545 - regression_loss: 1.3759 - classification_loss: 0.2785 411/500 [=======================>......] - ETA: 22s - loss: 1.6553 - regression_loss: 1.3767 - classification_loss: 0.2786 412/500 [=======================>......] - ETA: 21s - loss: 1.6549 - regression_loss: 1.3764 - classification_loss: 0.2786 413/500 [=======================>......] - ETA: 21s - loss: 1.6549 - regression_loss: 1.3764 - classification_loss: 0.2785 414/500 [=======================>......] 
- ETA: 21s - loss: 1.6541 - regression_loss: 1.3757 - classification_loss: 0.2784 415/500 [=======================>......] - ETA: 21s - loss: 1.6539 - regression_loss: 1.3756 - classification_loss: 0.2783 416/500 [=======================>......] - ETA: 20s - loss: 1.6546 - regression_loss: 1.3761 - classification_loss: 0.2784 417/500 [========================>.....] - ETA: 20s - loss: 1.6553 - regression_loss: 1.3767 - classification_loss: 0.2786 418/500 [========================>.....] - ETA: 20s - loss: 1.6570 - regression_loss: 1.3781 - classification_loss: 0.2789 419/500 [========================>.....] - ETA: 20s - loss: 1.6565 - regression_loss: 1.3777 - classification_loss: 0.2788 420/500 [========================>.....] - ETA: 19s - loss: 1.6557 - regression_loss: 1.3772 - classification_loss: 0.2785 421/500 [========================>.....] - ETA: 19s - loss: 1.6548 - regression_loss: 1.3764 - classification_loss: 0.2784 422/500 [========================>.....] - ETA: 19s - loss: 1.6541 - regression_loss: 1.3759 - classification_loss: 0.2782 423/500 [========================>.....] - ETA: 19s - loss: 1.6548 - regression_loss: 1.3764 - classification_loss: 0.2784 424/500 [========================>.....] - ETA: 18s - loss: 1.6548 - regression_loss: 1.3764 - classification_loss: 0.2784 425/500 [========================>.....] - ETA: 18s - loss: 1.6553 - regression_loss: 1.3768 - classification_loss: 0.2784 426/500 [========================>.....] - ETA: 18s - loss: 1.6544 - regression_loss: 1.3762 - classification_loss: 0.2782 427/500 [========================>.....] - ETA: 18s - loss: 1.6528 - regression_loss: 1.3750 - classification_loss: 0.2778 428/500 [========================>.....] - ETA: 17s - loss: 1.6524 - regression_loss: 1.3747 - classification_loss: 0.2777 429/500 [========================>.....] - ETA: 17s - loss: 1.6534 - regression_loss: 1.3755 - classification_loss: 0.2779 430/500 [========================>.....] 
- ETA: 17s - loss: 1.6533 - regression_loss: 1.3754 - classification_loss: 0.2779 431/500 [========================>.....] - ETA: 17s - loss: 1.6541 - regression_loss: 1.3763 - classification_loss: 0.2779 432/500 [========================>.....] - ETA: 16s - loss: 1.6556 - regression_loss: 1.3773 - classification_loss: 0.2784 433/500 [========================>.....] - ETA: 16s - loss: 1.6553 - regression_loss: 1.3770 - classification_loss: 0.2783 434/500 [=========================>....] - ETA: 16s - loss: 1.6539 - regression_loss: 1.3758 - classification_loss: 0.2781 435/500 [=========================>....] - ETA: 16s - loss: 1.6545 - regression_loss: 1.3763 - classification_loss: 0.2783 436/500 [=========================>....] - ETA: 15s - loss: 1.6531 - regression_loss: 1.3752 - classification_loss: 0.2779 437/500 [=========================>....] - ETA: 15s - loss: 1.6545 - regression_loss: 1.3765 - classification_loss: 0.2780 438/500 [=========================>....] - ETA: 15s - loss: 1.6520 - regression_loss: 1.3744 - classification_loss: 0.2775 439/500 [=========================>....] - ETA: 15s - loss: 1.6508 - regression_loss: 1.3734 - classification_loss: 0.2774 440/500 [=========================>....] - ETA: 14s - loss: 1.6508 - regression_loss: 1.3733 - classification_loss: 0.2775 441/500 [=========================>....] - ETA: 14s - loss: 1.6512 - regression_loss: 1.3736 - classification_loss: 0.2776 442/500 [=========================>....] - ETA: 14s - loss: 1.6508 - regression_loss: 1.3733 - classification_loss: 0.2775 443/500 [=========================>....] - ETA: 14s - loss: 1.6503 - regression_loss: 1.3730 - classification_loss: 0.2773 444/500 [=========================>....] - ETA: 13s - loss: 1.6510 - regression_loss: 1.3736 - classification_loss: 0.2774 445/500 [=========================>....] - ETA: 13s - loss: 1.6508 - regression_loss: 1.3734 - classification_loss: 0.2773 446/500 [=========================>....] 
- ETA: 13s - loss: 1.6517 - regression_loss: 1.3743 - classification_loss: 0.2774 447/500 [=========================>....] - ETA: 13s - loss: 1.6519 - regression_loss: 1.3744 - classification_loss: 0.2774 448/500 [=========================>....] - ETA: 12s - loss: 1.6518 - regression_loss: 1.3743 - classification_loss: 0.2775 449/500 [=========================>....] - ETA: 12s - loss: 1.6549 - regression_loss: 1.3768 - classification_loss: 0.2781 450/500 [==========================>...] - ETA: 12s - loss: 1.6560 - regression_loss: 1.3778 - classification_loss: 0.2782 451/500 [==========================>...] - ETA: 12s - loss: 1.6565 - regression_loss: 1.3781 - classification_loss: 0.2784 452/500 [==========================>...] - ETA: 11s - loss: 1.6570 - regression_loss: 1.3786 - classification_loss: 0.2784 453/500 [==========================>...] - ETA: 11s - loss: 1.6582 - regression_loss: 1.3798 - classification_loss: 0.2784 454/500 [==========================>...] - ETA: 11s - loss: 1.6576 - regression_loss: 1.3794 - classification_loss: 0.2782 455/500 [==========================>...] - ETA: 11s - loss: 1.6570 - regression_loss: 1.3787 - classification_loss: 0.2782 456/500 [==========================>...] - ETA: 10s - loss: 1.6568 - regression_loss: 1.3785 - classification_loss: 0.2783 457/500 [==========================>...] - ETA: 10s - loss: 1.6584 - regression_loss: 1.3798 - classification_loss: 0.2786 458/500 [==========================>...] - ETA: 10s - loss: 1.6582 - regression_loss: 1.3798 - classification_loss: 0.2785 459/500 [==========================>...] - ETA: 10s - loss: 1.6577 - regression_loss: 1.3796 - classification_loss: 0.2781 460/500 [==========================>...] - ETA: 9s - loss: 1.6553 - regression_loss: 1.3774 - classification_loss: 0.2778  461/500 [==========================>...] - ETA: 9s - loss: 1.6561 - regression_loss: 1.3782 - classification_loss: 0.2779 462/500 [==========================>...] 
- ETA: 9s - loss: 1.6549 - regression_loss: 1.3772 - classification_loss: 0.2777 463/500 [==========================>...] - ETA: 9s - loss: 1.6562 - regression_loss: 1.3780 - classification_loss: 0.2782 464/500 [==========================>...] - ETA: 8s - loss: 1.6565 - regression_loss: 1.3782 - classification_loss: 0.2783 465/500 [==========================>...] - ETA: 8s - loss: 1.6554 - regression_loss: 1.3774 - classification_loss: 0.2780 466/500 [==========================>...] - ETA: 8s - loss: 1.6554 - regression_loss: 1.3773 - classification_loss: 0.2781 467/500 [===========================>..] - ETA: 8s - loss: 1.6551 - regression_loss: 1.3772 - classification_loss: 0.2779 468/500 [===========================>..] - ETA: 7s - loss: 1.6554 - regression_loss: 1.3774 - classification_loss: 0.2779 469/500 [===========================>..] - ETA: 7s - loss: 1.6570 - regression_loss: 1.3784 - classification_loss: 0.2786 470/500 [===========================>..] - ETA: 7s - loss: 1.6573 - regression_loss: 1.3787 - classification_loss: 0.2786 471/500 [===========================>..] - ETA: 7s - loss: 1.6570 - regression_loss: 1.3786 - classification_loss: 0.2785 472/500 [===========================>..] - ETA: 6s - loss: 1.6576 - regression_loss: 1.3790 - classification_loss: 0.2787 473/500 [===========================>..] - ETA: 6s - loss: 1.6581 - regression_loss: 1.3793 - classification_loss: 0.2788 474/500 [===========================>..] - ETA: 6s - loss: 1.6583 - regression_loss: 1.3796 - classification_loss: 0.2787 475/500 [===========================>..] - ETA: 6s - loss: 1.6585 - regression_loss: 1.3796 - classification_loss: 0.2789 476/500 [===========================>..] - ETA: 5s - loss: 1.6599 - regression_loss: 1.3805 - classification_loss: 0.2794 477/500 [===========================>..] - ETA: 5s - loss: 1.6605 - regression_loss: 1.3811 - classification_loss: 0.2794 478/500 [===========================>..] 
- ETA: 5s - loss: 1.6607 - regression_loss: 1.3813 - classification_loss: 0.2794 479/500 [===========================>..] - ETA: 5s - loss: 1.6591 - regression_loss: 1.3799 - classification_loss: 0.2792 480/500 [===========================>..] - ETA: 4s - loss: 1.6593 - regression_loss: 1.3803 - classification_loss: 0.2790 481/500 [===========================>..] - ETA: 4s - loss: 1.6596 - regression_loss: 1.3806 - classification_loss: 0.2790 482/500 [===========================>..] - ETA: 4s - loss: 1.6588 - regression_loss: 1.3799 - classification_loss: 0.2789 483/500 [===========================>..] - ETA: 4s - loss: 1.6587 - regression_loss: 1.3800 - classification_loss: 0.2788 484/500 [============================>.] - ETA: 3s - loss: 1.6592 - regression_loss: 1.3804 - classification_loss: 0.2788 485/500 [============================>.] - ETA: 3s - loss: 1.6596 - regression_loss: 1.3808 - classification_loss: 0.2788 486/500 [============================>.] - ETA: 3s - loss: 1.6596 - regression_loss: 1.3809 - classification_loss: 0.2788 487/500 [============================>.] - ETA: 3s - loss: 1.6599 - regression_loss: 1.3812 - classification_loss: 0.2787 488/500 [============================>.] - ETA: 2s - loss: 1.6605 - regression_loss: 1.3817 - classification_loss: 0.2788 489/500 [============================>.] - ETA: 2s - loss: 1.6614 - regression_loss: 1.3823 - classification_loss: 0.2791 490/500 [============================>.] - ETA: 2s - loss: 1.6628 - regression_loss: 1.3834 - classification_loss: 0.2794 491/500 [============================>.] - ETA: 2s - loss: 1.6622 - regression_loss: 1.3830 - classification_loss: 0.2792 492/500 [============================>.] - ETA: 1s - loss: 1.6620 - regression_loss: 1.3829 - classification_loss: 0.2791 493/500 [============================>.] - ETA: 1s - loss: 1.6622 - regression_loss: 1.3831 - classification_loss: 0.2791 494/500 [============================>.] 
- ETA: 1s - loss: 1.6613 - regression_loss: 1.3824 - classification_loss: 0.2789 495/500 [============================>.] - ETA: 1s - loss: 1.6611 - regression_loss: 1.3822 - classification_loss: 0.2789 496/500 [============================>.] - ETA: 0s - loss: 1.6602 - regression_loss: 1.3813 - classification_loss: 0.2789 497/500 [============================>.] - ETA: 0s - loss: 1.6600 - regression_loss: 1.3812 - classification_loss: 0.2788 498/500 [============================>.] - ETA: 0s - loss: 1.6597 - regression_loss: 1.3810 - classification_loss: 0.2788 499/500 [============================>.] - ETA: 0s - loss: 1.6608 - regression_loss: 1.3818 - classification_loss: 0.2790 500/500 [==============================] - 125s 249ms/step - loss: 1.6606 - regression_loss: 1.3817 - classification_loss: 0.2789 1172 instances of class plum with average precision: 0.6395 mAP: 0.6395 Epoch 00079: saving model to ./training/snapshots/resnet50_pascal_79.h5 Epoch 80/150 1/500 [..............................] - ETA: 2:01 - loss: 0.7468 - regression_loss: 0.5959 - classification_loss: 0.1509 2/500 [..............................] - ETA: 2:02 - loss: 1.1387 - regression_loss: 0.9327 - classification_loss: 0.2060 3/500 [..............................] - ETA: 1:58 - loss: 1.3510 - regression_loss: 1.0962 - classification_loss: 0.2548 4/500 [..............................] - ETA: 1:55 - loss: 1.6117 - regression_loss: 1.3024 - classification_loss: 0.3093 5/500 [..............................] - ETA: 1:53 - loss: 1.6016 - regression_loss: 1.3148 - classification_loss: 0.2868 6/500 [..............................] - ETA: 1:55 - loss: 1.5051 - regression_loss: 1.2423 - classification_loss: 0.2629 7/500 [..............................] - ETA: 1:56 - loss: 1.5153 - regression_loss: 1.2449 - classification_loss: 0.2704 8/500 [..............................] - ETA: 1:57 - loss: 1.5217 - regression_loss: 1.2566 - classification_loss: 0.2651 9/500 [..............................] 
[progress output condensed: epoch 80, batches 9/500 → 72/500, ETA ~1:56 → 1:45; running loss ~1.55–1.70 (regression_loss ~1.30–1.42, classification_loss ~0.26–0.30)]
- ETA: 1:44 - loss: 1.6600 - regression_loss: 1.3774 - classification_loss: 0.2826 74/500 [===>..........................] - ETA: 1:44 - loss: 1.6718 - regression_loss: 1.3858 - classification_loss: 0.2860 75/500 [===>..........................] - ETA: 1:44 - loss: 1.6655 - regression_loss: 1.3807 - classification_loss: 0.2848 76/500 [===>..........................] - ETA: 1:44 - loss: 1.6651 - regression_loss: 1.3804 - classification_loss: 0.2847 77/500 [===>..........................] - ETA: 1:43 - loss: 1.6646 - regression_loss: 1.3795 - classification_loss: 0.2851 78/500 [===>..........................] - ETA: 1:43 - loss: 1.6635 - regression_loss: 1.3786 - classification_loss: 0.2849 79/500 [===>..........................] - ETA: 1:43 - loss: 1.6650 - regression_loss: 1.3803 - classification_loss: 0.2848 80/500 [===>..........................] - ETA: 1:43 - loss: 1.6668 - regression_loss: 1.3811 - classification_loss: 0.2857 81/500 [===>..........................] - ETA: 1:43 - loss: 1.6711 - regression_loss: 1.3847 - classification_loss: 0.2864 82/500 [===>..........................] - ETA: 1:42 - loss: 1.6709 - regression_loss: 1.3850 - classification_loss: 0.2859 83/500 [===>..........................] - ETA: 1:42 - loss: 1.6796 - regression_loss: 1.3907 - classification_loss: 0.2888 84/500 [====>.........................] - ETA: 1:42 - loss: 1.6717 - regression_loss: 1.3832 - classification_loss: 0.2885 85/500 [====>.........................] - ETA: 1:42 - loss: 1.6679 - regression_loss: 1.3793 - classification_loss: 0.2886 86/500 [====>.........................] - ETA: 1:42 - loss: 1.6778 - regression_loss: 1.3861 - classification_loss: 0.2917 87/500 [====>.........................] - ETA: 1:41 - loss: 1.6730 - regression_loss: 1.3825 - classification_loss: 0.2905 88/500 [====>.........................] - ETA: 1:41 - loss: 1.6715 - regression_loss: 1.3818 - classification_loss: 0.2897 89/500 [====>.........................] 
- ETA: 1:41 - loss: 1.6688 - regression_loss: 1.3790 - classification_loss: 0.2898 90/500 [====>.........................] - ETA: 1:41 - loss: 1.6594 - regression_loss: 1.3709 - classification_loss: 0.2885 91/500 [====>.........................] - ETA: 1:40 - loss: 1.6518 - regression_loss: 1.3648 - classification_loss: 0.2871 92/500 [====>.........................] - ETA: 1:40 - loss: 1.6600 - regression_loss: 1.3701 - classification_loss: 0.2899 93/500 [====>.........................] - ETA: 1:40 - loss: 1.6647 - regression_loss: 1.3733 - classification_loss: 0.2914 94/500 [====>.........................] - ETA: 1:40 - loss: 1.6723 - regression_loss: 1.3777 - classification_loss: 0.2946 95/500 [====>.........................] - ETA: 1:39 - loss: 1.6739 - regression_loss: 1.3790 - classification_loss: 0.2949 96/500 [====>.........................] - ETA: 1:39 - loss: 1.6710 - regression_loss: 1.3771 - classification_loss: 0.2939 97/500 [====>.........................] - ETA: 1:39 - loss: 1.6687 - regression_loss: 1.3751 - classification_loss: 0.2936 98/500 [====>.........................] - ETA: 1:39 - loss: 1.6659 - regression_loss: 1.3732 - classification_loss: 0.2927 99/500 [====>.........................] - ETA: 1:39 - loss: 1.6693 - regression_loss: 1.3763 - classification_loss: 0.2930 100/500 [=====>........................] - ETA: 1:38 - loss: 1.6692 - regression_loss: 1.3772 - classification_loss: 0.2921 101/500 [=====>........................] - ETA: 1:38 - loss: 1.6722 - regression_loss: 1.3802 - classification_loss: 0.2920 102/500 [=====>........................] - ETA: 1:38 - loss: 1.6764 - regression_loss: 1.3838 - classification_loss: 0.2926 103/500 [=====>........................] - ETA: 1:37 - loss: 1.6729 - regression_loss: 1.3804 - classification_loss: 0.2925 104/500 [=====>........................] - ETA: 1:37 - loss: 1.6815 - regression_loss: 1.3880 - classification_loss: 0.2935 105/500 [=====>........................] 
- ETA: 1:37 - loss: 1.6782 - regression_loss: 1.3846 - classification_loss: 0.2936 106/500 [=====>........................] - ETA: 1:36 - loss: 1.6792 - regression_loss: 1.3858 - classification_loss: 0.2934 107/500 [=====>........................] - ETA: 1:36 - loss: 1.6802 - regression_loss: 1.3870 - classification_loss: 0.2933 108/500 [=====>........................] - ETA: 1:36 - loss: 1.6766 - regression_loss: 1.3847 - classification_loss: 0.2919 109/500 [=====>........................] - ETA: 1:36 - loss: 1.6778 - regression_loss: 1.3860 - classification_loss: 0.2918 110/500 [=====>........................] - ETA: 1:36 - loss: 1.6682 - regression_loss: 1.3782 - classification_loss: 0.2899 111/500 [=====>........................] - ETA: 1:35 - loss: 1.6716 - regression_loss: 1.3814 - classification_loss: 0.2902 112/500 [=====>........................] - ETA: 1:35 - loss: 1.6664 - regression_loss: 1.3774 - classification_loss: 0.2890 113/500 [=====>........................] - ETA: 1:35 - loss: 1.6604 - regression_loss: 1.3728 - classification_loss: 0.2876 114/500 [=====>........................] - ETA: 1:35 - loss: 1.6631 - regression_loss: 1.3752 - classification_loss: 0.2878 115/500 [=====>........................] - ETA: 1:34 - loss: 1.6568 - regression_loss: 1.3702 - classification_loss: 0.2866 116/500 [=====>........................] - ETA: 1:34 - loss: 1.6559 - regression_loss: 1.3692 - classification_loss: 0.2868 117/500 [======>.......................] - ETA: 1:34 - loss: 1.6573 - regression_loss: 1.3704 - classification_loss: 0.2869 118/500 [======>.......................] - ETA: 1:34 - loss: 1.6600 - regression_loss: 1.3729 - classification_loss: 0.2871 119/500 [======>.......................] - ETA: 1:34 - loss: 1.6625 - regression_loss: 1.3740 - classification_loss: 0.2885 120/500 [======>.......................] - ETA: 1:33 - loss: 1.6619 - regression_loss: 1.3725 - classification_loss: 0.2894 121/500 [======>.......................] 
- ETA: 1:33 - loss: 1.6638 - regression_loss: 1.3730 - classification_loss: 0.2908 122/500 [======>.......................] - ETA: 1:33 - loss: 1.6661 - regression_loss: 1.3757 - classification_loss: 0.2904 123/500 [======>.......................] - ETA: 1:33 - loss: 1.6601 - regression_loss: 1.3715 - classification_loss: 0.2886 124/500 [======>.......................] - ETA: 1:32 - loss: 1.6542 - regression_loss: 1.3664 - classification_loss: 0.2878 125/500 [======>.......................] - ETA: 1:32 - loss: 1.6564 - regression_loss: 1.3680 - classification_loss: 0.2884 126/500 [======>.......................] - ETA: 1:32 - loss: 1.6559 - regression_loss: 1.3676 - classification_loss: 0.2883 127/500 [======>.......................] - ETA: 1:32 - loss: 1.6560 - regression_loss: 1.3678 - classification_loss: 0.2882 128/500 [======>.......................] - ETA: 1:31 - loss: 1.6498 - regression_loss: 1.3631 - classification_loss: 0.2867 129/500 [======>.......................] - ETA: 1:31 - loss: 1.6476 - regression_loss: 1.3616 - classification_loss: 0.2860 130/500 [======>.......................] - ETA: 1:31 - loss: 1.6434 - regression_loss: 1.3583 - classification_loss: 0.2851 131/500 [======>.......................] - ETA: 1:31 - loss: 1.6446 - regression_loss: 1.3600 - classification_loss: 0.2845 132/500 [======>.......................] - ETA: 1:31 - loss: 1.6441 - regression_loss: 1.3600 - classification_loss: 0.2841 133/500 [======>.......................] - ETA: 1:30 - loss: 1.6390 - regression_loss: 1.3556 - classification_loss: 0.2834 134/500 [=======>......................] - ETA: 1:30 - loss: 1.6391 - regression_loss: 1.3554 - classification_loss: 0.2837 135/500 [=======>......................] - ETA: 1:30 - loss: 1.6376 - regression_loss: 1.3534 - classification_loss: 0.2841 136/500 [=======>......................] - ETA: 1:30 - loss: 1.6448 - regression_loss: 1.3580 - classification_loss: 0.2869 137/500 [=======>......................] 
- ETA: 1:29 - loss: 1.6459 - regression_loss: 1.3595 - classification_loss: 0.2864 138/500 [=======>......................] - ETA: 1:29 - loss: 1.6410 - regression_loss: 1.3556 - classification_loss: 0.2854 139/500 [=======>......................] - ETA: 1:29 - loss: 1.6441 - regression_loss: 1.3581 - classification_loss: 0.2860 140/500 [=======>......................] - ETA: 1:29 - loss: 1.6468 - regression_loss: 1.3605 - classification_loss: 0.2863 141/500 [=======>......................] - ETA: 1:28 - loss: 1.6441 - regression_loss: 1.3587 - classification_loss: 0.2855 142/500 [=======>......................] - ETA: 1:28 - loss: 1.6405 - regression_loss: 1.3561 - classification_loss: 0.2844 143/500 [=======>......................] - ETA: 1:28 - loss: 1.6396 - regression_loss: 1.3551 - classification_loss: 0.2845 144/500 [=======>......................] - ETA: 1:28 - loss: 1.6417 - regression_loss: 1.3561 - classification_loss: 0.2856 145/500 [=======>......................] - ETA: 1:27 - loss: 1.6490 - regression_loss: 1.3613 - classification_loss: 0.2877 146/500 [=======>......................] - ETA: 1:27 - loss: 1.6482 - regression_loss: 1.3609 - classification_loss: 0.2873 147/500 [=======>......................] - ETA: 1:27 - loss: 1.6527 - regression_loss: 1.3646 - classification_loss: 0.2881 148/500 [=======>......................] - ETA: 1:27 - loss: 1.6534 - regression_loss: 1.3661 - classification_loss: 0.2873 149/500 [=======>......................] - ETA: 1:26 - loss: 1.6505 - regression_loss: 1.3637 - classification_loss: 0.2867 150/500 [========>.....................] - ETA: 1:26 - loss: 1.6494 - regression_loss: 1.3632 - classification_loss: 0.2863 151/500 [========>.....................] - ETA: 1:26 - loss: 1.6543 - regression_loss: 1.3674 - classification_loss: 0.2869 152/500 [========>.....................] - ETA: 1:26 - loss: 1.6530 - regression_loss: 1.3665 - classification_loss: 0.2865 153/500 [========>.....................] 
- ETA: 1:25 - loss: 1.6546 - regression_loss: 1.3679 - classification_loss: 0.2867 154/500 [========>.....................] - ETA: 1:25 - loss: 1.6561 - regression_loss: 1.3695 - classification_loss: 0.2866 155/500 [========>.....................] - ETA: 1:25 - loss: 1.6588 - regression_loss: 1.3719 - classification_loss: 0.2870 156/500 [========>.....................] - ETA: 1:25 - loss: 1.6619 - regression_loss: 1.3744 - classification_loss: 0.2875 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6632 - regression_loss: 1.3757 - classification_loss: 0.2875 158/500 [========>.....................] - ETA: 1:24 - loss: 1.6632 - regression_loss: 1.3759 - classification_loss: 0.2873 159/500 [========>.....................] - ETA: 1:24 - loss: 1.6653 - regression_loss: 1.3778 - classification_loss: 0.2875 160/500 [========>.....................] - ETA: 1:24 - loss: 1.6668 - regression_loss: 1.3790 - classification_loss: 0.2878 161/500 [========>.....................] - ETA: 1:24 - loss: 1.6635 - regression_loss: 1.3763 - classification_loss: 0.2873 162/500 [========>.....................] - ETA: 1:23 - loss: 1.6640 - regression_loss: 1.3762 - classification_loss: 0.2877 163/500 [========>.....................] - ETA: 1:23 - loss: 1.6621 - regression_loss: 1.3752 - classification_loss: 0.2869 164/500 [========>.....................] - ETA: 1:23 - loss: 1.6612 - regression_loss: 1.3745 - classification_loss: 0.2866 165/500 [========>.....................] - ETA: 1:23 - loss: 1.6667 - regression_loss: 1.3787 - classification_loss: 0.2880 166/500 [========>.....................] - ETA: 1:22 - loss: 1.6663 - regression_loss: 1.3786 - classification_loss: 0.2877 167/500 [=========>....................] - ETA: 1:22 - loss: 1.6641 - regression_loss: 1.3765 - classification_loss: 0.2876 168/500 [=========>....................] - ETA: 1:22 - loss: 1.6648 - regression_loss: 1.3775 - classification_loss: 0.2873 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.6632 - regression_loss: 1.3758 - classification_loss: 0.2874 170/500 [=========>....................] - ETA: 1:21 - loss: 1.6649 - regression_loss: 1.3769 - classification_loss: 0.2881 171/500 [=========>....................] - ETA: 1:21 - loss: 1.6659 - regression_loss: 1.3772 - classification_loss: 0.2887 172/500 [=========>....................] - ETA: 1:21 - loss: 1.6673 - regression_loss: 1.3785 - classification_loss: 0.2888 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6624 - regression_loss: 1.3748 - classification_loss: 0.2876 174/500 [=========>....................] - ETA: 1:20 - loss: 1.6589 - regression_loss: 1.3722 - classification_loss: 0.2867 175/500 [=========>....................] - ETA: 1:20 - loss: 1.6574 - regression_loss: 1.3713 - classification_loss: 0.2861 176/500 [=========>....................] - ETA: 1:20 - loss: 1.6523 - regression_loss: 1.3672 - classification_loss: 0.2851 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6523 - regression_loss: 1.3671 - classification_loss: 0.2852 178/500 [=========>....................] - ETA: 1:19 - loss: 1.6516 - regression_loss: 1.3666 - classification_loss: 0.2850 179/500 [=========>....................] - ETA: 1:19 - loss: 1.6485 - regression_loss: 1.3643 - classification_loss: 0.2842 180/500 [=========>....................] - ETA: 1:19 - loss: 1.6481 - regression_loss: 1.3637 - classification_loss: 0.2843 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6465 - regression_loss: 1.3628 - classification_loss: 0.2837 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6466 - regression_loss: 1.3632 - classification_loss: 0.2834 183/500 [=========>....................] - ETA: 1:18 - loss: 1.6457 - regression_loss: 1.3624 - classification_loss: 0.2834 184/500 [==========>...................] - ETA: 1:18 - loss: 1.6473 - regression_loss: 1.3639 - classification_loss: 0.2834 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.6507 - regression_loss: 1.3666 - classification_loss: 0.2841 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6472 - regression_loss: 1.3635 - classification_loss: 0.2837 187/500 [==========>...................] - ETA: 1:17 - loss: 1.6480 - regression_loss: 1.3642 - classification_loss: 0.2838 188/500 [==========>...................] - ETA: 1:17 - loss: 1.6480 - regression_loss: 1.3645 - classification_loss: 0.2836 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6475 - regression_loss: 1.3643 - classification_loss: 0.2833 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6485 - regression_loss: 1.3650 - classification_loss: 0.2834 191/500 [==========>...................] - ETA: 1:16 - loss: 1.6519 - regression_loss: 1.3675 - classification_loss: 0.2844 192/500 [==========>...................] - ETA: 1:16 - loss: 1.6528 - regression_loss: 1.3686 - classification_loss: 0.2842 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6518 - regression_loss: 1.3677 - classification_loss: 0.2841 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6529 - regression_loss: 1.3685 - classification_loss: 0.2843 195/500 [==========>...................] - ETA: 1:15 - loss: 1.6545 - regression_loss: 1.3702 - classification_loss: 0.2843 196/500 [==========>...................] - ETA: 1:15 - loss: 1.6551 - regression_loss: 1.3702 - classification_loss: 0.2849 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6534 - regression_loss: 1.3688 - classification_loss: 0.2845 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6522 - regression_loss: 1.3682 - classification_loss: 0.2839 199/500 [==========>...................] - ETA: 1:14 - loss: 1.6519 - regression_loss: 1.3679 - classification_loss: 0.2841 200/500 [===========>..................] - ETA: 1:14 - loss: 1.6534 - regression_loss: 1.3690 - classification_loss: 0.2845 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.6501 - regression_loss: 1.3665 - classification_loss: 0.2836 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6477 - regression_loss: 1.3647 - classification_loss: 0.2830 203/500 [===========>..................] - ETA: 1:13 - loss: 1.6498 - regression_loss: 1.3663 - classification_loss: 0.2835 204/500 [===========>..................] - ETA: 1:13 - loss: 1.6506 - regression_loss: 1.3670 - classification_loss: 0.2836 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6504 - regression_loss: 1.3669 - classification_loss: 0.2835 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6555 - regression_loss: 1.3715 - classification_loss: 0.2840 207/500 [===========>..................] - ETA: 1:12 - loss: 1.6571 - regression_loss: 1.3726 - classification_loss: 0.2845 208/500 [===========>..................] - ETA: 1:12 - loss: 1.6582 - regression_loss: 1.3720 - classification_loss: 0.2861 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6600 - regression_loss: 1.3737 - classification_loss: 0.2864 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6602 - regression_loss: 1.3736 - classification_loss: 0.2867 211/500 [===========>..................] - ETA: 1:11 - loss: 1.6606 - regression_loss: 1.3729 - classification_loss: 0.2877 212/500 [===========>..................] - ETA: 1:11 - loss: 1.6617 - regression_loss: 1.3739 - classification_loss: 0.2878 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6604 - regression_loss: 1.3730 - classification_loss: 0.2874 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6588 - regression_loss: 1.3716 - classification_loss: 0.2872 215/500 [===========>..................] - ETA: 1:10 - loss: 1.6607 - regression_loss: 1.3732 - classification_loss: 0.2875 216/500 [===========>..................] - ETA: 1:10 - loss: 1.6610 - regression_loss: 1.3735 - classification_loss: 0.2876 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.6607 - regression_loss: 1.3734 - classification_loss: 0.2872 218/500 [============>.................] - ETA: 1:10 - loss: 1.6559 - regression_loss: 1.3695 - classification_loss: 0.2863 219/500 [============>.................] - ETA: 1:09 - loss: 1.6587 - regression_loss: 1.3717 - classification_loss: 0.2870 220/500 [============>.................] - ETA: 1:09 - loss: 1.6551 - regression_loss: 1.3686 - classification_loss: 0.2865 221/500 [============>.................] - ETA: 1:09 - loss: 1.6566 - regression_loss: 1.3697 - classification_loss: 0.2869 222/500 [============>.................] - ETA: 1:09 - loss: 1.6563 - regression_loss: 1.3696 - classification_loss: 0.2867 223/500 [============>.................] - ETA: 1:09 - loss: 1.6596 - regression_loss: 1.3722 - classification_loss: 0.2875 224/500 [============>.................] - ETA: 1:08 - loss: 1.6582 - regression_loss: 1.3707 - classification_loss: 0.2874 225/500 [============>.................] - ETA: 1:08 - loss: 1.6576 - regression_loss: 1.3703 - classification_loss: 0.2873 226/500 [============>.................] - ETA: 1:08 - loss: 1.6606 - regression_loss: 1.3725 - classification_loss: 0.2881 227/500 [============>.................] - ETA: 1:08 - loss: 1.6649 - regression_loss: 1.3732 - classification_loss: 0.2917 228/500 [============>.................] - ETA: 1:07 - loss: 1.6620 - regression_loss: 1.3708 - classification_loss: 0.2912 229/500 [============>.................] - ETA: 1:07 - loss: 1.6612 - regression_loss: 1.3705 - classification_loss: 0.2907 230/500 [============>.................] - ETA: 1:07 - loss: 1.6623 - regression_loss: 1.3715 - classification_loss: 0.2908 231/500 [============>.................] - ETA: 1:07 - loss: 1.6640 - regression_loss: 1.3732 - classification_loss: 0.2908 232/500 [============>.................] - ETA: 1:06 - loss: 1.6641 - regression_loss: 1.3734 - classification_loss: 0.2908 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.6641 - regression_loss: 1.3735 - classification_loss: 0.2905 234/500 [=============>................] - ETA: 1:06 - loss: 1.6659 - regression_loss: 1.3752 - classification_loss: 0.2907 235/500 [=============>................] - ETA: 1:06 - loss: 1.6666 - regression_loss: 1.3758 - classification_loss: 0.2908 236/500 [=============>................] - ETA: 1:05 - loss: 1.6662 - regression_loss: 1.3755 - classification_loss: 0.2907 237/500 [=============>................] - ETA: 1:05 - loss: 1.6658 - regression_loss: 1.3753 - classification_loss: 0.2906 238/500 [=============>................] - ETA: 1:05 - loss: 1.6664 - regression_loss: 1.3758 - classification_loss: 0.2905 239/500 [=============>................] - ETA: 1:05 - loss: 1.6675 - regression_loss: 1.3770 - classification_loss: 0.2905 240/500 [=============>................] - ETA: 1:04 - loss: 1.6681 - regression_loss: 1.3776 - classification_loss: 0.2906 241/500 [=============>................] - ETA: 1:04 - loss: 1.6667 - regression_loss: 1.3765 - classification_loss: 0.2902 242/500 [=============>................] - ETA: 1:04 - loss: 1.6682 - regression_loss: 1.3781 - classification_loss: 0.2901 243/500 [=============>................] - ETA: 1:04 - loss: 1.6691 - regression_loss: 1.3791 - classification_loss: 0.2900 244/500 [=============>................] - ETA: 1:03 - loss: 1.6669 - regression_loss: 1.3776 - classification_loss: 0.2893 245/500 [=============>................] - ETA: 1:03 - loss: 1.6656 - regression_loss: 1.3765 - classification_loss: 0.2890 246/500 [=============>................] - ETA: 1:03 - loss: 1.6642 - regression_loss: 1.3753 - classification_loss: 0.2889 247/500 [=============>................] - ETA: 1:03 - loss: 1.6639 - regression_loss: 1.3754 - classification_loss: 0.2885 248/500 [=============>................] - ETA: 1:02 - loss: 1.6646 - regression_loss: 1.3758 - classification_loss: 0.2888 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.6626 - regression_loss: 1.3744 - classification_loss: 0.2882 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6636 - regression_loss: 1.3754 - classification_loss: 0.2882 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6649 - regression_loss: 1.3761 - classification_loss: 0.2887 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6667 - regression_loss: 1.3774 - classification_loss: 0.2894 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6629 - regression_loss: 1.3741 - classification_loss: 0.2888 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6585 - regression_loss: 1.3705 - classification_loss: 0.2880 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6593 - regression_loss: 1.3712 - classification_loss: 0.2881 256/500 [==============>...............] - ETA: 1:00 - loss: 1.6548 - regression_loss: 1.3671 - classification_loss: 0.2877 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6558 - regression_loss: 1.3684 - classification_loss: 0.2874 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6570 - regression_loss: 1.3694 - classification_loss: 0.2876 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6559 - regression_loss: 1.3687 - classification_loss: 0.2872 260/500 [==============>...............] - ETA: 59s - loss: 1.6604 - regression_loss: 1.3721 - classification_loss: 0.2882  261/500 [==============>...............] - ETA: 59s - loss: 1.6619 - regression_loss: 1.3735 - classification_loss: 0.2883 262/500 [==============>...............] - ETA: 59s - loss: 1.6605 - regression_loss: 1.3725 - classification_loss: 0.2880 263/500 [==============>...............] - ETA: 59s - loss: 1.6618 - regression_loss: 1.3737 - classification_loss: 0.2881 264/500 [==============>...............] - ETA: 58s - loss: 1.6601 - regression_loss: 1.3726 - classification_loss: 0.2876 265/500 [==============>...............] 
- ETA: 58s - loss: 1.6608 - regression_loss: 1.3735 - classification_loss: 0.2873 266/500 [==============>...............] - ETA: 58s - loss: 1.6578 - regression_loss: 1.3711 - classification_loss: 0.2867 267/500 [===============>..............] - ETA: 58s - loss: 1.6562 - regression_loss: 1.3700 - classification_loss: 0.2862 268/500 [===============>..............] - ETA: 57s - loss: 1.6580 - regression_loss: 1.3719 - classification_loss: 0.2860 269/500 [===============>..............] - ETA: 57s - loss: 1.6581 - regression_loss: 1.3721 - classification_loss: 0.2860 270/500 [===============>..............] - ETA: 57s - loss: 1.6588 - regression_loss: 1.3727 - classification_loss: 0.2861 271/500 [===============>..............] - ETA: 57s - loss: 1.6599 - regression_loss: 1.3740 - classification_loss: 0.2859 272/500 [===============>..............] - ETA: 56s - loss: 1.6596 - regression_loss: 1.3739 - classification_loss: 0.2857 273/500 [===============>..............] - ETA: 56s - loss: 1.6601 - regression_loss: 1.3746 - classification_loss: 0.2855 274/500 [===============>..............] - ETA: 56s - loss: 1.6575 - regression_loss: 1.3727 - classification_loss: 0.2848 275/500 [===============>..............] - ETA: 56s - loss: 1.6585 - regression_loss: 1.3734 - classification_loss: 0.2852 276/500 [===============>..............] - ETA: 55s - loss: 1.6594 - regression_loss: 1.3738 - classification_loss: 0.2856 277/500 [===============>..............] - ETA: 55s - loss: 1.6591 - regression_loss: 1.3737 - classification_loss: 0.2854 278/500 [===============>..............] - ETA: 55s - loss: 1.6604 - regression_loss: 1.3746 - classification_loss: 0.2859 279/500 [===============>..............] - ETA: 55s - loss: 1.6612 - regression_loss: 1.3750 - classification_loss: 0.2861 280/500 [===============>..............] - ETA: 54s - loss: 1.6603 - regression_loss: 1.3744 - classification_loss: 0.2859 281/500 [===============>..............] 
500/500 [==============================] - 125s 250ms/step - loss: 1.6381 - regression_loss: 1.3568 - classification_loss: 0.2813
1172 instances of class plum with average precision: 0.6516
mAP: 0.6516
Epoch 00080: saving model to ./training/snapshots/resnet50_pascal_80.h5
Epoch 81/150
- ETA: 1:35 - loss: 1.6865 - regression_loss: 1.4043 - classification_loss: 0.2822 117/500 [======>.......................] - ETA: 1:35 - loss: 1.6828 - regression_loss: 1.4015 - classification_loss: 0.2813 118/500 [======>.......................] - ETA: 1:35 - loss: 1.6766 - regression_loss: 1.3965 - classification_loss: 0.2801 119/500 [======>.......................] - ETA: 1:35 - loss: 1.6739 - regression_loss: 1.3944 - classification_loss: 0.2795 120/500 [======>.......................] - ETA: 1:34 - loss: 1.6752 - regression_loss: 1.3954 - classification_loss: 0.2797 121/500 [======>.......................] - ETA: 1:34 - loss: 1.6773 - regression_loss: 1.3972 - classification_loss: 0.2801 122/500 [======>.......................] - ETA: 1:34 - loss: 1.6799 - regression_loss: 1.3994 - classification_loss: 0.2805 123/500 [======>.......................] - ETA: 1:34 - loss: 1.6818 - regression_loss: 1.4008 - classification_loss: 0.2810 124/500 [======>.......................] - ETA: 1:33 - loss: 1.6903 - regression_loss: 1.4079 - classification_loss: 0.2824 125/500 [======>.......................] - ETA: 1:33 - loss: 1.6871 - regression_loss: 1.4046 - classification_loss: 0.2825 126/500 [======>.......................] - ETA: 1:33 - loss: 1.6816 - regression_loss: 1.4001 - classification_loss: 0.2816 127/500 [======>.......................] - ETA: 1:33 - loss: 1.6830 - regression_loss: 1.4014 - classification_loss: 0.2816 128/500 [======>.......................] - ETA: 1:32 - loss: 1.6833 - regression_loss: 1.4023 - classification_loss: 0.2810 129/500 [======>.......................] - ETA: 1:32 - loss: 1.6774 - regression_loss: 1.3977 - classification_loss: 0.2797 130/500 [======>.......................] - ETA: 1:32 - loss: 1.6732 - regression_loss: 1.3939 - classification_loss: 0.2793 131/500 [======>.......................] - ETA: 1:31 - loss: 1.6750 - regression_loss: 1.3958 - classification_loss: 0.2792 132/500 [======>.......................] 
- ETA: 1:31 - loss: 1.6768 - regression_loss: 1.3973 - classification_loss: 0.2794 133/500 [======>.......................] - ETA: 1:31 - loss: 1.6686 - regression_loss: 1.3904 - classification_loss: 0.2782 134/500 [=======>......................] - ETA: 1:31 - loss: 1.6626 - regression_loss: 1.3859 - classification_loss: 0.2767 135/500 [=======>......................] - ETA: 1:30 - loss: 1.6656 - regression_loss: 1.3882 - classification_loss: 0.2774 136/500 [=======>......................] - ETA: 1:30 - loss: 1.6634 - regression_loss: 1.3866 - classification_loss: 0.2768 137/500 [=======>......................] - ETA: 1:30 - loss: 1.6673 - regression_loss: 1.3890 - classification_loss: 0.2783 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6664 - regression_loss: 1.3884 - classification_loss: 0.2780 139/500 [=======>......................] - ETA: 1:29 - loss: 1.6642 - regression_loss: 1.3867 - classification_loss: 0.2775 140/500 [=======>......................] - ETA: 1:29 - loss: 1.6665 - regression_loss: 1.3884 - classification_loss: 0.2780 141/500 [=======>......................] - ETA: 1:29 - loss: 1.6596 - regression_loss: 1.3829 - classification_loss: 0.2767 142/500 [=======>......................] - ETA: 1:29 - loss: 1.6641 - regression_loss: 1.3858 - classification_loss: 0.2783 143/500 [=======>......................] - ETA: 1:28 - loss: 1.6663 - regression_loss: 1.3874 - classification_loss: 0.2789 144/500 [=======>......................] - ETA: 1:28 - loss: 1.6607 - regression_loss: 1.3827 - classification_loss: 0.2780 145/500 [=======>......................] - ETA: 1:28 - loss: 1.6536 - regression_loss: 1.3769 - classification_loss: 0.2767 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6550 - regression_loss: 1.3779 - classification_loss: 0.2771 147/500 [=======>......................] - ETA: 1:27 - loss: 1.6496 - regression_loss: 1.3734 - classification_loss: 0.2762 148/500 [=======>......................] 
- ETA: 1:27 - loss: 1.6459 - regression_loss: 1.3710 - classification_loss: 0.2749 149/500 [=======>......................] - ETA: 1:27 - loss: 1.6536 - regression_loss: 1.3763 - classification_loss: 0.2774 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6509 - regression_loss: 1.3740 - classification_loss: 0.2769 151/500 [========>.....................] - ETA: 1:26 - loss: 1.6481 - regression_loss: 1.3712 - classification_loss: 0.2769 152/500 [========>.....................] - ETA: 1:26 - loss: 1.6502 - regression_loss: 1.3733 - classification_loss: 0.2770 153/500 [========>.....................] - ETA: 1:26 - loss: 1.6458 - regression_loss: 1.3694 - classification_loss: 0.2764 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6491 - regression_loss: 1.3722 - classification_loss: 0.2769 155/500 [========>.....................] - ETA: 1:25 - loss: 1.6490 - regression_loss: 1.3725 - classification_loss: 0.2765 156/500 [========>.....................] - ETA: 1:25 - loss: 1.6496 - regression_loss: 1.3734 - classification_loss: 0.2762 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6466 - regression_loss: 1.3710 - classification_loss: 0.2755 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6473 - regression_loss: 1.3708 - classification_loss: 0.2765 159/500 [========>.....................] - ETA: 1:24 - loss: 1.6446 - regression_loss: 1.3686 - classification_loss: 0.2759 160/500 [========>.....................] - ETA: 1:24 - loss: 1.6474 - regression_loss: 1.3712 - classification_loss: 0.2762 161/500 [========>.....................] - ETA: 1:24 - loss: 1.6460 - regression_loss: 1.3699 - classification_loss: 0.2761 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6460 - regression_loss: 1.3700 - classification_loss: 0.2761 163/500 [========>.....................] - ETA: 1:23 - loss: 1.6468 - regression_loss: 1.3708 - classification_loss: 0.2761 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.6517 - regression_loss: 1.3756 - classification_loss: 0.2761 165/500 [========>.....................] - ETA: 1:23 - loss: 1.6531 - regression_loss: 1.3768 - classification_loss: 0.2763 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6531 - regression_loss: 1.3765 - classification_loss: 0.2766 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6482 - regression_loss: 1.3726 - classification_loss: 0.2756 168/500 [=========>....................] - ETA: 1:22 - loss: 1.6474 - regression_loss: 1.3719 - classification_loss: 0.2755 169/500 [=========>....................] - ETA: 1:22 - loss: 1.6478 - regression_loss: 1.3724 - classification_loss: 0.2754 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6506 - regression_loss: 1.3747 - classification_loss: 0.2760 171/500 [=========>....................] - ETA: 1:21 - loss: 1.6492 - regression_loss: 1.3736 - classification_loss: 0.2757 172/500 [=========>....................] - ETA: 1:21 - loss: 1.6462 - regression_loss: 1.3712 - classification_loss: 0.2750 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6460 - regression_loss: 1.3709 - classification_loss: 0.2751 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6512 - regression_loss: 1.3751 - classification_loss: 0.2761 175/500 [=========>....................] - ETA: 1:20 - loss: 1.6502 - regression_loss: 1.3744 - classification_loss: 0.2758 176/500 [=========>....................] - ETA: 1:20 - loss: 1.6489 - regression_loss: 1.3733 - classification_loss: 0.2756 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6486 - regression_loss: 1.3731 - classification_loss: 0.2755 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6524 - regression_loss: 1.3762 - classification_loss: 0.2762 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6510 - regression_loss: 1.3753 - classification_loss: 0.2756 180/500 [=========>....................] 
- ETA: 1:19 - loss: 1.6528 - regression_loss: 1.3770 - classification_loss: 0.2758 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6515 - regression_loss: 1.3761 - classification_loss: 0.2755 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6524 - regression_loss: 1.3764 - classification_loss: 0.2759 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6519 - regression_loss: 1.3757 - classification_loss: 0.2762 184/500 [==========>...................] - ETA: 1:18 - loss: 1.6520 - regression_loss: 1.3758 - classification_loss: 0.2762 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6520 - regression_loss: 1.3760 - classification_loss: 0.2760 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6476 - regression_loss: 1.3720 - classification_loss: 0.2755 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6499 - regression_loss: 1.3737 - classification_loss: 0.2763 188/500 [==========>...................] - ETA: 1:17 - loss: 1.6499 - regression_loss: 1.3741 - classification_loss: 0.2759 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6463 - regression_loss: 1.3712 - classification_loss: 0.2751 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6461 - regression_loss: 1.3713 - classification_loss: 0.2748 191/500 [==========>...................] - ETA: 1:16 - loss: 1.6499 - regression_loss: 1.3744 - classification_loss: 0.2755 192/500 [==========>...................] - ETA: 1:16 - loss: 1.6512 - regression_loss: 1.3754 - classification_loss: 0.2757 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6503 - regression_loss: 1.3745 - classification_loss: 0.2758 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6501 - regression_loss: 1.3744 - classification_loss: 0.2756 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6465 - regression_loss: 1.3715 - classification_loss: 0.2750 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.6458 - regression_loss: 1.3709 - classification_loss: 0.2749 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6439 - regression_loss: 1.3695 - classification_loss: 0.2743 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6412 - regression_loss: 1.3673 - classification_loss: 0.2739 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6407 - regression_loss: 1.3670 - classification_loss: 0.2737 200/500 [===========>..................] - ETA: 1:14 - loss: 1.6413 - regression_loss: 1.3676 - classification_loss: 0.2737 201/500 [===========>..................] - ETA: 1:14 - loss: 1.6400 - regression_loss: 1.3664 - classification_loss: 0.2737 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6375 - regression_loss: 1.3643 - classification_loss: 0.2732 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6373 - regression_loss: 1.3643 - classification_loss: 0.2730 204/500 [===========>..................] - ETA: 1:13 - loss: 1.6343 - regression_loss: 1.3621 - classification_loss: 0.2722 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6338 - regression_loss: 1.3621 - classification_loss: 0.2717 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6340 - regression_loss: 1.3625 - classification_loss: 0.2715 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6309 - regression_loss: 1.3601 - classification_loss: 0.2708 208/500 [===========>..................] - ETA: 1:12 - loss: 1.6302 - regression_loss: 1.3598 - classification_loss: 0.2704 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6319 - regression_loss: 1.3611 - classification_loss: 0.2709 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6339 - regression_loss: 1.3625 - classification_loss: 0.2714 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6353 - regression_loss: 1.3641 - classification_loss: 0.2713 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.6358 - regression_loss: 1.3647 - classification_loss: 0.2711 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6357 - regression_loss: 1.3645 - classification_loss: 0.2712 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6363 - regression_loss: 1.3651 - classification_loss: 0.2712 215/500 [===========>..................] - ETA: 1:10 - loss: 1.6360 - regression_loss: 1.3650 - classification_loss: 0.2711 216/500 [===========>..................] - ETA: 1:10 - loss: 1.6348 - regression_loss: 1.3641 - classification_loss: 0.2707 217/500 [============>.................] - ETA: 1:10 - loss: 1.6361 - regression_loss: 1.3655 - classification_loss: 0.2707 218/500 [============>.................] - ETA: 1:10 - loss: 1.6358 - regression_loss: 1.3652 - classification_loss: 0.2706 219/500 [============>.................] - ETA: 1:09 - loss: 1.6363 - regression_loss: 1.3658 - classification_loss: 0.2706 220/500 [============>.................] - ETA: 1:09 - loss: 1.6362 - regression_loss: 1.3658 - classification_loss: 0.2704 221/500 [============>.................] - ETA: 1:09 - loss: 1.6371 - regression_loss: 1.3666 - classification_loss: 0.2705 222/500 [============>.................] - ETA: 1:09 - loss: 1.6364 - regression_loss: 1.3659 - classification_loss: 0.2704 223/500 [============>.................] - ETA: 1:08 - loss: 1.6358 - regression_loss: 1.3657 - classification_loss: 0.2702 224/500 [============>.................] - ETA: 1:08 - loss: 1.6377 - regression_loss: 1.3657 - classification_loss: 0.2720 225/500 [============>.................] - ETA: 1:08 - loss: 1.6358 - regression_loss: 1.3640 - classification_loss: 0.2718 226/500 [============>.................] - ETA: 1:08 - loss: 1.6366 - regression_loss: 1.3647 - classification_loss: 0.2718 227/500 [============>.................] - ETA: 1:07 - loss: 1.6353 - regression_loss: 1.3639 - classification_loss: 0.2714 228/500 [============>.................] 
- ETA: 1:07 - loss: 1.6397 - regression_loss: 1.3673 - classification_loss: 0.2724 229/500 [============>.................] - ETA: 1:07 - loss: 1.6355 - regression_loss: 1.3637 - classification_loss: 0.2718 230/500 [============>.................] - ETA: 1:07 - loss: 1.6350 - regression_loss: 1.3634 - classification_loss: 0.2716 231/500 [============>.................] - ETA: 1:06 - loss: 1.6358 - regression_loss: 1.3641 - classification_loss: 0.2717 232/500 [============>.................] - ETA: 1:06 - loss: 1.6357 - regression_loss: 1.3639 - classification_loss: 0.2718 233/500 [============>.................] - ETA: 1:06 - loss: 1.6362 - regression_loss: 1.3641 - classification_loss: 0.2721 234/500 [=============>................] - ETA: 1:06 - loss: 1.6352 - regression_loss: 1.3631 - classification_loss: 0.2721 235/500 [=============>................] - ETA: 1:05 - loss: 1.6331 - regression_loss: 1.3615 - classification_loss: 0.2715 236/500 [=============>................] - ETA: 1:05 - loss: 1.6352 - regression_loss: 1.3629 - classification_loss: 0.2723 237/500 [=============>................] - ETA: 1:05 - loss: 1.6358 - regression_loss: 1.3634 - classification_loss: 0.2725 238/500 [=============>................] - ETA: 1:05 - loss: 1.6328 - regression_loss: 1.3610 - classification_loss: 0.2718 239/500 [=============>................] - ETA: 1:04 - loss: 1.6343 - regression_loss: 1.3623 - classification_loss: 0.2721 240/500 [=============>................] - ETA: 1:04 - loss: 1.6372 - regression_loss: 1.3648 - classification_loss: 0.2725 241/500 [=============>................] - ETA: 1:04 - loss: 1.6360 - regression_loss: 1.3639 - classification_loss: 0.2721 242/500 [=============>................] - ETA: 1:04 - loss: 1.6360 - regression_loss: 1.3639 - classification_loss: 0.2721 243/500 [=============>................] - ETA: 1:03 - loss: 1.6337 - regression_loss: 1.3619 - classification_loss: 0.2718 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.6321 - regression_loss: 1.3604 - classification_loss: 0.2717 245/500 [=============>................] - ETA: 1:03 - loss: 1.6302 - regression_loss: 1.3587 - classification_loss: 0.2715 246/500 [=============>................] - ETA: 1:03 - loss: 1.6304 - regression_loss: 1.3589 - classification_loss: 0.2714 247/500 [=============>................] - ETA: 1:02 - loss: 1.6305 - regression_loss: 1.3593 - classification_loss: 0.2712 248/500 [=============>................] - ETA: 1:02 - loss: 1.6313 - regression_loss: 1.3603 - classification_loss: 0.2711 249/500 [=============>................] - ETA: 1:02 - loss: 1.6317 - regression_loss: 1.3602 - classification_loss: 0.2715 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6335 - regression_loss: 1.3611 - classification_loss: 0.2724 251/500 [==============>...............] - ETA: 1:01 - loss: 1.6344 - regression_loss: 1.3617 - classification_loss: 0.2727 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6352 - regression_loss: 1.3624 - classification_loss: 0.2728 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6370 - regression_loss: 1.3638 - classification_loss: 0.2732 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6357 - regression_loss: 1.3631 - classification_loss: 0.2726 255/500 [==============>...............] - ETA: 1:00 - loss: 1.6361 - regression_loss: 1.3633 - classification_loss: 0.2728 256/500 [==============>...............] - ETA: 1:00 - loss: 1.6359 - regression_loss: 1.3631 - classification_loss: 0.2729 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6368 - regression_loss: 1.3638 - classification_loss: 0.2729 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6386 - regression_loss: 1.3648 - classification_loss: 0.2738 259/500 [==============>...............] - ETA: 59s - loss: 1.6362 - regression_loss: 1.3627 - classification_loss: 0.2735  260/500 [==============>...............] 
- ETA: 59s - loss: 1.6362 - regression_loss: 1.3627 - classification_loss: 0.2735 261/500 [==============>...............] - ETA: 59s - loss: 1.6365 - regression_loss: 1.3629 - classification_loss: 0.2736 262/500 [==============>...............] - ETA: 58s - loss: 1.6377 - regression_loss: 1.3639 - classification_loss: 0.2738 263/500 [==============>...............] - ETA: 58s - loss: 1.6378 - regression_loss: 1.3641 - classification_loss: 0.2738 264/500 [==============>...............] - ETA: 58s - loss: 1.6384 - regression_loss: 1.3647 - classification_loss: 0.2737 265/500 [==============>...............] - ETA: 58s - loss: 1.6410 - regression_loss: 1.3668 - classification_loss: 0.2743 266/500 [==============>...............] - ETA: 57s - loss: 1.6416 - regression_loss: 1.3662 - classification_loss: 0.2754 267/500 [===============>..............] - ETA: 57s - loss: 1.6388 - regression_loss: 1.3639 - classification_loss: 0.2748 268/500 [===============>..............] - ETA: 57s - loss: 1.6408 - regression_loss: 1.3661 - classification_loss: 0.2747 269/500 [===============>..............] - ETA: 57s - loss: 1.6411 - regression_loss: 1.3662 - classification_loss: 0.2749 270/500 [===============>..............] - ETA: 56s - loss: 1.6403 - regression_loss: 1.3656 - classification_loss: 0.2747 271/500 [===============>..............] - ETA: 56s - loss: 1.6371 - regression_loss: 1.3629 - classification_loss: 0.2741 272/500 [===============>..............] - ETA: 56s - loss: 1.6377 - regression_loss: 1.3634 - classification_loss: 0.2743 273/500 [===============>..............] - ETA: 56s - loss: 1.6352 - regression_loss: 1.3614 - classification_loss: 0.2738 274/500 [===============>..............] - ETA: 55s - loss: 1.6354 - regression_loss: 1.3617 - classification_loss: 0.2738 275/500 [===============>..............] - ETA: 55s - loss: 1.6376 - regression_loss: 1.3638 - classification_loss: 0.2738 276/500 [===============>..............] 
- ETA: 55s - loss: 1.6384 - regression_loss: 1.3646 - classification_loss: 0.2738 277/500 [===============>..............] - ETA: 55s - loss: 1.6393 - regression_loss: 1.3651 - classification_loss: 0.2742 278/500 [===============>..............] - ETA: 54s - loss: 1.6400 - regression_loss: 1.3657 - classification_loss: 0.2743 279/500 [===============>..............] - ETA: 54s - loss: 1.6390 - regression_loss: 1.3650 - classification_loss: 0.2740 280/500 [===============>..............] - ETA: 54s - loss: 1.6384 - regression_loss: 1.3646 - classification_loss: 0.2739 281/500 [===============>..............] - ETA: 54s - loss: 1.6374 - regression_loss: 1.3636 - classification_loss: 0.2737 282/500 [===============>..............] - ETA: 53s - loss: 1.6379 - regression_loss: 1.3639 - classification_loss: 0.2739 283/500 [===============>..............] - ETA: 53s - loss: 1.6390 - regression_loss: 1.3648 - classification_loss: 0.2741 284/500 [================>.............] - ETA: 53s - loss: 1.6389 - regression_loss: 1.3647 - classification_loss: 0.2742 285/500 [================>.............] - ETA: 53s - loss: 1.6423 - regression_loss: 1.3675 - classification_loss: 0.2748 286/500 [================>.............] - ETA: 52s - loss: 1.6431 - regression_loss: 1.3682 - classification_loss: 0.2749 287/500 [================>.............] - ETA: 52s - loss: 1.6408 - regression_loss: 1.3664 - classification_loss: 0.2745 288/500 [================>.............] - ETA: 52s - loss: 1.6420 - regression_loss: 1.3669 - classification_loss: 0.2751 289/500 [================>.............] - ETA: 52s - loss: 1.6402 - regression_loss: 1.3656 - classification_loss: 0.2747 290/500 [================>.............] - ETA: 51s - loss: 1.6436 - regression_loss: 1.3676 - classification_loss: 0.2760 291/500 [================>.............] - ETA: 51s - loss: 1.6447 - regression_loss: 1.3684 - classification_loss: 0.2763 292/500 [================>.............] 
- ETA: 51s - loss: 1.6457 - regression_loss: 1.3691 - classification_loss: 0.2765 293/500 [================>.............] - ETA: 51s - loss: 1.6466 - regression_loss: 1.3698 - classification_loss: 0.2768 294/500 [================>.............] - ETA: 51s - loss: 1.6454 - regression_loss: 1.3689 - classification_loss: 0.2765 295/500 [================>.............] - ETA: 50s - loss: 1.6467 - regression_loss: 1.3697 - classification_loss: 0.2770 296/500 [================>.............] - ETA: 50s - loss: 1.6468 - regression_loss: 1.3684 - classification_loss: 0.2784 297/500 [================>.............] - ETA: 50s - loss: 1.6480 - regression_loss: 1.3692 - classification_loss: 0.2787 298/500 [================>.............] - ETA: 50s - loss: 1.6473 - regression_loss: 1.3689 - classification_loss: 0.2784 299/500 [================>.............] - ETA: 49s - loss: 1.6495 - regression_loss: 1.3706 - classification_loss: 0.2789 300/500 [=================>............] - ETA: 49s - loss: 1.6497 - regression_loss: 1.3708 - classification_loss: 0.2789 301/500 [=================>............] - ETA: 49s - loss: 1.6467 - regression_loss: 1.3684 - classification_loss: 0.2784 302/500 [=================>............] - ETA: 49s - loss: 1.6481 - regression_loss: 1.3695 - classification_loss: 0.2786 303/500 [=================>............] - ETA: 48s - loss: 1.6463 - regression_loss: 1.3683 - classification_loss: 0.2779 304/500 [=================>............] - ETA: 48s - loss: 1.6445 - regression_loss: 1.3670 - classification_loss: 0.2775 305/500 [=================>............] - ETA: 48s - loss: 1.6448 - regression_loss: 1.3674 - classification_loss: 0.2774 306/500 [=================>............] - ETA: 48s - loss: 1.6441 - regression_loss: 1.3668 - classification_loss: 0.2773 307/500 [=================>............] - ETA: 47s - loss: 1.6417 - regression_loss: 1.3650 - classification_loss: 0.2767 308/500 [=================>............] 
- ETA: 47s - loss: 1.6421 - regression_loss: 1.3654 - classification_loss: 0.2767 309/500 [=================>............] - ETA: 47s - loss: 1.6434 - regression_loss: 1.3665 - classification_loss: 0.2769 310/500 [=================>............] - ETA: 46s - loss: 1.6447 - regression_loss: 1.3676 - classification_loss: 0.2771 311/500 [=================>............] - ETA: 46s - loss: 1.6446 - regression_loss: 1.3675 - classification_loss: 0.2771 312/500 [=================>............] - ETA: 46s - loss: 1.6422 - regression_loss: 1.3657 - classification_loss: 0.2765 313/500 [=================>............] - ETA: 46s - loss: 1.6447 - regression_loss: 1.3670 - classification_loss: 0.2777 314/500 [=================>............] - ETA: 46s - loss: 1.6426 - regression_loss: 1.3652 - classification_loss: 0.2774 315/500 [=================>............] - ETA: 45s - loss: 1.6407 - regression_loss: 1.3635 - classification_loss: 0.2772 316/500 [=================>............] - ETA: 45s - loss: 1.6402 - regression_loss: 1.3631 - classification_loss: 0.2771 317/500 [==================>...........] - ETA: 45s - loss: 1.6415 - regression_loss: 1.3638 - classification_loss: 0.2777 318/500 [==================>...........] - ETA: 45s - loss: 1.6414 - regression_loss: 1.3639 - classification_loss: 0.2775 319/500 [==================>...........] - ETA: 44s - loss: 1.6380 - regression_loss: 1.3612 - classification_loss: 0.2768 320/500 [==================>...........] - ETA: 44s - loss: 1.6371 - regression_loss: 1.3604 - classification_loss: 0.2767 321/500 [==================>...........] - ETA: 44s - loss: 1.6379 - regression_loss: 1.3608 - classification_loss: 0.2771 322/500 [==================>...........] - ETA: 44s - loss: 1.6385 - regression_loss: 1.3613 - classification_loss: 0.2773 323/500 [==================>...........] - ETA: 43s - loss: 1.6377 - regression_loss: 1.3606 - classification_loss: 0.2771 324/500 [==================>...........] 
[per-batch progress frames for steps 325-499 of epoch 81 elided; the running loss held steady around 1.63-1.64 (regression_loss ~1.36, classification_loss ~0.275)]
500/500 [==============================] - 124s 248ms/step - loss: 1.6382 - regression_loss: 1.3635 - classification_loss: 0.2747
1172 instances of class plum with average precision: 0.6557
mAP: 0.6557
Epoch 00081: saving model to ./training/snapshots/resnet50_pascal_81.h5
Epoch 82/150
[per-batch progress frames for steps 1-157 of epoch 82 elided; the running loss fluctuated between roughly 1.52 and 1.76 over the early batches before settling near 1.63 (regression_loss ~1.35, classification_loss ~0.27); ETA fell from about 2:00 to 1:26]
- ETA: 1:29 - loss: 1.6235 - regression_loss: 1.3484 - classification_loss: 0.2751 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6230 - regression_loss: 1.3480 - classification_loss: 0.2750 144/500 [=======>......................] - ETA: 1:29 - loss: 1.6260 - regression_loss: 1.3506 - classification_loss: 0.2755 145/500 [=======>......................] - ETA: 1:29 - loss: 1.6246 - regression_loss: 1.3496 - classification_loss: 0.2749 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6271 - regression_loss: 1.3513 - classification_loss: 0.2757 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6318 - regression_loss: 1.3549 - classification_loss: 0.2769 148/500 [=======>......................] - ETA: 1:28 - loss: 1.6311 - regression_loss: 1.3547 - classification_loss: 0.2764 149/500 [=======>......................] - ETA: 1:28 - loss: 1.6309 - regression_loss: 1.3546 - classification_loss: 0.2763 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6283 - regression_loss: 1.3523 - classification_loss: 0.2760 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6291 - regression_loss: 1.3530 - classification_loss: 0.2761 152/500 [========>.....................] - ETA: 1:27 - loss: 1.6325 - regression_loss: 1.3555 - classification_loss: 0.2770 153/500 [========>.....................] - ETA: 1:27 - loss: 1.6335 - regression_loss: 1.3564 - classification_loss: 0.2771 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6341 - regression_loss: 1.3568 - classification_loss: 0.2773 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6316 - regression_loss: 1.3553 - classification_loss: 0.2763 156/500 [========>.....................] - ETA: 1:26 - loss: 1.6318 - regression_loss: 1.3557 - classification_loss: 0.2761 157/500 [========>.....................] - ETA: 1:26 - loss: 1.6308 - regression_loss: 1.3543 - classification_loss: 0.2765 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.6296 - regression_loss: 1.3536 - classification_loss: 0.2760 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6287 - regression_loss: 1.3534 - classification_loss: 0.2753 160/500 [========>.....................] - ETA: 1:25 - loss: 1.6313 - regression_loss: 1.3558 - classification_loss: 0.2756 161/500 [========>.....................] - ETA: 1:24 - loss: 1.6350 - regression_loss: 1.3587 - classification_loss: 0.2763 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6427 - regression_loss: 1.3648 - classification_loss: 0.2780 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6396 - regression_loss: 1.3624 - classification_loss: 0.2772 164/500 [========>.....................] - ETA: 1:24 - loss: 1.6345 - regression_loss: 1.3583 - classification_loss: 0.2762 165/500 [========>.....................] - ETA: 1:23 - loss: 1.6360 - regression_loss: 1.3596 - classification_loss: 0.2764 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6376 - regression_loss: 1.3610 - classification_loss: 0.2766 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6408 - regression_loss: 1.3635 - classification_loss: 0.2772 168/500 [=========>....................] - ETA: 1:23 - loss: 1.6412 - regression_loss: 1.3642 - classification_loss: 0.2770 169/500 [=========>....................] - ETA: 1:22 - loss: 1.6385 - regression_loss: 1.3616 - classification_loss: 0.2768 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6392 - regression_loss: 1.3625 - classification_loss: 0.2767 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6398 - regression_loss: 1.3633 - classification_loss: 0.2766 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6375 - regression_loss: 1.3615 - classification_loss: 0.2760 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6371 - regression_loss: 1.3615 - classification_loss: 0.2756 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.6367 - regression_loss: 1.3611 - classification_loss: 0.2756 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6351 - regression_loss: 1.3597 - classification_loss: 0.2754 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6361 - regression_loss: 1.3602 - classification_loss: 0.2759 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6378 - regression_loss: 1.3617 - classification_loss: 0.2760 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6355 - regression_loss: 1.3601 - classification_loss: 0.2754 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6343 - regression_loss: 1.3592 - classification_loss: 0.2751 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6313 - regression_loss: 1.3569 - classification_loss: 0.2744 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6350 - regression_loss: 1.3601 - classification_loss: 0.2749 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6359 - regression_loss: 1.3614 - classification_loss: 0.2744 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6376 - regression_loss: 1.3632 - classification_loss: 0.2744 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6313 - regression_loss: 1.3579 - classification_loss: 0.2735 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6318 - regression_loss: 1.3582 - classification_loss: 0.2736 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6332 - regression_loss: 1.3592 - classification_loss: 0.2739 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6287 - regression_loss: 1.3557 - classification_loss: 0.2730 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6288 - regression_loss: 1.3557 - classification_loss: 0.2730 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6296 - regression_loss: 1.3563 - classification_loss: 0.2733 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.6294 - regression_loss: 1.3561 - classification_loss: 0.2734 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6230 - regression_loss: 1.3508 - classification_loss: 0.2722 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6225 - regression_loss: 1.3505 - classification_loss: 0.2720 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6247 - regression_loss: 1.3520 - classification_loss: 0.2726 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6313 - regression_loss: 1.3571 - classification_loss: 0.2742 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6329 - regression_loss: 1.3585 - classification_loss: 0.2744 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6300 - regression_loss: 1.3560 - classification_loss: 0.2741 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6291 - regression_loss: 1.3552 - classification_loss: 0.2739 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6315 - regression_loss: 1.3572 - classification_loss: 0.2743 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6305 - regression_loss: 1.3566 - classification_loss: 0.2740 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6276 - regression_loss: 1.3537 - classification_loss: 0.2738 201/500 [===========>..................] - ETA: 1:14 - loss: 1.6271 - regression_loss: 1.3536 - classification_loss: 0.2736 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6278 - regression_loss: 1.3542 - classification_loss: 0.2736 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6272 - regression_loss: 1.3534 - classification_loss: 0.2737 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6305 - regression_loss: 1.3566 - classification_loss: 0.2739 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6288 - regression_loss: 1.3553 - classification_loss: 0.2736 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.6310 - regression_loss: 1.3571 - classification_loss: 0.2739 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6303 - regression_loss: 1.3567 - classification_loss: 0.2736 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6279 - regression_loss: 1.3546 - classification_loss: 0.2733 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6290 - regression_loss: 1.3555 - classification_loss: 0.2735 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6293 - regression_loss: 1.3558 - classification_loss: 0.2735 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6304 - regression_loss: 1.3567 - classification_loss: 0.2737 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6337 - regression_loss: 1.3589 - classification_loss: 0.2749 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6317 - regression_loss: 1.3571 - classification_loss: 0.2746 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6326 - regression_loss: 1.3583 - classification_loss: 0.2743 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6287 - regression_loss: 1.3553 - classification_loss: 0.2735 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6260 - regression_loss: 1.3531 - classification_loss: 0.2729 217/500 [============>.................] - ETA: 1:10 - loss: 1.6256 - regression_loss: 1.3521 - classification_loss: 0.2735 218/500 [============>.................] - ETA: 1:10 - loss: 1.6269 - regression_loss: 1.3536 - classification_loss: 0.2733 219/500 [============>.................] - ETA: 1:10 - loss: 1.6286 - regression_loss: 1.3550 - classification_loss: 0.2736 220/500 [============>.................] - ETA: 1:10 - loss: 1.6302 - regression_loss: 1.3565 - classification_loss: 0.2737 221/500 [============>.................] - ETA: 1:09 - loss: 1.6322 - regression_loss: 1.3575 - classification_loss: 0.2747 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.6320 - regression_loss: 1.3576 - classification_loss: 0.2744 223/500 [============>.................] - ETA: 1:09 - loss: 1.6288 - regression_loss: 1.3547 - classification_loss: 0.2741 224/500 [============>.................] - ETA: 1:09 - loss: 1.6319 - regression_loss: 1.3574 - classification_loss: 0.2745 225/500 [============>.................] - ETA: 1:08 - loss: 1.6295 - regression_loss: 1.3553 - classification_loss: 0.2743 226/500 [============>.................] - ETA: 1:08 - loss: 1.6323 - regression_loss: 1.3580 - classification_loss: 0.2744 227/500 [============>.................] - ETA: 1:08 - loss: 1.6306 - regression_loss: 1.3562 - classification_loss: 0.2744 228/500 [============>.................] - ETA: 1:08 - loss: 1.6319 - regression_loss: 1.3573 - classification_loss: 0.2745 229/500 [============>.................] - ETA: 1:07 - loss: 1.6327 - regression_loss: 1.3581 - classification_loss: 0.2746 230/500 [============>.................] - ETA: 1:07 - loss: 1.6331 - regression_loss: 1.3585 - classification_loss: 0.2747 231/500 [============>.................] - ETA: 1:07 - loss: 1.6298 - regression_loss: 1.3553 - classification_loss: 0.2744 232/500 [============>.................] - ETA: 1:07 - loss: 1.6267 - regression_loss: 1.3524 - classification_loss: 0.2742 233/500 [============>.................] - ETA: 1:06 - loss: 1.6279 - regression_loss: 1.3537 - classification_loss: 0.2742 234/500 [=============>................] - ETA: 1:06 - loss: 1.6298 - regression_loss: 1.3553 - classification_loss: 0.2745 235/500 [=============>................] - ETA: 1:06 - loss: 1.6288 - regression_loss: 1.3545 - classification_loss: 0.2743 236/500 [=============>................] - ETA: 1:06 - loss: 1.6297 - regression_loss: 1.3551 - classification_loss: 0.2746 237/500 [=============>................] - ETA: 1:05 - loss: 1.6297 - regression_loss: 1.3552 - classification_loss: 0.2746 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.6292 - regression_loss: 1.3545 - classification_loss: 0.2746 239/500 [=============>................] - ETA: 1:05 - loss: 1.6290 - regression_loss: 1.3547 - classification_loss: 0.2743 240/500 [=============>................] - ETA: 1:05 - loss: 1.6285 - regression_loss: 1.3541 - classification_loss: 0.2744 241/500 [=============>................] - ETA: 1:04 - loss: 1.6293 - regression_loss: 1.3547 - classification_loss: 0.2747 242/500 [=============>................] - ETA: 1:04 - loss: 1.6303 - regression_loss: 1.3557 - classification_loss: 0.2746 243/500 [=============>................] - ETA: 1:04 - loss: 1.6292 - regression_loss: 1.3549 - classification_loss: 0.2743 244/500 [=============>................] - ETA: 1:04 - loss: 1.6296 - regression_loss: 1.3552 - classification_loss: 0.2745 245/500 [=============>................] - ETA: 1:03 - loss: 1.6309 - regression_loss: 1.3565 - classification_loss: 0.2744 246/500 [=============>................] - ETA: 1:03 - loss: 1.6344 - regression_loss: 1.3593 - classification_loss: 0.2751 247/500 [=============>................] - ETA: 1:03 - loss: 1.6317 - regression_loss: 1.3573 - classification_loss: 0.2745 248/500 [=============>................] - ETA: 1:03 - loss: 1.6332 - regression_loss: 1.3584 - classification_loss: 0.2748 249/500 [=============>................] - ETA: 1:02 - loss: 1.6359 - regression_loss: 1.3604 - classification_loss: 0.2755 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6377 - regression_loss: 1.3614 - classification_loss: 0.2763 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6357 - regression_loss: 1.3598 - classification_loss: 0.2759 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6321 - regression_loss: 1.3569 - classification_loss: 0.2751 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6326 - regression_loss: 1.3574 - classification_loss: 0.2752 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.6326 - regression_loss: 1.3575 - classification_loss: 0.2750 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6343 - regression_loss: 1.3591 - classification_loss: 0.2752 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6353 - regression_loss: 1.3594 - classification_loss: 0.2759 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6351 - regression_loss: 1.3594 - classification_loss: 0.2757 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6352 - regression_loss: 1.3598 - classification_loss: 0.2753 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6355 - regression_loss: 1.3602 - classification_loss: 0.2754 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6328 - regression_loss: 1.3581 - classification_loss: 0.2746 261/500 [==============>...............] - ETA: 59s - loss: 1.6349 - regression_loss: 1.3597 - classification_loss: 0.2752  262/500 [==============>...............] - ETA: 59s - loss: 1.6312 - regression_loss: 1.3565 - classification_loss: 0.2747 263/500 [==============>...............] - ETA: 59s - loss: 1.6353 - regression_loss: 1.3602 - classification_loss: 0.2751 264/500 [==============>...............] - ETA: 59s - loss: 1.6348 - regression_loss: 1.3599 - classification_loss: 0.2748 265/500 [==============>...............] - ETA: 58s - loss: 1.6345 - regression_loss: 1.3598 - classification_loss: 0.2746 266/500 [==============>...............] - ETA: 58s - loss: 1.6325 - regression_loss: 1.3582 - classification_loss: 0.2743 267/500 [===============>..............] - ETA: 58s - loss: 1.6330 - regression_loss: 1.3587 - classification_loss: 0.2743 268/500 [===============>..............] - ETA: 58s - loss: 1.6341 - regression_loss: 1.3595 - classification_loss: 0.2746 269/500 [===============>..............] - ETA: 57s - loss: 1.6309 - regression_loss: 1.3569 - classification_loss: 0.2740 270/500 [===============>..............] 
- ETA: 57s - loss: 1.6309 - regression_loss: 1.3571 - classification_loss: 0.2738 271/500 [===============>..............] - ETA: 57s - loss: 1.6313 - regression_loss: 1.3573 - classification_loss: 0.2740 272/500 [===============>..............] - ETA: 57s - loss: 1.6308 - regression_loss: 1.3568 - classification_loss: 0.2739 273/500 [===============>..............] - ETA: 56s - loss: 1.6285 - regression_loss: 1.3551 - classification_loss: 0.2734 274/500 [===============>..............] - ETA: 56s - loss: 1.6291 - regression_loss: 1.3556 - classification_loss: 0.2735 275/500 [===============>..............] - ETA: 56s - loss: 1.6264 - regression_loss: 1.3535 - classification_loss: 0.2730 276/500 [===============>..............] - ETA: 56s - loss: 1.6269 - regression_loss: 1.3539 - classification_loss: 0.2729 277/500 [===============>..............] - ETA: 55s - loss: 1.6236 - regression_loss: 1.3514 - classification_loss: 0.2722 278/500 [===============>..............] - ETA: 55s - loss: 1.6198 - regression_loss: 1.3482 - classification_loss: 0.2716 279/500 [===============>..............] - ETA: 55s - loss: 1.6207 - regression_loss: 1.3490 - classification_loss: 0.2717 280/500 [===============>..............] - ETA: 55s - loss: 1.6223 - regression_loss: 1.3505 - classification_loss: 0.2718 281/500 [===============>..............] - ETA: 54s - loss: 1.6248 - regression_loss: 1.3524 - classification_loss: 0.2724 282/500 [===============>..............] - ETA: 54s - loss: 1.6256 - regression_loss: 1.3531 - classification_loss: 0.2726 283/500 [===============>..............] - ETA: 54s - loss: 1.6252 - regression_loss: 1.3527 - classification_loss: 0.2725 284/500 [================>.............] - ETA: 54s - loss: 1.6260 - regression_loss: 1.3535 - classification_loss: 0.2725 285/500 [================>.............] - ETA: 53s - loss: 1.6248 - regression_loss: 1.3526 - classification_loss: 0.2722 286/500 [================>.............] 
- ETA: 53s - loss: 1.6256 - regression_loss: 1.3532 - classification_loss: 0.2724 287/500 [================>.............] - ETA: 53s - loss: 1.6261 - regression_loss: 1.3534 - classification_loss: 0.2727 288/500 [================>.............] - ETA: 53s - loss: 1.6255 - regression_loss: 1.3529 - classification_loss: 0.2726 289/500 [================>.............] - ETA: 52s - loss: 1.6254 - regression_loss: 1.3530 - classification_loss: 0.2724 290/500 [================>.............] - ETA: 52s - loss: 1.6258 - regression_loss: 1.3534 - classification_loss: 0.2724 291/500 [================>.............] - ETA: 52s - loss: 1.6258 - regression_loss: 1.3534 - classification_loss: 0.2724 292/500 [================>.............] - ETA: 52s - loss: 1.6254 - regression_loss: 1.3530 - classification_loss: 0.2723 293/500 [================>.............] - ETA: 51s - loss: 1.6259 - regression_loss: 1.3536 - classification_loss: 0.2724 294/500 [================>.............] - ETA: 51s - loss: 1.6230 - regression_loss: 1.3512 - classification_loss: 0.2718 295/500 [================>.............] - ETA: 51s - loss: 1.6218 - regression_loss: 1.3503 - classification_loss: 0.2715 296/500 [================>.............] - ETA: 51s - loss: 1.6219 - regression_loss: 1.3505 - classification_loss: 0.2713 297/500 [================>.............] - ETA: 50s - loss: 1.6207 - regression_loss: 1.3494 - classification_loss: 0.2713 298/500 [================>.............] - ETA: 50s - loss: 1.6205 - regression_loss: 1.3494 - classification_loss: 0.2711 299/500 [================>.............] - ETA: 50s - loss: 1.6207 - regression_loss: 1.3496 - classification_loss: 0.2711 300/500 [=================>............] - ETA: 50s - loss: 1.6207 - regression_loss: 1.3493 - classification_loss: 0.2713 301/500 [=================>............] - ETA: 49s - loss: 1.6212 - regression_loss: 1.3488 - classification_loss: 0.2724 302/500 [=================>............] 
- ETA: 49s - loss: 1.6232 - regression_loss: 1.3504 - classification_loss: 0.2728 303/500 [=================>............] - ETA: 49s - loss: 1.6235 - regression_loss: 1.3507 - classification_loss: 0.2729 304/500 [=================>............] - ETA: 49s - loss: 1.6258 - regression_loss: 1.3521 - classification_loss: 0.2738 305/500 [=================>............] - ETA: 48s - loss: 1.6238 - regression_loss: 1.3505 - classification_loss: 0.2733 306/500 [=================>............] - ETA: 48s - loss: 1.6206 - regression_loss: 1.3479 - classification_loss: 0.2727 307/500 [=================>............] - ETA: 48s - loss: 1.6228 - regression_loss: 1.3498 - classification_loss: 0.2730 308/500 [=================>............] - ETA: 48s - loss: 1.6241 - regression_loss: 1.3509 - classification_loss: 0.2731 309/500 [=================>............] - ETA: 47s - loss: 1.6231 - regression_loss: 1.3501 - classification_loss: 0.2730 310/500 [=================>............] - ETA: 47s - loss: 1.6253 - regression_loss: 1.3520 - classification_loss: 0.2733 311/500 [=================>............] - ETA: 47s - loss: 1.6260 - regression_loss: 1.3525 - classification_loss: 0.2735 312/500 [=================>............] - ETA: 47s - loss: 1.6248 - regression_loss: 1.3515 - classification_loss: 0.2733 313/500 [=================>............] - ETA: 46s - loss: 1.6216 - regression_loss: 1.3489 - classification_loss: 0.2727 314/500 [=================>............] - ETA: 46s - loss: 1.6218 - regression_loss: 1.3493 - classification_loss: 0.2725 315/500 [=================>............] - ETA: 46s - loss: 1.6223 - regression_loss: 1.3498 - classification_loss: 0.2725 316/500 [=================>............] - ETA: 46s - loss: 1.6227 - regression_loss: 1.3500 - classification_loss: 0.2727 317/500 [==================>...........] - ETA: 45s - loss: 1.6222 - regression_loss: 1.3498 - classification_loss: 0.2724 318/500 [==================>...........] 
- ETA: 45s - loss: 1.6208 - regression_loss: 1.3486 - classification_loss: 0.2723 319/500 [==================>...........] - ETA: 45s - loss: 1.6211 - regression_loss: 1.3489 - classification_loss: 0.2722 320/500 [==================>...........] - ETA: 45s - loss: 1.6197 - regression_loss: 1.3478 - classification_loss: 0.2719 321/500 [==================>...........] - ETA: 44s - loss: 1.6210 - regression_loss: 1.3490 - classification_loss: 0.2720 322/500 [==================>...........] - ETA: 44s - loss: 1.6220 - regression_loss: 1.3497 - classification_loss: 0.2723 323/500 [==================>...........] - ETA: 44s - loss: 1.6213 - regression_loss: 1.3493 - classification_loss: 0.2720 324/500 [==================>...........] - ETA: 44s - loss: 1.6213 - regression_loss: 1.3495 - classification_loss: 0.2718 325/500 [==================>...........] - ETA: 43s - loss: 1.6207 - regression_loss: 1.3484 - classification_loss: 0.2723 326/500 [==================>...........] - ETA: 43s - loss: 1.6213 - regression_loss: 1.3490 - classification_loss: 0.2723 327/500 [==================>...........] - ETA: 43s - loss: 1.6216 - regression_loss: 1.3496 - classification_loss: 0.2720 328/500 [==================>...........] - ETA: 43s - loss: 1.6207 - regression_loss: 1.3492 - classification_loss: 0.2715 329/500 [==================>...........] - ETA: 42s - loss: 1.6212 - regression_loss: 1.3496 - classification_loss: 0.2716 330/500 [==================>...........] - ETA: 42s - loss: 1.6205 - regression_loss: 1.3491 - classification_loss: 0.2714 331/500 [==================>...........] - ETA: 42s - loss: 1.6194 - regression_loss: 1.3482 - classification_loss: 0.2712 332/500 [==================>...........] - ETA: 42s - loss: 1.6202 - regression_loss: 1.3490 - classification_loss: 0.2712 333/500 [==================>...........] - ETA: 41s - loss: 1.6185 - regression_loss: 1.3476 - classification_loss: 0.2708 334/500 [===================>..........] 
- ETA: 41s - loss: 1.6184 - regression_loss: 1.3477 - classification_loss: 0.2707 335/500 [===================>..........] - ETA: 41s - loss: 1.6198 - regression_loss: 1.3486 - classification_loss: 0.2712 336/500 [===================>..........] - ETA: 41s - loss: 1.6211 - regression_loss: 1.3497 - classification_loss: 0.2714 337/500 [===================>..........] - ETA: 40s - loss: 1.6214 - regression_loss: 1.3500 - classification_loss: 0.2713 338/500 [===================>..........] - ETA: 40s - loss: 1.6199 - regression_loss: 1.3489 - classification_loss: 0.2710 339/500 [===================>..........] - ETA: 40s - loss: 1.6197 - regression_loss: 1.3490 - classification_loss: 0.2707 340/500 [===================>..........] - ETA: 40s - loss: 1.6199 - regression_loss: 1.3491 - classification_loss: 0.2708 341/500 [===================>..........] - ETA: 39s - loss: 1.6200 - regression_loss: 1.3491 - classification_loss: 0.2709 342/500 [===================>..........] - ETA: 39s - loss: 1.6206 - regression_loss: 1.3496 - classification_loss: 0.2710 343/500 [===================>..........] - ETA: 39s - loss: 1.6203 - regression_loss: 1.3496 - classification_loss: 0.2708 344/500 [===================>..........] - ETA: 39s - loss: 1.6205 - regression_loss: 1.3497 - classification_loss: 0.2707 345/500 [===================>..........] - ETA: 38s - loss: 1.6213 - regression_loss: 1.3505 - classification_loss: 0.2709 346/500 [===================>..........] - ETA: 38s - loss: 1.6215 - regression_loss: 1.3507 - classification_loss: 0.2708 347/500 [===================>..........] - ETA: 38s - loss: 1.6197 - regression_loss: 1.3492 - classification_loss: 0.2705 348/500 [===================>..........] - ETA: 38s - loss: 1.6185 - regression_loss: 1.3480 - classification_loss: 0.2705 349/500 [===================>..........] - ETA: 37s - loss: 1.6174 - regression_loss: 1.3467 - classification_loss: 0.2707 350/500 [====================>.........] 
- ETA: 37s - loss: 1.6173 - regression_loss: 1.3466 - classification_loss: 0.2708 351/500 [====================>.........] - ETA: 37s - loss: 1.6171 - regression_loss: 1.3464 - classification_loss: 0.2707 352/500 [====================>.........] - ETA: 37s - loss: 1.6178 - regression_loss: 1.3471 - classification_loss: 0.2707 353/500 [====================>.........] - ETA: 36s - loss: 1.6171 - regression_loss: 1.3465 - classification_loss: 0.2707 354/500 [====================>.........] - ETA: 36s - loss: 1.6178 - regression_loss: 1.3470 - classification_loss: 0.2708 355/500 [====================>.........] - ETA: 36s - loss: 1.6172 - regression_loss: 1.3466 - classification_loss: 0.2706 356/500 [====================>.........] - ETA: 36s - loss: 1.6176 - regression_loss: 1.3469 - classification_loss: 0.2707 357/500 [====================>.........] - ETA: 35s - loss: 1.6180 - regression_loss: 1.3472 - classification_loss: 0.2708 358/500 [====================>.........] - ETA: 35s - loss: 1.6182 - regression_loss: 1.3474 - classification_loss: 0.2707 359/500 [====================>.........] - ETA: 35s - loss: 1.6174 - regression_loss: 1.3469 - classification_loss: 0.2706 360/500 [====================>.........] - ETA: 35s - loss: 1.6181 - regression_loss: 1.3472 - classification_loss: 0.2709 361/500 [====================>.........] - ETA: 34s - loss: 1.6195 - regression_loss: 1.3483 - classification_loss: 0.2712 362/500 [====================>.........] - ETA: 34s - loss: 1.6196 - regression_loss: 1.3483 - classification_loss: 0.2713 363/500 [====================>.........] - ETA: 34s - loss: 1.6175 - regression_loss: 1.3466 - classification_loss: 0.2709 364/500 [====================>.........] - ETA: 34s - loss: 1.6173 - regression_loss: 1.3465 - classification_loss: 0.2708 365/500 [====================>.........] - ETA: 33s - loss: 1.6148 - regression_loss: 1.3446 - classification_loss: 0.2703 366/500 [====================>.........] 
- ETA: 33s - loss: 1.6141 - regression_loss: 1.3437 - classification_loss: 0.2703
[per-batch progress for epoch 82, batches 367-499, condensed: running loss drifted between ~1.605 and ~1.627 (regression ~1.34, classification ~0.27)]
500/500 [==============================] - 125s 250ms/step - loss: 1.6249 - regression_loss: 1.3540 - classification_loss: 0.2709
1172 instances of class plum with average precision: 0.6458
mAP: 0.6458
Epoch 00082: saving model to ./training/snapshots/resnet50_pascal_82.h5
Epoch 83/150
[per-batch progress for epoch 83, batches 1-200, condensed: running loss settled from early fluctuation (~1.48-1.80) to ~1.62 (regression ~1.35, classification ~0.27)]
201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.6187 - regression_loss: 1.3510 - classification_loss: 0.2677 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6207 - regression_loss: 1.3525 - classification_loss: 0.2683 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6172 - regression_loss: 1.3494 - classification_loss: 0.2678 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6160 - regression_loss: 1.3485 - classification_loss: 0.2675 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6167 - regression_loss: 1.3492 - classification_loss: 0.2675 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6147 - regression_loss: 1.3474 - classification_loss: 0.2673 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6159 - regression_loss: 1.3488 - classification_loss: 0.2672 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6132 - regression_loss: 1.3468 - classification_loss: 0.2664 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6092 - regression_loss: 1.3435 - classification_loss: 0.2657 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6089 - regression_loss: 1.3430 - classification_loss: 0.2659 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6093 - regression_loss: 1.3435 - classification_loss: 0.2658 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6059 - regression_loss: 1.3409 - classification_loss: 0.2650 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6058 - regression_loss: 1.3409 - classification_loss: 0.2649 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6035 - regression_loss: 1.3392 - classification_loss: 0.2643 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6019 - regression_loss: 1.3380 - classification_loss: 0.2639 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5974 - regression_loss: 1.3344 - classification_loss: 0.2631 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.5967 - regression_loss: 1.3337 - classification_loss: 0.2630 218/500 [============>.................] - ETA: 1:10 - loss: 1.5992 - regression_loss: 1.3355 - classification_loss: 0.2637 219/500 [============>.................] - ETA: 1:10 - loss: 1.5983 - regression_loss: 1.3348 - classification_loss: 0.2635 220/500 [============>.................] - ETA: 1:10 - loss: 1.5956 - regression_loss: 1.3328 - classification_loss: 0.2628 221/500 [============>.................] - ETA: 1:10 - loss: 1.5952 - regression_loss: 1.3326 - classification_loss: 0.2625 222/500 [============>.................] - ETA: 1:09 - loss: 1.5973 - regression_loss: 1.3339 - classification_loss: 0.2633 223/500 [============>.................] - ETA: 1:09 - loss: 1.5999 - regression_loss: 1.3355 - classification_loss: 0.2644 224/500 [============>.................] - ETA: 1:09 - loss: 1.6014 - regression_loss: 1.3369 - classification_loss: 0.2645 225/500 [============>.................] - ETA: 1:09 - loss: 1.6005 - regression_loss: 1.3363 - classification_loss: 0.2642 226/500 [============>.................] - ETA: 1:08 - loss: 1.6005 - regression_loss: 1.3360 - classification_loss: 0.2645 227/500 [============>.................] - ETA: 1:08 - loss: 1.5966 - regression_loss: 1.3328 - classification_loss: 0.2637 228/500 [============>.................] - ETA: 1:08 - loss: 1.5949 - regression_loss: 1.3320 - classification_loss: 0.2629 229/500 [============>.................] - ETA: 1:08 - loss: 1.5932 - regression_loss: 1.3306 - classification_loss: 0.2626 230/500 [============>.................] - ETA: 1:07 - loss: 1.5931 - regression_loss: 1.3306 - classification_loss: 0.2625 231/500 [============>.................] - ETA: 1:07 - loss: 1.5925 - regression_loss: 1.3302 - classification_loss: 0.2624 232/500 [============>.................] - ETA: 1:07 - loss: 1.5936 - regression_loss: 1.3312 - classification_loss: 0.2625 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.5950 - regression_loss: 1.3307 - classification_loss: 0.2644 234/500 [=============>................] - ETA: 1:06 - loss: 1.5962 - regression_loss: 1.3316 - classification_loss: 0.2646 235/500 [=============>................] - ETA: 1:06 - loss: 1.5977 - regression_loss: 1.3328 - classification_loss: 0.2650 236/500 [=============>................] - ETA: 1:06 - loss: 1.5978 - regression_loss: 1.3330 - classification_loss: 0.2648 237/500 [=============>................] - ETA: 1:06 - loss: 1.5973 - regression_loss: 1.3325 - classification_loss: 0.2648 238/500 [=============>................] - ETA: 1:05 - loss: 1.5981 - regression_loss: 1.3323 - classification_loss: 0.2658 239/500 [=============>................] - ETA: 1:05 - loss: 1.5989 - regression_loss: 1.3328 - classification_loss: 0.2661 240/500 [=============>................] - ETA: 1:05 - loss: 1.6016 - regression_loss: 1.3351 - classification_loss: 0.2666 241/500 [=============>................] - ETA: 1:05 - loss: 1.6004 - regression_loss: 1.3341 - classification_loss: 0.2663 242/500 [=============>................] - ETA: 1:04 - loss: 1.6023 - regression_loss: 1.3356 - classification_loss: 0.2667 243/500 [=============>................] - ETA: 1:04 - loss: 1.6042 - regression_loss: 1.3373 - classification_loss: 0.2668 244/500 [=============>................] - ETA: 1:04 - loss: 1.6041 - regression_loss: 1.3375 - classification_loss: 0.2666 245/500 [=============>................] - ETA: 1:04 - loss: 1.6020 - regression_loss: 1.3357 - classification_loss: 0.2663 246/500 [=============>................] - ETA: 1:03 - loss: 1.6007 - regression_loss: 1.3345 - classification_loss: 0.2662 247/500 [=============>................] - ETA: 1:03 - loss: 1.6011 - regression_loss: 1.3347 - classification_loss: 0.2664 248/500 [=============>................] - ETA: 1:03 - loss: 1.5996 - regression_loss: 1.3335 - classification_loss: 0.2660 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.6021 - regression_loss: 1.3352 - classification_loss: 0.2668 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6028 - regression_loss: 1.3360 - classification_loss: 0.2668 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6038 - regression_loss: 1.3368 - classification_loss: 0.2670 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6059 - regression_loss: 1.3386 - classification_loss: 0.2673 253/500 [==============>...............] - ETA: 1:02 - loss: 1.6051 - regression_loss: 1.3378 - classification_loss: 0.2672 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6062 - regression_loss: 1.3388 - classification_loss: 0.2674 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6069 - regression_loss: 1.3394 - classification_loss: 0.2675 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6080 - regression_loss: 1.3402 - classification_loss: 0.2678 257/500 [==============>...............] - ETA: 1:01 - loss: 1.6084 - regression_loss: 1.3406 - classification_loss: 0.2678 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6041 - regression_loss: 1.3368 - classification_loss: 0.2673 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6049 - regression_loss: 1.3376 - classification_loss: 0.2674 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6014 - regression_loss: 1.3345 - classification_loss: 0.2668 261/500 [==============>...............] - ETA: 1:00 - loss: 1.6020 - regression_loss: 1.3350 - classification_loss: 0.2670 262/500 [==============>...............] - ETA: 59s - loss: 1.6023 - regression_loss: 1.3349 - classification_loss: 0.2675  263/500 [==============>...............] - ETA: 59s - loss: 1.6021 - regression_loss: 1.3338 - classification_loss: 0.2684 264/500 [==============>...............] - ETA: 59s - loss: 1.6026 - regression_loss: 1.3345 - classification_loss: 0.2682 265/500 [==============>...............] 
- ETA: 59s - loss: 1.6024 - regression_loss: 1.3344 - classification_loss: 0.2680 266/500 [==============>...............] - ETA: 58s - loss: 1.6026 - regression_loss: 1.3345 - classification_loss: 0.2681 267/500 [===============>..............] - ETA: 58s - loss: 1.6080 - regression_loss: 1.3396 - classification_loss: 0.2685 268/500 [===============>..............] - ETA: 58s - loss: 1.6088 - regression_loss: 1.3401 - classification_loss: 0.2686 269/500 [===============>..............] - ETA: 58s - loss: 1.6094 - regression_loss: 1.3405 - classification_loss: 0.2689 270/500 [===============>..............] - ETA: 57s - loss: 1.6085 - regression_loss: 1.3398 - classification_loss: 0.2686 271/500 [===============>..............] - ETA: 57s - loss: 1.6096 - regression_loss: 1.3409 - classification_loss: 0.2687 272/500 [===============>..............] - ETA: 57s - loss: 1.6105 - regression_loss: 1.3417 - classification_loss: 0.2688 273/500 [===============>..............] - ETA: 56s - loss: 1.6110 - regression_loss: 1.3423 - classification_loss: 0.2687 274/500 [===============>..............] - ETA: 56s - loss: 1.6108 - regression_loss: 1.3422 - classification_loss: 0.2686 275/500 [===============>..............] - ETA: 56s - loss: 1.6097 - regression_loss: 1.3412 - classification_loss: 0.2684 276/500 [===============>..............] - ETA: 56s - loss: 1.6104 - regression_loss: 1.3417 - classification_loss: 0.2686 277/500 [===============>..............] - ETA: 55s - loss: 1.6103 - regression_loss: 1.3417 - classification_loss: 0.2687 278/500 [===============>..............] - ETA: 55s - loss: 1.6122 - regression_loss: 1.3426 - classification_loss: 0.2696 279/500 [===============>..............] - ETA: 55s - loss: 1.6105 - regression_loss: 1.3411 - classification_loss: 0.2694 280/500 [===============>..............] - ETA: 55s - loss: 1.6117 - regression_loss: 1.3420 - classification_loss: 0.2697 281/500 [===============>..............] 
- ETA: 54s - loss: 1.6131 - regression_loss: 1.3433 - classification_loss: 0.2698 282/500 [===============>..............] - ETA: 54s - loss: 1.6135 - regression_loss: 1.3437 - classification_loss: 0.2698 283/500 [===============>..............] - ETA: 54s - loss: 1.6147 - regression_loss: 1.3449 - classification_loss: 0.2698 284/500 [================>.............] - ETA: 54s - loss: 1.6152 - regression_loss: 1.3450 - classification_loss: 0.2702 285/500 [================>.............] - ETA: 53s - loss: 1.6157 - regression_loss: 1.3456 - classification_loss: 0.2701 286/500 [================>.............] - ETA: 53s - loss: 1.6160 - regression_loss: 1.3457 - classification_loss: 0.2703 287/500 [================>.............] - ETA: 53s - loss: 1.6171 - regression_loss: 1.3462 - classification_loss: 0.2709 288/500 [================>.............] - ETA: 53s - loss: 1.6175 - regression_loss: 1.3464 - classification_loss: 0.2710 289/500 [================>.............] - ETA: 52s - loss: 1.6177 - regression_loss: 1.3466 - classification_loss: 0.2711 290/500 [================>.............] - ETA: 52s - loss: 1.6158 - regression_loss: 1.3451 - classification_loss: 0.2706 291/500 [================>.............] - ETA: 52s - loss: 1.6168 - regression_loss: 1.3460 - classification_loss: 0.2708 292/500 [================>.............] - ETA: 52s - loss: 1.6144 - regression_loss: 1.3441 - classification_loss: 0.2703 293/500 [================>.............] - ETA: 51s - loss: 1.6139 - regression_loss: 1.3438 - classification_loss: 0.2701 294/500 [================>.............] - ETA: 51s - loss: 1.6115 - regression_loss: 1.3420 - classification_loss: 0.2696 295/500 [================>.............] - ETA: 51s - loss: 1.6122 - regression_loss: 1.3426 - classification_loss: 0.2696 296/500 [================>.............] - ETA: 51s - loss: 1.6109 - regression_loss: 1.3411 - classification_loss: 0.2697 297/500 [================>.............] 
- ETA: 50s - loss: 1.6108 - regression_loss: 1.3413 - classification_loss: 0.2696 298/500 [================>.............] - ETA: 50s - loss: 1.6130 - regression_loss: 1.3428 - classification_loss: 0.2702 299/500 [================>.............] - ETA: 50s - loss: 1.6106 - regression_loss: 1.3406 - classification_loss: 0.2700 300/500 [=================>............] - ETA: 50s - loss: 1.6078 - regression_loss: 1.3384 - classification_loss: 0.2694 301/500 [=================>............] - ETA: 49s - loss: 1.6061 - regression_loss: 1.3370 - classification_loss: 0.2691 302/500 [=================>............] - ETA: 49s - loss: 1.6046 - regression_loss: 1.3352 - classification_loss: 0.2693 303/500 [=================>............] - ETA: 49s - loss: 1.6066 - regression_loss: 1.3367 - classification_loss: 0.2698 304/500 [=================>............] - ETA: 49s - loss: 1.6072 - regression_loss: 1.3373 - classification_loss: 0.2699 305/500 [=================>............] - ETA: 48s - loss: 1.6081 - regression_loss: 1.3385 - classification_loss: 0.2697 306/500 [=================>............] - ETA: 48s - loss: 1.6089 - regression_loss: 1.3393 - classification_loss: 0.2696 307/500 [=================>............] - ETA: 48s - loss: 1.6101 - regression_loss: 1.3403 - classification_loss: 0.2698 308/500 [=================>............] - ETA: 48s - loss: 1.6106 - regression_loss: 1.3409 - classification_loss: 0.2697 309/500 [=================>............] - ETA: 47s - loss: 1.6121 - regression_loss: 1.3419 - classification_loss: 0.2702 310/500 [=================>............] - ETA: 47s - loss: 1.6106 - regression_loss: 1.3407 - classification_loss: 0.2700 311/500 [=================>............] - ETA: 47s - loss: 1.6128 - regression_loss: 1.3428 - classification_loss: 0.2700 312/500 [=================>............] - ETA: 47s - loss: 1.6132 - regression_loss: 1.3429 - classification_loss: 0.2702 313/500 [=================>............] 
- ETA: 46s - loss: 1.6101 - regression_loss: 1.3402 - classification_loss: 0.2699 314/500 [=================>............] - ETA: 46s - loss: 1.6097 - regression_loss: 1.3402 - classification_loss: 0.2696 315/500 [=================>............] - ETA: 46s - loss: 1.6117 - regression_loss: 1.3417 - classification_loss: 0.2700 316/500 [=================>............] - ETA: 46s - loss: 1.6124 - regression_loss: 1.3424 - classification_loss: 0.2700 317/500 [==================>...........] - ETA: 45s - loss: 1.6137 - regression_loss: 1.3435 - classification_loss: 0.2701 318/500 [==================>...........] - ETA: 45s - loss: 1.6146 - regression_loss: 1.3441 - classification_loss: 0.2705 319/500 [==================>...........] - ETA: 45s - loss: 1.6140 - regression_loss: 1.3436 - classification_loss: 0.2704 320/500 [==================>...........] - ETA: 45s - loss: 1.6150 - regression_loss: 1.3444 - classification_loss: 0.2706 321/500 [==================>...........] - ETA: 44s - loss: 1.6143 - regression_loss: 1.3440 - classification_loss: 0.2703 322/500 [==================>...........] - ETA: 44s - loss: 1.6155 - regression_loss: 1.3452 - classification_loss: 0.2703 323/500 [==================>...........] - ETA: 44s - loss: 1.6157 - regression_loss: 1.3454 - classification_loss: 0.2703 324/500 [==================>...........] - ETA: 44s - loss: 1.6151 - regression_loss: 1.3451 - classification_loss: 0.2701 325/500 [==================>...........] - ETA: 43s - loss: 1.6142 - regression_loss: 1.3442 - classification_loss: 0.2700 326/500 [==================>...........] - ETA: 43s - loss: 1.6146 - regression_loss: 1.3440 - classification_loss: 0.2706 327/500 [==================>...........] - ETA: 43s - loss: 1.6154 - regression_loss: 1.3447 - classification_loss: 0.2707 328/500 [==================>...........] - ETA: 43s - loss: 1.6152 - regression_loss: 1.3432 - classification_loss: 0.2719 329/500 [==================>...........] 
- ETA: 42s - loss: 1.6167 - regression_loss: 1.3446 - classification_loss: 0.2721 330/500 [==================>...........] - ETA: 42s - loss: 1.6159 - regression_loss: 1.3442 - classification_loss: 0.2717 331/500 [==================>...........] - ETA: 42s - loss: 1.6154 - regression_loss: 1.3437 - classification_loss: 0.2716 332/500 [==================>...........] - ETA: 42s - loss: 1.6177 - regression_loss: 1.3460 - classification_loss: 0.2718 333/500 [==================>...........] - ETA: 41s - loss: 1.6191 - regression_loss: 1.3471 - classification_loss: 0.2720 334/500 [===================>..........] - ETA: 41s - loss: 1.6165 - regression_loss: 1.3448 - classification_loss: 0.2718 335/500 [===================>..........] - ETA: 41s - loss: 1.6154 - regression_loss: 1.3440 - classification_loss: 0.2714 336/500 [===================>..........] - ETA: 41s - loss: 1.6175 - regression_loss: 1.3451 - classification_loss: 0.2724 337/500 [===================>..........] - ETA: 40s - loss: 1.6188 - regression_loss: 1.3462 - classification_loss: 0.2727 338/500 [===================>..........] - ETA: 40s - loss: 1.6183 - regression_loss: 1.3456 - classification_loss: 0.2727 339/500 [===================>..........] - ETA: 40s - loss: 1.6163 - regression_loss: 1.3442 - classification_loss: 0.2721 340/500 [===================>..........] - ETA: 40s - loss: 1.6157 - regression_loss: 1.3437 - classification_loss: 0.2720 341/500 [===================>..........] - ETA: 39s - loss: 1.6158 - regression_loss: 1.3440 - classification_loss: 0.2718 342/500 [===================>..........] - ETA: 39s - loss: 1.6145 - regression_loss: 1.3431 - classification_loss: 0.2714 343/500 [===================>..........] - ETA: 39s - loss: 1.6136 - regression_loss: 1.3422 - classification_loss: 0.2714 344/500 [===================>..........] - ETA: 39s - loss: 1.6138 - regression_loss: 1.3425 - classification_loss: 0.2713 345/500 [===================>..........] 
- ETA: 38s - loss: 1.6145 - regression_loss: 1.3435 - classification_loss: 0.2711 346/500 [===================>..........] - ETA: 38s - loss: 1.6143 - regression_loss: 1.3431 - classification_loss: 0.2712 347/500 [===================>..........] - ETA: 38s - loss: 1.6153 - regression_loss: 1.3439 - classification_loss: 0.2714 348/500 [===================>..........] - ETA: 38s - loss: 1.6145 - regression_loss: 1.3434 - classification_loss: 0.2711 349/500 [===================>..........] - ETA: 37s - loss: 1.6142 - regression_loss: 1.3430 - classification_loss: 0.2711 350/500 [====================>.........] - ETA: 37s - loss: 1.6133 - regression_loss: 1.3421 - classification_loss: 0.2712 351/500 [====================>.........] - ETA: 37s - loss: 1.6127 - regression_loss: 1.3417 - classification_loss: 0.2710 352/500 [====================>.........] - ETA: 37s - loss: 1.6125 - regression_loss: 1.3417 - classification_loss: 0.2708 353/500 [====================>.........] - ETA: 36s - loss: 1.6118 - regression_loss: 1.3413 - classification_loss: 0.2705 354/500 [====================>.........] - ETA: 36s - loss: 1.6136 - regression_loss: 1.3427 - classification_loss: 0.2709 355/500 [====================>.........] - ETA: 36s - loss: 1.6125 - regression_loss: 1.3417 - classification_loss: 0.2708 356/500 [====================>.........] - ETA: 36s - loss: 1.6130 - regression_loss: 1.3423 - classification_loss: 0.2707 357/500 [====================>.........] - ETA: 35s - loss: 1.6125 - regression_loss: 1.3420 - classification_loss: 0.2705 358/500 [====================>.........] - ETA: 35s - loss: 1.6119 - regression_loss: 1.3414 - classification_loss: 0.2705 359/500 [====================>.........] - ETA: 35s - loss: 1.6129 - regression_loss: 1.3422 - classification_loss: 0.2707 360/500 [====================>.........] - ETA: 35s - loss: 1.6127 - regression_loss: 1.3421 - classification_loss: 0.2705 361/500 [====================>.........] 
- ETA: 34s - loss: 1.6090 - regression_loss: 1.3390 - classification_loss: 0.2700 362/500 [====================>.........] - ETA: 34s - loss: 1.6095 - regression_loss: 1.3395 - classification_loss: 0.2700 363/500 [====================>.........] - ETA: 34s - loss: 1.6080 - regression_loss: 1.3384 - classification_loss: 0.2695 364/500 [====================>.........] - ETA: 33s - loss: 1.6060 - regression_loss: 1.3367 - classification_loss: 0.2692 365/500 [====================>.........] - ETA: 33s - loss: 1.6065 - regression_loss: 1.3371 - classification_loss: 0.2694 366/500 [====================>.........] - ETA: 33s - loss: 1.6065 - regression_loss: 1.3371 - classification_loss: 0.2694 367/500 [=====================>........] - ETA: 33s - loss: 1.6068 - regression_loss: 1.3372 - classification_loss: 0.2696 368/500 [=====================>........] - ETA: 33s - loss: 1.6060 - regression_loss: 1.3366 - classification_loss: 0.2694 369/500 [=====================>........] - ETA: 32s - loss: 1.6036 - regression_loss: 1.3347 - classification_loss: 0.2688 370/500 [=====================>........] - ETA: 32s - loss: 1.6028 - regression_loss: 1.3343 - classification_loss: 0.2685 371/500 [=====================>........] - ETA: 32s - loss: 1.6030 - regression_loss: 1.3345 - classification_loss: 0.2685 372/500 [=====================>........] - ETA: 32s - loss: 1.6029 - regression_loss: 1.3346 - classification_loss: 0.2683 373/500 [=====================>........] - ETA: 31s - loss: 1.6027 - regression_loss: 1.3345 - classification_loss: 0.2682 374/500 [=====================>........] - ETA: 31s - loss: 1.6005 - regression_loss: 1.3326 - classification_loss: 0.2679 375/500 [=====================>........] - ETA: 31s - loss: 1.6014 - regression_loss: 1.3334 - classification_loss: 0.2680 376/500 [=====================>........] - ETA: 31s - loss: 1.6016 - regression_loss: 1.3337 - classification_loss: 0.2679 377/500 [=====================>........] 
- ETA: 30s - loss: 1.6021 - regression_loss: 1.3341 - classification_loss: 0.2680 378/500 [=====================>........] - ETA: 30s - loss: 1.6014 - regression_loss: 1.3336 - classification_loss: 0.2678 379/500 [=====================>........] - ETA: 30s - loss: 1.6026 - regression_loss: 1.3345 - classification_loss: 0.2681 380/500 [=====================>........] - ETA: 30s - loss: 1.6038 - regression_loss: 1.3356 - classification_loss: 0.2683 381/500 [=====================>........] - ETA: 29s - loss: 1.6028 - regression_loss: 1.3348 - classification_loss: 0.2680 382/500 [=====================>........] - ETA: 29s - loss: 1.6050 - regression_loss: 1.3364 - classification_loss: 0.2686 383/500 [=====================>........] - ETA: 29s - loss: 1.6046 - regression_loss: 1.3361 - classification_loss: 0.2685 384/500 [======================>.......] - ETA: 29s - loss: 1.6050 - regression_loss: 1.3365 - classification_loss: 0.2686 385/500 [======================>.......] - ETA: 28s - loss: 1.6029 - regression_loss: 1.3348 - classification_loss: 0.2681 386/500 [======================>.......] - ETA: 28s - loss: 1.6047 - regression_loss: 1.3363 - classification_loss: 0.2684 387/500 [======================>.......] - ETA: 28s - loss: 1.6050 - regression_loss: 1.3366 - classification_loss: 0.2684 388/500 [======================>.......] - ETA: 28s - loss: 1.6055 - regression_loss: 1.3372 - classification_loss: 0.2684 389/500 [======================>.......] - ETA: 27s - loss: 1.6043 - regression_loss: 1.3361 - classification_loss: 0.2682 390/500 [======================>.......] - ETA: 27s - loss: 1.6045 - regression_loss: 1.3363 - classification_loss: 0.2682 391/500 [======================>.......] - ETA: 27s - loss: 1.6077 - regression_loss: 1.3393 - classification_loss: 0.2684 392/500 [======================>.......] - ETA: 27s - loss: 1.6060 - regression_loss: 1.3380 - classification_loss: 0.2680 393/500 [======================>.......] 
- ETA: 26s - loss: 1.6044 - regression_loss: 1.3367 - classification_loss: 0.2677 394/500 [======================>.......] - ETA: 26s - loss: 1.6045 - regression_loss: 1.3369 - classification_loss: 0.2676 395/500 [======================>.......] - ETA: 26s - loss: 1.6048 - regression_loss: 1.3375 - classification_loss: 0.2673 396/500 [======================>.......] - ETA: 26s - loss: 1.6050 - regression_loss: 1.3379 - classification_loss: 0.2671 397/500 [======================>.......] - ETA: 25s - loss: 1.6054 - regression_loss: 1.3384 - classification_loss: 0.2670 398/500 [======================>.......] - ETA: 25s - loss: 1.6062 - regression_loss: 1.3390 - classification_loss: 0.2672 399/500 [======================>.......] - ETA: 25s - loss: 1.6069 - regression_loss: 1.3396 - classification_loss: 0.2672 400/500 [=======================>......] - ETA: 25s - loss: 1.6067 - regression_loss: 1.3394 - classification_loss: 0.2673 401/500 [=======================>......] - ETA: 24s - loss: 1.6065 - regression_loss: 1.3391 - classification_loss: 0.2674 402/500 [=======================>......] - ETA: 24s - loss: 1.6067 - regression_loss: 1.3393 - classification_loss: 0.2673 403/500 [=======================>......] - ETA: 24s - loss: 1.6074 - regression_loss: 1.3400 - classification_loss: 0.2674 404/500 [=======================>......] - ETA: 24s - loss: 1.6082 - regression_loss: 1.3408 - classification_loss: 0.2674 405/500 [=======================>......] - ETA: 23s - loss: 1.6098 - regression_loss: 1.3421 - classification_loss: 0.2677 406/500 [=======================>......] - ETA: 23s - loss: 1.6085 - regression_loss: 1.3411 - classification_loss: 0.2674 407/500 [=======================>......] - ETA: 23s - loss: 1.6111 - regression_loss: 1.3429 - classification_loss: 0.2682 408/500 [=======================>......] - ETA: 23s - loss: 1.6118 - regression_loss: 1.3436 - classification_loss: 0.2682 409/500 [=======================>......] 
- ETA: 22s - loss: 1.6102 - regression_loss: 1.3422 - classification_loss: 0.2680
[... per-batch progress updates for batches 410-499 of epoch 83 elided ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.6139 - regression_loss: 1.3441 - classification_loss: 0.2698
1172 instances of class plum with average precision: 0.6221
mAP: 0.6221
Epoch 00083: saving model to ./training/snapshots/resnet50_pascal_83.h5
Epoch 84/150
[... per-batch progress updates for batches 1-242 of epoch 84 elided ...]
243/500 [=============>................] - ETA: 1:04 - loss: 1.6247 - regression_loss: 1.3486 - classification_loss: 0.2761
244/500 [=============>................]
- ETA: 1:08 - loss: 1.6174 - regression_loss: 1.3426 - classification_loss: 0.2748 229/500 [============>.................] - ETA: 1:07 - loss: 1.6187 - regression_loss: 1.3436 - classification_loss: 0.2751 230/500 [============>.................] - ETA: 1:07 - loss: 1.6198 - regression_loss: 1.3447 - classification_loss: 0.2751 231/500 [============>.................] - ETA: 1:07 - loss: 1.6207 - regression_loss: 1.3452 - classification_loss: 0.2755 232/500 [============>.................] - ETA: 1:07 - loss: 1.6202 - regression_loss: 1.3447 - classification_loss: 0.2754 233/500 [============>.................] - ETA: 1:06 - loss: 1.6210 - regression_loss: 1.3456 - classification_loss: 0.2755 234/500 [=============>................] - ETA: 1:06 - loss: 1.6227 - regression_loss: 1.3473 - classification_loss: 0.2754 235/500 [=============>................] - ETA: 1:06 - loss: 1.6214 - regression_loss: 1.3459 - classification_loss: 0.2754 236/500 [=============>................] - ETA: 1:06 - loss: 1.6207 - regression_loss: 1.3456 - classification_loss: 0.2751 237/500 [=============>................] - ETA: 1:05 - loss: 1.6204 - regression_loss: 1.3450 - classification_loss: 0.2754 238/500 [=============>................] - ETA: 1:05 - loss: 1.6222 - regression_loss: 1.3461 - classification_loss: 0.2761 239/500 [=============>................] - ETA: 1:05 - loss: 1.6207 - regression_loss: 1.3449 - classification_loss: 0.2758 240/500 [=============>................] - ETA: 1:05 - loss: 1.6219 - regression_loss: 1.3459 - classification_loss: 0.2760 241/500 [=============>................] - ETA: 1:04 - loss: 1.6220 - regression_loss: 1.3460 - classification_loss: 0.2760 242/500 [=============>................] - ETA: 1:04 - loss: 1.6250 - regression_loss: 1.3487 - classification_loss: 0.2762 243/500 [=============>................] - ETA: 1:04 - loss: 1.6247 - regression_loss: 1.3486 - classification_loss: 0.2761 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.6223 - regression_loss: 1.3465 - classification_loss: 0.2758 245/500 [=============>................] - ETA: 1:03 - loss: 1.6250 - regression_loss: 1.3487 - classification_loss: 0.2763 246/500 [=============>................] - ETA: 1:03 - loss: 1.6253 - regression_loss: 1.3490 - classification_loss: 0.2762 247/500 [=============>................] - ETA: 1:03 - loss: 1.6262 - regression_loss: 1.3500 - classification_loss: 0.2763 248/500 [=============>................] - ETA: 1:03 - loss: 1.6246 - regression_loss: 1.3486 - classification_loss: 0.2761 249/500 [=============>................] - ETA: 1:02 - loss: 1.6256 - regression_loss: 1.3488 - classification_loss: 0.2769 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6256 - regression_loss: 1.3488 - classification_loss: 0.2768 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6256 - regression_loss: 1.3487 - classification_loss: 0.2769 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6263 - regression_loss: 1.3496 - classification_loss: 0.2767 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6252 - regression_loss: 1.3488 - classification_loss: 0.2764 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6283 - regression_loss: 1.3520 - classification_loss: 0.2763 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6301 - regression_loss: 1.3532 - classification_loss: 0.2769 256/500 [==============>...............] - ETA: 1:01 - loss: 1.6273 - regression_loss: 1.3508 - classification_loss: 0.2766 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6247 - regression_loss: 1.3490 - classification_loss: 0.2757 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6219 - regression_loss: 1.3469 - classification_loss: 0.2750 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6235 - regression_loss: 1.3482 - classification_loss: 0.2753 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.6253 - regression_loss: 1.3499 - classification_loss: 0.2754 261/500 [==============>...............] - ETA: 59s - loss: 1.6253 - regression_loss: 1.3500 - classification_loss: 0.2753  262/500 [==============>...............] - ETA: 59s - loss: 1.6279 - regression_loss: 1.3524 - classification_loss: 0.2755 263/500 [==============>...............] - ETA: 59s - loss: 1.6288 - regression_loss: 1.3535 - classification_loss: 0.2753 264/500 [==============>...............] - ETA: 59s - loss: 1.6288 - regression_loss: 1.3534 - classification_loss: 0.2754 265/500 [==============>...............] - ETA: 58s - loss: 1.6309 - regression_loss: 1.3550 - classification_loss: 0.2760 266/500 [==============>...............] - ETA: 58s - loss: 1.6296 - regression_loss: 1.3540 - classification_loss: 0.2756 267/500 [===============>..............] - ETA: 58s - loss: 1.6291 - regression_loss: 1.3535 - classification_loss: 0.2756 268/500 [===============>..............] - ETA: 58s - loss: 1.6270 - regression_loss: 1.3519 - classification_loss: 0.2751 269/500 [===============>..............] - ETA: 57s - loss: 1.6263 - regression_loss: 1.3515 - classification_loss: 0.2749 270/500 [===============>..............] - ETA: 57s - loss: 1.6279 - regression_loss: 1.3527 - classification_loss: 0.2752 271/500 [===============>..............] - ETA: 57s - loss: 1.6278 - regression_loss: 1.3527 - classification_loss: 0.2752 272/500 [===============>..............] - ETA: 57s - loss: 1.6268 - regression_loss: 1.3513 - classification_loss: 0.2755 273/500 [===============>..............] - ETA: 56s - loss: 1.6272 - regression_loss: 1.3515 - classification_loss: 0.2756 274/500 [===============>..............] - ETA: 56s - loss: 1.6257 - regression_loss: 1.3499 - classification_loss: 0.2758 275/500 [===============>..............] - ETA: 56s - loss: 1.6254 - regression_loss: 1.3497 - classification_loss: 0.2757 276/500 [===============>..............] 
- ETA: 56s - loss: 1.6247 - regression_loss: 1.3493 - classification_loss: 0.2754 277/500 [===============>..............] - ETA: 55s - loss: 1.6241 - regression_loss: 1.3488 - classification_loss: 0.2752 278/500 [===============>..............] - ETA: 55s - loss: 1.6234 - regression_loss: 1.3484 - classification_loss: 0.2750 279/500 [===============>..............] - ETA: 55s - loss: 1.6249 - regression_loss: 1.3496 - classification_loss: 0.2753 280/500 [===============>..............] - ETA: 55s - loss: 1.6259 - regression_loss: 1.3506 - classification_loss: 0.2753 281/500 [===============>..............] - ETA: 54s - loss: 1.6269 - regression_loss: 1.3513 - classification_loss: 0.2755 282/500 [===============>..............] - ETA: 54s - loss: 1.6252 - regression_loss: 1.3498 - classification_loss: 0.2754 283/500 [===============>..............] - ETA: 54s - loss: 1.6250 - regression_loss: 1.3496 - classification_loss: 0.2754 284/500 [================>.............] - ETA: 54s - loss: 1.6255 - regression_loss: 1.3500 - classification_loss: 0.2755 285/500 [================>.............] - ETA: 53s - loss: 1.6250 - regression_loss: 1.3497 - classification_loss: 0.2753 286/500 [================>.............] - ETA: 53s - loss: 1.6212 - regression_loss: 1.3465 - classification_loss: 0.2747 287/500 [================>.............] - ETA: 53s - loss: 1.6207 - regression_loss: 1.3458 - classification_loss: 0.2749 288/500 [================>.............] - ETA: 53s - loss: 1.6217 - regression_loss: 1.3467 - classification_loss: 0.2750 289/500 [================>.............] - ETA: 52s - loss: 1.6218 - regression_loss: 1.3470 - classification_loss: 0.2748 290/500 [================>.............] - ETA: 52s - loss: 1.6221 - regression_loss: 1.3472 - classification_loss: 0.2749 291/500 [================>.............] - ETA: 52s - loss: 1.6215 - regression_loss: 1.3468 - classification_loss: 0.2747 292/500 [================>.............] 
- ETA: 52s - loss: 1.6219 - regression_loss: 1.3474 - classification_loss: 0.2746 293/500 [================>.............] - ETA: 51s - loss: 1.6249 - regression_loss: 1.3495 - classification_loss: 0.2753 294/500 [================>.............] - ETA: 51s - loss: 1.6245 - regression_loss: 1.3496 - classification_loss: 0.2749 295/500 [================>.............] - ETA: 51s - loss: 1.6243 - regression_loss: 1.3495 - classification_loss: 0.2747 296/500 [================>.............] - ETA: 51s - loss: 1.6267 - regression_loss: 1.3518 - classification_loss: 0.2748 297/500 [================>.............] - ETA: 50s - loss: 1.6277 - regression_loss: 1.3527 - classification_loss: 0.2750 298/500 [================>.............] - ETA: 50s - loss: 1.6244 - regression_loss: 1.3499 - classification_loss: 0.2744 299/500 [================>.............] - ETA: 50s - loss: 1.6231 - regression_loss: 1.3490 - classification_loss: 0.2741 300/500 [=================>............] - ETA: 50s - loss: 1.6240 - regression_loss: 1.3497 - classification_loss: 0.2743 301/500 [=================>............] - ETA: 49s - loss: 1.6229 - regression_loss: 1.3487 - classification_loss: 0.2742 302/500 [=================>............] - ETA: 49s - loss: 1.6230 - regression_loss: 1.3488 - classification_loss: 0.2741 303/500 [=================>............] - ETA: 49s - loss: 1.6229 - regression_loss: 1.3489 - classification_loss: 0.2740 304/500 [=================>............] - ETA: 49s - loss: 1.6222 - regression_loss: 1.3484 - classification_loss: 0.2738 305/500 [=================>............] - ETA: 48s - loss: 1.6198 - regression_loss: 1.3466 - classification_loss: 0.2732 306/500 [=================>............] - ETA: 48s - loss: 1.6176 - regression_loss: 1.3447 - classification_loss: 0.2729 307/500 [=================>............] - ETA: 48s - loss: 1.6160 - regression_loss: 1.3434 - classification_loss: 0.2726 308/500 [=================>............] 
- ETA: 48s - loss: 1.6166 - regression_loss: 1.3440 - classification_loss: 0.2726 309/500 [=================>............] - ETA: 47s - loss: 1.6182 - regression_loss: 1.3452 - classification_loss: 0.2730 310/500 [=================>............] - ETA: 47s - loss: 1.6186 - regression_loss: 1.3452 - classification_loss: 0.2734 311/500 [=================>............] - ETA: 47s - loss: 1.6192 - regression_loss: 1.3458 - classification_loss: 0.2734 312/500 [=================>............] - ETA: 47s - loss: 1.6187 - regression_loss: 1.3452 - classification_loss: 0.2735 313/500 [=================>............] - ETA: 46s - loss: 1.6166 - regression_loss: 1.3436 - classification_loss: 0.2730 314/500 [=================>............] - ETA: 46s - loss: 1.6155 - regression_loss: 1.3429 - classification_loss: 0.2726 315/500 [=================>............] - ETA: 46s - loss: 1.6164 - regression_loss: 1.3438 - classification_loss: 0.2726 316/500 [=================>............] - ETA: 46s - loss: 1.6141 - regression_loss: 1.3421 - classification_loss: 0.2720 317/500 [==================>...........] - ETA: 45s - loss: 1.6149 - regression_loss: 1.3425 - classification_loss: 0.2724 318/500 [==================>...........] - ETA: 45s - loss: 1.6143 - regression_loss: 1.3423 - classification_loss: 0.2720 319/500 [==================>...........] - ETA: 45s - loss: 1.6141 - regression_loss: 1.3423 - classification_loss: 0.2719 320/500 [==================>...........] - ETA: 45s - loss: 1.6107 - regression_loss: 1.3395 - classification_loss: 0.2713 321/500 [==================>...........] - ETA: 44s - loss: 1.6118 - regression_loss: 1.3404 - classification_loss: 0.2714 322/500 [==================>...........] - ETA: 44s - loss: 1.6123 - regression_loss: 1.3408 - classification_loss: 0.2715 323/500 [==================>...........] - ETA: 44s - loss: 1.6122 - regression_loss: 1.3407 - classification_loss: 0.2715 324/500 [==================>...........] 
- ETA: 44s - loss: 1.6130 - regression_loss: 1.3415 - classification_loss: 0.2715 325/500 [==================>...........] - ETA: 43s - loss: 1.6123 - regression_loss: 1.3409 - classification_loss: 0.2714 326/500 [==================>...........] - ETA: 43s - loss: 1.6131 - regression_loss: 1.3413 - classification_loss: 0.2718 327/500 [==================>...........] - ETA: 43s - loss: 1.6130 - regression_loss: 1.3413 - classification_loss: 0.2717 328/500 [==================>...........] - ETA: 43s - loss: 1.6118 - regression_loss: 1.3404 - classification_loss: 0.2714 329/500 [==================>...........] - ETA: 42s - loss: 1.6137 - regression_loss: 1.3418 - classification_loss: 0.2719 330/500 [==================>...........] - ETA: 42s - loss: 1.6133 - regression_loss: 1.3415 - classification_loss: 0.2718 331/500 [==================>...........] - ETA: 42s - loss: 1.6130 - regression_loss: 1.3415 - classification_loss: 0.2715 332/500 [==================>...........] - ETA: 42s - loss: 1.6126 - regression_loss: 1.3412 - classification_loss: 0.2714 333/500 [==================>...........] - ETA: 41s - loss: 1.6123 - regression_loss: 1.3410 - classification_loss: 0.2713 334/500 [===================>..........] - ETA: 41s - loss: 1.6118 - regression_loss: 1.3407 - classification_loss: 0.2711 335/500 [===================>..........] - ETA: 41s - loss: 1.6194 - regression_loss: 1.3471 - classification_loss: 0.2722 336/500 [===================>..........] - ETA: 41s - loss: 1.6202 - regression_loss: 1.3480 - classification_loss: 0.2723 337/500 [===================>..........] - ETA: 40s - loss: 1.6215 - regression_loss: 1.3489 - classification_loss: 0.2726 338/500 [===================>..........] - ETA: 40s - loss: 1.6221 - regression_loss: 1.3492 - classification_loss: 0.2729 339/500 [===================>..........] - ETA: 40s - loss: 1.6194 - regression_loss: 1.3467 - classification_loss: 0.2727 340/500 [===================>..........] 
- ETA: 40s - loss: 1.6196 - regression_loss: 1.3470 - classification_loss: 0.2726 341/500 [===================>..........] - ETA: 39s - loss: 1.6198 - regression_loss: 1.3477 - classification_loss: 0.2722 342/500 [===================>..........] - ETA: 39s - loss: 1.6189 - regression_loss: 1.3464 - classification_loss: 0.2726 343/500 [===================>..........] - ETA: 39s - loss: 1.6198 - regression_loss: 1.3470 - classification_loss: 0.2728 344/500 [===================>..........] - ETA: 39s - loss: 1.6205 - regression_loss: 1.3475 - classification_loss: 0.2730 345/500 [===================>..........] - ETA: 38s - loss: 1.6207 - regression_loss: 1.3478 - classification_loss: 0.2728 346/500 [===================>..........] - ETA: 38s - loss: 1.6195 - regression_loss: 1.3465 - classification_loss: 0.2730 347/500 [===================>..........] - ETA: 38s - loss: 1.6174 - regression_loss: 1.3448 - classification_loss: 0.2726 348/500 [===================>..........] - ETA: 38s - loss: 1.6182 - regression_loss: 1.3455 - classification_loss: 0.2727 349/500 [===================>..........] - ETA: 37s - loss: 1.6160 - regression_loss: 1.3437 - classification_loss: 0.2723 350/500 [====================>.........] - ETA: 37s - loss: 1.6153 - regression_loss: 1.3432 - classification_loss: 0.2721 351/500 [====================>.........] - ETA: 37s - loss: 1.6156 - regression_loss: 1.3433 - classification_loss: 0.2723 352/500 [====================>.........] - ETA: 37s - loss: 1.6165 - regression_loss: 1.3440 - classification_loss: 0.2725 353/500 [====================>.........] - ETA: 36s - loss: 1.6173 - regression_loss: 1.3446 - classification_loss: 0.2727 354/500 [====================>.........] - ETA: 36s - loss: 1.6190 - regression_loss: 1.3458 - classification_loss: 0.2732 355/500 [====================>.........] - ETA: 36s - loss: 1.6202 - regression_loss: 1.3466 - classification_loss: 0.2736 356/500 [====================>.........] 
- ETA: 36s - loss: 1.6184 - regression_loss: 1.3454 - classification_loss: 0.2730 357/500 [====================>.........] - ETA: 35s - loss: 1.6179 - regression_loss: 1.3450 - classification_loss: 0.2728 358/500 [====================>.........] - ETA: 35s - loss: 1.6182 - regression_loss: 1.3453 - classification_loss: 0.2728 359/500 [====================>.........] - ETA: 35s - loss: 1.6172 - regression_loss: 1.3444 - classification_loss: 0.2728 360/500 [====================>.........] - ETA: 35s - loss: 1.6196 - regression_loss: 1.3462 - classification_loss: 0.2734 361/500 [====================>.........] - ETA: 34s - loss: 1.6187 - regression_loss: 1.3456 - classification_loss: 0.2731 362/500 [====================>.........] - ETA: 34s - loss: 1.6188 - regression_loss: 1.3457 - classification_loss: 0.2732 363/500 [====================>.........] - ETA: 34s - loss: 1.6183 - regression_loss: 1.3454 - classification_loss: 0.2729 364/500 [====================>.........] - ETA: 34s - loss: 1.6188 - regression_loss: 1.3458 - classification_loss: 0.2730 365/500 [====================>.........] - ETA: 33s - loss: 1.6184 - regression_loss: 1.3455 - classification_loss: 0.2729 366/500 [====================>.........] - ETA: 33s - loss: 1.6162 - regression_loss: 1.3439 - classification_loss: 0.2723 367/500 [=====================>........] - ETA: 33s - loss: 1.6154 - regression_loss: 1.3432 - classification_loss: 0.2722 368/500 [=====================>........] - ETA: 33s - loss: 1.6152 - regression_loss: 1.3433 - classification_loss: 0.2719 369/500 [=====================>........] - ETA: 32s - loss: 1.6132 - regression_loss: 1.3416 - classification_loss: 0.2716 370/500 [=====================>........] - ETA: 32s - loss: 1.6148 - regression_loss: 1.3434 - classification_loss: 0.2715 371/500 [=====================>........] - ETA: 32s - loss: 1.6158 - regression_loss: 1.3442 - classification_loss: 0.2716 372/500 [=====================>........] 
- ETA: 32s - loss: 1.6164 - regression_loss: 1.3446 - classification_loss: 0.2718 373/500 [=====================>........] - ETA: 31s - loss: 1.6168 - regression_loss: 1.3450 - classification_loss: 0.2718 374/500 [=====================>........] - ETA: 31s - loss: 1.6185 - regression_loss: 1.3461 - classification_loss: 0.2724 375/500 [=====================>........] - ETA: 31s - loss: 1.6181 - regression_loss: 1.3461 - classification_loss: 0.2720 376/500 [=====================>........] - ETA: 31s - loss: 1.6171 - regression_loss: 1.3452 - classification_loss: 0.2719 377/500 [=====================>........] - ETA: 30s - loss: 1.6175 - regression_loss: 1.3455 - classification_loss: 0.2720 378/500 [=====================>........] - ETA: 30s - loss: 1.6185 - regression_loss: 1.3463 - classification_loss: 0.2722 379/500 [=====================>........] - ETA: 30s - loss: 1.6192 - regression_loss: 1.3470 - classification_loss: 0.2722 380/500 [=====================>........] - ETA: 30s - loss: 1.6191 - regression_loss: 1.3470 - classification_loss: 0.2721 381/500 [=====================>........] - ETA: 29s - loss: 1.6198 - regression_loss: 1.3478 - classification_loss: 0.2720 382/500 [=====================>........] - ETA: 29s - loss: 1.6202 - regression_loss: 1.3483 - classification_loss: 0.2719 383/500 [=====================>........] - ETA: 29s - loss: 1.6215 - regression_loss: 1.3492 - classification_loss: 0.2722 384/500 [======================>.......] - ETA: 29s - loss: 1.6210 - regression_loss: 1.3489 - classification_loss: 0.2722 385/500 [======================>.......] - ETA: 28s - loss: 1.6200 - regression_loss: 1.3481 - classification_loss: 0.2719 386/500 [======================>.......] - ETA: 28s - loss: 1.6213 - regression_loss: 1.3493 - classification_loss: 0.2720 387/500 [======================>.......] - ETA: 28s - loss: 1.6227 - regression_loss: 1.3504 - classification_loss: 0.2723 388/500 [======================>.......] 
- ETA: 28s - loss: 1.6204 - regression_loss: 1.3484 - classification_loss: 0.2720 389/500 [======================>.......] - ETA: 27s - loss: 1.6215 - regression_loss: 1.3494 - classification_loss: 0.2721 390/500 [======================>.......] - ETA: 27s - loss: 1.6221 - regression_loss: 1.3500 - classification_loss: 0.2721 391/500 [======================>.......] - ETA: 27s - loss: 1.6219 - regression_loss: 1.3498 - classification_loss: 0.2721 392/500 [======================>.......] - ETA: 27s - loss: 1.6216 - regression_loss: 1.3495 - classification_loss: 0.2720 393/500 [======================>.......] - ETA: 26s - loss: 1.6211 - regression_loss: 1.3491 - classification_loss: 0.2720 394/500 [======================>.......] - ETA: 26s - loss: 1.6221 - regression_loss: 1.3500 - classification_loss: 0.2721 395/500 [======================>.......] - ETA: 26s - loss: 1.6224 - regression_loss: 1.3502 - classification_loss: 0.2722 396/500 [======================>.......] - ETA: 26s - loss: 1.6228 - regression_loss: 1.3506 - classification_loss: 0.2722 397/500 [======================>.......] - ETA: 25s - loss: 1.6238 - regression_loss: 1.3515 - classification_loss: 0.2723 398/500 [======================>.......] - ETA: 25s - loss: 1.6232 - regression_loss: 1.3511 - classification_loss: 0.2721 399/500 [======================>.......] - ETA: 25s - loss: 1.6224 - regression_loss: 1.3505 - classification_loss: 0.2719 400/500 [=======================>......] - ETA: 25s - loss: 1.6208 - regression_loss: 1.3492 - classification_loss: 0.2715 401/500 [=======================>......] - ETA: 24s - loss: 1.6220 - regression_loss: 1.3502 - classification_loss: 0.2718 402/500 [=======================>......] - ETA: 24s - loss: 1.6231 - regression_loss: 1.3512 - classification_loss: 0.2719 403/500 [=======================>......] - ETA: 24s - loss: 1.6229 - regression_loss: 1.3511 - classification_loss: 0.2718 404/500 [=======================>......] 
- ETA: 24s - loss: 1.6200 - regression_loss: 1.3487 - classification_loss: 0.2713 405/500 [=======================>......] - ETA: 23s - loss: 1.6202 - regression_loss: 1.3490 - classification_loss: 0.2712 406/500 [=======================>......] - ETA: 23s - loss: 1.6181 - regression_loss: 1.3474 - classification_loss: 0.2707 407/500 [=======================>......] - ETA: 23s - loss: 1.6165 - regression_loss: 1.3460 - classification_loss: 0.2706 408/500 [=======================>......] - ETA: 23s - loss: 1.6171 - regression_loss: 1.3466 - classification_loss: 0.2705 409/500 [=======================>......] - ETA: 22s - loss: 1.6172 - regression_loss: 1.3466 - classification_loss: 0.2706 410/500 [=======================>......] - ETA: 22s - loss: 1.6189 - regression_loss: 1.3480 - classification_loss: 0.2709 411/500 [=======================>......] - ETA: 22s - loss: 1.6193 - regression_loss: 1.3483 - classification_loss: 0.2710 412/500 [=======================>......] - ETA: 22s - loss: 1.6194 - regression_loss: 1.3484 - classification_loss: 0.2710 413/500 [=======================>......] - ETA: 21s - loss: 1.6193 - regression_loss: 1.3484 - classification_loss: 0.2709 414/500 [=======================>......] - ETA: 21s - loss: 1.6197 - regression_loss: 1.3489 - classification_loss: 0.2708 415/500 [=======================>......] - ETA: 21s - loss: 1.6184 - regression_loss: 1.3477 - classification_loss: 0.2707 416/500 [=======================>......] - ETA: 21s - loss: 1.6181 - regression_loss: 1.3476 - classification_loss: 0.2705 417/500 [========================>.....] - ETA: 20s - loss: 1.6169 - regression_loss: 1.3466 - classification_loss: 0.2702 418/500 [========================>.....] - ETA: 20s - loss: 1.6177 - regression_loss: 1.3475 - classification_loss: 0.2702 419/500 [========================>.....] - ETA: 20s - loss: 1.6177 - regression_loss: 1.3475 - classification_loss: 0.2702 420/500 [========================>.....] 
- ETA: 20s - loss: 1.6191 - regression_loss: 1.3487 - classification_loss: 0.2704 421/500 [========================>.....] - ETA: 19s - loss: 1.6200 - regression_loss: 1.3495 - classification_loss: 0.2705 422/500 [========================>.....] - ETA: 19s - loss: 1.6203 - regression_loss: 1.3497 - classification_loss: 0.2706 423/500 [========================>.....] - ETA: 19s - loss: 1.6185 - regression_loss: 1.3483 - classification_loss: 0.2701 424/500 [========================>.....] - ETA: 19s - loss: 1.6176 - regression_loss: 1.3474 - classification_loss: 0.2702 425/500 [========================>.....] - ETA: 18s - loss: 1.6183 - regression_loss: 1.3479 - classification_loss: 0.2704 426/500 [========================>.....] - ETA: 18s - loss: 1.6182 - regression_loss: 1.3479 - classification_loss: 0.2702 427/500 [========================>.....] - ETA: 18s - loss: 1.6174 - regression_loss: 1.3475 - classification_loss: 0.2699 428/500 [========================>.....] - ETA: 18s - loss: 1.6169 - regression_loss: 1.3474 - classification_loss: 0.2696 429/500 [========================>.....] - ETA: 17s - loss: 1.6183 - regression_loss: 1.3484 - classification_loss: 0.2700 430/500 [========================>.....] - ETA: 17s - loss: 1.6168 - regression_loss: 1.3473 - classification_loss: 0.2696 431/500 [========================>.....] - ETA: 17s - loss: 1.6186 - regression_loss: 1.3488 - classification_loss: 0.2698 432/500 [========================>.....] - ETA: 17s - loss: 1.6189 - regression_loss: 1.3489 - classification_loss: 0.2699 433/500 [========================>.....] - ETA: 16s - loss: 1.6177 - regression_loss: 1.3479 - classification_loss: 0.2697 434/500 [=========================>....] - ETA: 16s - loss: 1.6184 - regression_loss: 1.3483 - classification_loss: 0.2700 435/500 [=========================>....] - ETA: 16s - loss: 1.6185 - regression_loss: 1.3485 - classification_loss: 0.2700 436/500 [=========================>....] 
- ETA: 16s - loss: 1.6195 - regression_loss: 1.3492 - classification_loss: 0.2703 437/500 [=========================>....] - ETA: 15s - loss: 1.6181 - regression_loss: 1.3481 - classification_loss: 0.2700 438/500 [=========================>....] - ETA: 15s - loss: 1.6176 - regression_loss: 1.3474 - classification_loss: 0.2702 439/500 [=========================>....] - ETA: 15s - loss: 1.6176 - regression_loss: 1.3476 - classification_loss: 0.2700 440/500 [=========================>....] - ETA: 15s - loss: 1.6190 - regression_loss: 1.3483 - classification_loss: 0.2707 441/500 [=========================>....] - ETA: 14s - loss: 1.6188 - regression_loss: 1.3481 - classification_loss: 0.2706 442/500 [=========================>....] - ETA: 14s - loss: 1.6186 - regression_loss: 1.3481 - classification_loss: 0.2705 443/500 [=========================>....] - ETA: 14s - loss: 1.6178 - regression_loss: 1.3475 - classification_loss: 0.2703 444/500 [=========================>....] - ETA: 14s - loss: 1.6177 - regression_loss: 1.3475 - classification_loss: 0.2702 445/500 [=========================>....] - ETA: 13s - loss: 1.6167 - regression_loss: 1.3468 - classification_loss: 0.2699 446/500 [=========================>....] - ETA: 13s - loss: 1.6164 - regression_loss: 1.3467 - classification_loss: 0.2697 447/500 [=========================>....] - ETA: 13s - loss: 1.6178 - regression_loss: 1.3479 - classification_loss: 0.2699 448/500 [=========================>....] - ETA: 13s - loss: 1.6181 - regression_loss: 1.3482 - classification_loss: 0.2699 449/500 [=========================>....] - ETA: 12s - loss: 1.6170 - regression_loss: 1.3473 - classification_loss: 0.2697 450/500 [==========================>...] - ETA: 12s - loss: 1.6170 - regression_loss: 1.3474 - classification_loss: 0.2696 451/500 [==========================>...] - ETA: 12s - loss: 1.6178 - regression_loss: 1.3481 - classification_loss: 0.2697 452/500 [==========================>...] 
[epoch 84/150: per-batch progress updates (batches 453-499) elided]
500/500 [==============================] - 125s 251ms/step - loss: 1.6238 - regression_loss: 1.3532 - classification_loss: 0.2705
1172 instances of class plum with average precision: 0.6454
mAP: 0.6454
Epoch 00084: saving model to ./training/snapshots/resnet50_pascal_84.h5
Epoch 85/150
[epoch 85/150: per-batch progress updates (batches 1-286) elided; running loss starts near 1.96 at batch 1 and settles to roughly 1.58 by batch 286]
- ETA: 53s - loss: 1.5802 - regression_loss: 1.3168 - classification_loss: 0.2633 287/500 [================>.............] - ETA: 53s - loss: 1.5810 - regression_loss: 1.3174 - classification_loss: 0.2635 288/500 [================>.............] - ETA: 53s - loss: 1.5812 - regression_loss: 1.3176 - classification_loss: 0.2635 289/500 [================>.............] - ETA: 52s - loss: 1.5801 - regression_loss: 1.3168 - classification_loss: 0.2633 290/500 [================>.............] - ETA: 52s - loss: 1.5807 - regression_loss: 1.3175 - classification_loss: 0.2632 291/500 [================>.............] - ETA: 52s - loss: 1.5816 - regression_loss: 1.3186 - classification_loss: 0.2630 292/500 [================>.............] - ETA: 52s - loss: 1.5836 - regression_loss: 1.3200 - classification_loss: 0.2636 293/500 [================>.............] - ETA: 51s - loss: 1.5846 - regression_loss: 1.3208 - classification_loss: 0.2638 294/500 [================>.............] - ETA: 51s - loss: 1.5845 - regression_loss: 1.3206 - classification_loss: 0.2639 295/500 [================>.............] - ETA: 51s - loss: 1.5881 - regression_loss: 1.3240 - classification_loss: 0.2641 296/500 [================>.............] - ETA: 51s - loss: 1.5888 - regression_loss: 1.3244 - classification_loss: 0.2644 297/500 [================>.............] - ETA: 50s - loss: 1.5890 - regression_loss: 1.3246 - classification_loss: 0.2644 298/500 [================>.............] - ETA: 50s - loss: 1.5861 - regression_loss: 1.3219 - classification_loss: 0.2642 299/500 [================>.............] - ETA: 50s - loss: 1.5874 - regression_loss: 1.3231 - classification_loss: 0.2643 300/500 [=================>............] - ETA: 50s - loss: 1.5871 - regression_loss: 1.3229 - classification_loss: 0.2642 301/500 [=================>............] - ETA: 49s - loss: 1.5903 - regression_loss: 1.3253 - classification_loss: 0.2650 302/500 [=================>............] 
- ETA: 49s - loss: 1.5893 - regression_loss: 1.3244 - classification_loss: 0.2649 303/500 [=================>............] - ETA: 49s - loss: 1.5907 - regression_loss: 1.3253 - classification_loss: 0.2654 304/500 [=================>............] - ETA: 49s - loss: 1.5901 - regression_loss: 1.3249 - classification_loss: 0.2652 305/500 [=================>............] - ETA: 48s - loss: 1.5891 - regression_loss: 1.3241 - classification_loss: 0.2650 306/500 [=================>............] - ETA: 48s - loss: 1.5895 - regression_loss: 1.3246 - classification_loss: 0.2649 307/500 [=================>............] - ETA: 48s - loss: 1.5895 - regression_loss: 1.3245 - classification_loss: 0.2650 308/500 [=================>............] - ETA: 48s - loss: 1.5910 - regression_loss: 1.3259 - classification_loss: 0.2650 309/500 [=================>............] - ETA: 47s - loss: 1.5917 - regression_loss: 1.3265 - classification_loss: 0.2652 310/500 [=================>............] - ETA: 47s - loss: 1.5902 - regression_loss: 1.3254 - classification_loss: 0.2649 311/500 [=================>............] - ETA: 47s - loss: 1.5902 - regression_loss: 1.3253 - classification_loss: 0.2648 312/500 [=================>............] - ETA: 47s - loss: 1.5920 - regression_loss: 1.3266 - classification_loss: 0.2654 313/500 [=================>............] - ETA: 46s - loss: 1.5910 - regression_loss: 1.3255 - classification_loss: 0.2655 314/500 [=================>............] - ETA: 46s - loss: 1.5896 - regression_loss: 1.3246 - classification_loss: 0.2651 315/500 [=================>............] - ETA: 46s - loss: 1.5878 - regression_loss: 1.3232 - classification_loss: 0.2646 316/500 [=================>............] - ETA: 46s - loss: 1.5880 - regression_loss: 1.3234 - classification_loss: 0.2647 317/500 [==================>...........] - ETA: 45s - loss: 1.5910 - regression_loss: 1.3261 - classification_loss: 0.2650 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5890 - regression_loss: 1.3245 - classification_loss: 0.2645 319/500 [==================>...........] - ETA: 45s - loss: 1.5894 - regression_loss: 1.3246 - classification_loss: 0.2648 320/500 [==================>...........] - ETA: 45s - loss: 1.5885 - regression_loss: 1.3242 - classification_loss: 0.2643 321/500 [==================>...........] - ETA: 44s - loss: 1.5852 - regression_loss: 1.3215 - classification_loss: 0.2637 322/500 [==================>...........] - ETA: 44s - loss: 1.5861 - regression_loss: 1.3224 - classification_loss: 0.2637 323/500 [==================>...........] - ETA: 44s - loss: 1.5874 - regression_loss: 1.3236 - classification_loss: 0.2639 324/500 [==================>...........] - ETA: 44s - loss: 1.5884 - regression_loss: 1.3244 - classification_loss: 0.2641 325/500 [==================>...........] - ETA: 43s - loss: 1.5865 - regression_loss: 1.3229 - classification_loss: 0.2636 326/500 [==================>...........] - ETA: 43s - loss: 1.5867 - regression_loss: 1.3232 - classification_loss: 0.2635 327/500 [==================>...........] - ETA: 43s - loss: 1.5870 - regression_loss: 1.3236 - classification_loss: 0.2634 328/500 [==================>...........] - ETA: 43s - loss: 1.5850 - regression_loss: 1.3220 - classification_loss: 0.2630 329/500 [==================>...........] - ETA: 42s - loss: 1.5856 - regression_loss: 1.3225 - classification_loss: 0.2630 330/500 [==================>...........] - ETA: 42s - loss: 1.5880 - regression_loss: 1.3243 - classification_loss: 0.2637 331/500 [==================>...........] - ETA: 42s - loss: 1.5889 - regression_loss: 1.3250 - classification_loss: 0.2639 332/500 [==================>...........] - ETA: 42s - loss: 1.5893 - regression_loss: 1.3254 - classification_loss: 0.2639 333/500 [==================>...........] - ETA: 41s - loss: 1.5898 - regression_loss: 1.3261 - classification_loss: 0.2637 334/500 [===================>..........] 
- ETA: 41s - loss: 1.5910 - regression_loss: 1.3273 - classification_loss: 0.2637 335/500 [===================>..........] - ETA: 41s - loss: 1.5926 - regression_loss: 1.3291 - classification_loss: 0.2636 336/500 [===================>..........] - ETA: 41s - loss: 1.5921 - regression_loss: 1.3286 - classification_loss: 0.2634 337/500 [===================>..........] - ETA: 40s - loss: 1.5905 - regression_loss: 1.3275 - classification_loss: 0.2630 338/500 [===================>..........] - ETA: 40s - loss: 1.5897 - regression_loss: 1.3270 - classification_loss: 0.2628 339/500 [===================>..........] - ETA: 40s - loss: 1.5910 - regression_loss: 1.3278 - classification_loss: 0.2631 340/500 [===================>..........] - ETA: 40s - loss: 1.5926 - regression_loss: 1.3291 - classification_loss: 0.2634 341/500 [===================>..........] - ETA: 39s - loss: 1.5933 - regression_loss: 1.3297 - classification_loss: 0.2636 342/500 [===================>..........] - ETA: 39s - loss: 1.5917 - regression_loss: 1.3282 - classification_loss: 0.2635 343/500 [===================>..........] - ETA: 39s - loss: 1.5913 - regression_loss: 1.3277 - classification_loss: 0.2636 344/500 [===================>..........] - ETA: 39s - loss: 1.5919 - regression_loss: 1.3282 - classification_loss: 0.2637 345/500 [===================>..........] - ETA: 38s - loss: 1.5901 - regression_loss: 1.3269 - classification_loss: 0.2632 346/500 [===================>..........] - ETA: 38s - loss: 1.5915 - regression_loss: 1.3281 - classification_loss: 0.2634 347/500 [===================>..........] - ETA: 38s - loss: 1.5914 - regression_loss: 1.3279 - classification_loss: 0.2635 348/500 [===================>..........] - ETA: 38s - loss: 1.5912 - regression_loss: 1.3279 - classification_loss: 0.2634 349/500 [===================>..........] - ETA: 37s - loss: 1.5904 - regression_loss: 1.3271 - classification_loss: 0.2633 350/500 [====================>.........] 
- ETA: 37s - loss: 1.5906 - regression_loss: 1.3273 - classification_loss: 0.2633 351/500 [====================>.........] - ETA: 37s - loss: 1.5913 - regression_loss: 1.3280 - classification_loss: 0.2633 352/500 [====================>.........] - ETA: 37s - loss: 1.5918 - regression_loss: 1.3283 - classification_loss: 0.2635 353/500 [====================>.........] - ETA: 36s - loss: 1.5906 - regression_loss: 1.3273 - classification_loss: 0.2633 354/500 [====================>.........] - ETA: 36s - loss: 1.5897 - regression_loss: 1.3267 - classification_loss: 0.2630 355/500 [====================>.........] - ETA: 36s - loss: 1.5903 - regression_loss: 1.3272 - classification_loss: 0.2631 356/500 [====================>.........] - ETA: 36s - loss: 1.5916 - regression_loss: 1.3282 - classification_loss: 0.2634 357/500 [====================>.........] - ETA: 35s - loss: 1.5916 - regression_loss: 1.3282 - classification_loss: 0.2634 358/500 [====================>.........] - ETA: 35s - loss: 1.5920 - regression_loss: 1.3287 - classification_loss: 0.2634 359/500 [====================>.........] - ETA: 35s - loss: 1.5927 - regression_loss: 1.3292 - classification_loss: 0.2635 360/500 [====================>.........] - ETA: 35s - loss: 1.5903 - regression_loss: 1.3271 - classification_loss: 0.2632 361/500 [====================>.........] - ETA: 34s - loss: 1.5904 - regression_loss: 1.3273 - classification_loss: 0.2631 362/500 [====================>.........] - ETA: 34s - loss: 1.5906 - regression_loss: 1.3275 - classification_loss: 0.2631 363/500 [====================>.........] - ETA: 34s - loss: 1.5937 - regression_loss: 1.3298 - classification_loss: 0.2639 364/500 [====================>.........] - ETA: 34s - loss: 1.5933 - regression_loss: 1.3298 - classification_loss: 0.2635 365/500 [====================>.........] - ETA: 33s - loss: 1.5914 - regression_loss: 1.3280 - classification_loss: 0.2634 366/500 [====================>.........] 
- ETA: 33s - loss: 1.5904 - regression_loss: 1.3272 - classification_loss: 0.2632 367/500 [=====================>........] - ETA: 33s - loss: 1.5905 - regression_loss: 1.3271 - classification_loss: 0.2633 368/500 [=====================>........] - ETA: 33s - loss: 1.5904 - regression_loss: 1.3272 - classification_loss: 0.2632 369/500 [=====================>........] - ETA: 32s - loss: 1.5900 - regression_loss: 1.3270 - classification_loss: 0.2630 370/500 [=====================>........] - ETA: 32s - loss: 1.5912 - regression_loss: 1.3280 - classification_loss: 0.2633 371/500 [=====================>........] - ETA: 32s - loss: 1.5910 - regression_loss: 1.3279 - classification_loss: 0.2631 372/500 [=====================>........] - ETA: 32s - loss: 1.5925 - regression_loss: 1.3291 - classification_loss: 0.2634 373/500 [=====================>........] - ETA: 31s - loss: 1.5924 - regression_loss: 1.3291 - classification_loss: 0.2632 374/500 [=====================>........] - ETA: 31s - loss: 1.5914 - regression_loss: 1.3282 - classification_loss: 0.2633 375/500 [=====================>........] - ETA: 31s - loss: 1.5917 - regression_loss: 1.3285 - classification_loss: 0.2632 376/500 [=====================>........] - ETA: 31s - loss: 1.5918 - regression_loss: 1.3287 - classification_loss: 0.2631 377/500 [=====================>........] - ETA: 30s - loss: 1.5904 - regression_loss: 1.3273 - classification_loss: 0.2631 378/500 [=====================>........] - ETA: 30s - loss: 1.5884 - regression_loss: 1.3253 - classification_loss: 0.2631 379/500 [=====================>........] - ETA: 30s - loss: 1.5892 - regression_loss: 1.3260 - classification_loss: 0.2632 380/500 [=====================>........] - ETA: 30s - loss: 1.5901 - regression_loss: 1.3269 - classification_loss: 0.2633 381/500 [=====================>........] - ETA: 29s - loss: 1.5902 - regression_loss: 1.3270 - classification_loss: 0.2633 382/500 [=====================>........] 
- ETA: 29s - loss: 1.5901 - regression_loss: 1.3267 - classification_loss: 0.2634 383/500 [=====================>........] - ETA: 29s - loss: 1.5904 - regression_loss: 1.3268 - classification_loss: 0.2636 384/500 [======================>.......] - ETA: 29s - loss: 1.5939 - regression_loss: 1.3297 - classification_loss: 0.2642 385/500 [======================>.......] - ETA: 28s - loss: 1.5942 - regression_loss: 1.3300 - classification_loss: 0.2642 386/500 [======================>.......] - ETA: 28s - loss: 1.5921 - regression_loss: 1.3282 - classification_loss: 0.2639 387/500 [======================>.......] - ETA: 28s - loss: 1.5918 - regression_loss: 1.3280 - classification_loss: 0.2638 388/500 [======================>.......] - ETA: 28s - loss: 1.5928 - regression_loss: 1.3289 - classification_loss: 0.2639 389/500 [======================>.......] - ETA: 27s - loss: 1.5936 - regression_loss: 1.3293 - classification_loss: 0.2643 390/500 [======================>.......] - ETA: 27s - loss: 1.5946 - regression_loss: 1.3302 - classification_loss: 0.2644 391/500 [======================>.......] - ETA: 27s - loss: 1.5928 - regression_loss: 1.3287 - classification_loss: 0.2641 392/500 [======================>.......] - ETA: 27s - loss: 1.5931 - regression_loss: 1.3290 - classification_loss: 0.2641 393/500 [======================>.......] - ETA: 26s - loss: 1.5941 - regression_loss: 1.3299 - classification_loss: 0.2643 394/500 [======================>.......] - ETA: 26s - loss: 1.5942 - regression_loss: 1.3300 - classification_loss: 0.2642 395/500 [======================>.......] - ETA: 26s - loss: 1.5937 - regression_loss: 1.3295 - classification_loss: 0.2642 396/500 [======================>.......] - ETA: 26s - loss: 1.5947 - regression_loss: 1.3302 - classification_loss: 0.2644 397/500 [======================>.......] - ETA: 25s - loss: 1.5928 - regression_loss: 1.3288 - classification_loss: 0.2640 398/500 [======================>.......] 
- ETA: 25s - loss: 1.5917 - regression_loss: 1.3280 - classification_loss: 0.2638 399/500 [======================>.......] - ETA: 25s - loss: 1.5931 - regression_loss: 1.3289 - classification_loss: 0.2641 400/500 [=======================>......] - ETA: 25s - loss: 1.5923 - regression_loss: 1.3285 - classification_loss: 0.2639 401/500 [=======================>......] - ETA: 24s - loss: 1.5931 - regression_loss: 1.3289 - classification_loss: 0.2642 402/500 [=======================>......] - ETA: 24s - loss: 1.5947 - regression_loss: 1.3300 - classification_loss: 0.2647 403/500 [=======================>......] - ETA: 24s - loss: 1.5962 - regression_loss: 1.3313 - classification_loss: 0.2648 404/500 [=======================>......] - ETA: 24s - loss: 1.5960 - regression_loss: 1.3310 - classification_loss: 0.2650 405/500 [=======================>......] - ETA: 23s - loss: 1.5941 - regression_loss: 1.3295 - classification_loss: 0.2646 406/500 [=======================>......] - ETA: 23s - loss: 1.5936 - regression_loss: 1.3290 - classification_loss: 0.2646 407/500 [=======================>......] - ETA: 23s - loss: 1.5942 - regression_loss: 1.3296 - classification_loss: 0.2646 408/500 [=======================>......] - ETA: 23s - loss: 1.5946 - regression_loss: 1.3299 - classification_loss: 0.2647 409/500 [=======================>......] - ETA: 22s - loss: 1.5960 - regression_loss: 1.3309 - classification_loss: 0.2651 410/500 [=======================>......] - ETA: 22s - loss: 1.5959 - regression_loss: 1.3306 - classification_loss: 0.2653 411/500 [=======================>......] - ETA: 22s - loss: 1.5943 - regression_loss: 1.3294 - classification_loss: 0.2649 412/500 [=======================>......] - ETA: 22s - loss: 1.5935 - regression_loss: 1.3287 - classification_loss: 0.2648 413/500 [=======================>......] - ETA: 21s - loss: 1.5931 - regression_loss: 1.3283 - classification_loss: 0.2648 414/500 [=======================>......] 
- ETA: 21s - loss: 1.5942 - regression_loss: 1.3293 - classification_loss: 0.2649 415/500 [=======================>......] - ETA: 21s - loss: 1.5935 - regression_loss: 1.3288 - classification_loss: 0.2647 416/500 [=======================>......] - ETA: 21s - loss: 1.5939 - regression_loss: 1.3289 - classification_loss: 0.2650 417/500 [========================>.....] - ETA: 20s - loss: 1.5917 - regression_loss: 1.3271 - classification_loss: 0.2646 418/500 [========================>.....] - ETA: 20s - loss: 1.5895 - regression_loss: 1.3252 - classification_loss: 0.2643 419/500 [========================>.....] - ETA: 20s - loss: 1.5900 - regression_loss: 1.3257 - classification_loss: 0.2642 420/500 [========================>.....] - ETA: 20s - loss: 1.5911 - regression_loss: 1.3267 - classification_loss: 0.2644 421/500 [========================>.....] - ETA: 19s - loss: 1.5923 - regression_loss: 1.3275 - classification_loss: 0.2648 422/500 [========================>.....] - ETA: 19s - loss: 1.5898 - regression_loss: 1.3256 - classification_loss: 0.2643 423/500 [========================>.....] - ETA: 19s - loss: 1.5899 - regression_loss: 1.3255 - classification_loss: 0.2644 424/500 [========================>.....] - ETA: 19s - loss: 1.5913 - regression_loss: 1.3267 - classification_loss: 0.2646 425/500 [========================>.....] - ETA: 18s - loss: 1.5919 - regression_loss: 1.3274 - classification_loss: 0.2645 426/500 [========================>.....] - ETA: 18s - loss: 1.5907 - regression_loss: 1.3265 - classification_loss: 0.2642 427/500 [========================>.....] - ETA: 18s - loss: 1.5901 - regression_loss: 1.3260 - classification_loss: 0.2641 428/500 [========================>.....] - ETA: 18s - loss: 1.5899 - regression_loss: 1.3258 - classification_loss: 0.2641 429/500 [========================>.....] - ETA: 17s - loss: 1.5889 - regression_loss: 1.3250 - classification_loss: 0.2639 430/500 [========================>.....] 
- ETA: 17s - loss: 1.5893 - regression_loss: 1.3252 - classification_loss: 0.2640 431/500 [========================>.....] - ETA: 17s - loss: 1.5901 - regression_loss: 1.3258 - classification_loss: 0.2643 432/500 [========================>.....] - ETA: 17s - loss: 1.5900 - regression_loss: 1.3258 - classification_loss: 0.2642 433/500 [========================>.....] - ETA: 16s - loss: 1.5881 - regression_loss: 1.3243 - classification_loss: 0.2638 434/500 [=========================>....] - ETA: 16s - loss: 1.5881 - regression_loss: 1.3244 - classification_loss: 0.2637 435/500 [=========================>....] - ETA: 16s - loss: 1.5873 - regression_loss: 1.3238 - classification_loss: 0.2635 436/500 [=========================>....] - ETA: 16s - loss: 1.5877 - regression_loss: 1.3242 - classification_loss: 0.2635 437/500 [=========================>....] - ETA: 15s - loss: 1.5873 - regression_loss: 1.3239 - classification_loss: 0.2633 438/500 [=========================>....] - ETA: 15s - loss: 1.5874 - regression_loss: 1.3241 - classification_loss: 0.2633 439/500 [=========================>....] - ETA: 15s - loss: 1.5864 - regression_loss: 1.3233 - classification_loss: 0.2631 440/500 [=========================>....] - ETA: 15s - loss: 1.5865 - regression_loss: 1.3233 - classification_loss: 0.2632 441/500 [=========================>....] - ETA: 14s - loss: 1.5850 - regression_loss: 1.3222 - classification_loss: 0.2628 442/500 [=========================>....] - ETA: 14s - loss: 1.5874 - regression_loss: 1.3243 - classification_loss: 0.2631 443/500 [=========================>....] - ETA: 14s - loss: 1.5874 - regression_loss: 1.3243 - classification_loss: 0.2631 444/500 [=========================>....] - ETA: 14s - loss: 1.5885 - regression_loss: 1.3251 - classification_loss: 0.2634 445/500 [=========================>....] - ETA: 13s - loss: 1.5893 - regression_loss: 1.3256 - classification_loss: 0.2637 446/500 [=========================>....] 
- ETA: 13s - loss: 1.5886 - regression_loss: 1.3249 - classification_loss: 0.2636 447/500 [=========================>....] - ETA: 13s - loss: 1.5877 - regression_loss: 1.3244 - classification_loss: 0.2634 448/500 [=========================>....] - ETA: 13s - loss: 1.5884 - regression_loss: 1.3248 - classification_loss: 0.2636 449/500 [=========================>....] - ETA: 12s - loss: 1.5885 - regression_loss: 1.3248 - classification_loss: 0.2637 450/500 [==========================>...] - ETA: 12s - loss: 1.5891 - regression_loss: 1.3254 - classification_loss: 0.2637 451/500 [==========================>...] - ETA: 12s - loss: 1.5889 - regression_loss: 1.3252 - classification_loss: 0.2637 452/500 [==========================>...] - ETA: 12s - loss: 1.5892 - regression_loss: 1.3254 - classification_loss: 0.2638 453/500 [==========================>...] - ETA: 11s - loss: 1.5894 - regression_loss: 1.3256 - classification_loss: 0.2638 454/500 [==========================>...] - ETA: 11s - loss: 1.5895 - regression_loss: 1.3255 - classification_loss: 0.2639 455/500 [==========================>...] - ETA: 11s - loss: 1.5894 - regression_loss: 1.3251 - classification_loss: 0.2643 456/500 [==========================>...] - ETA: 11s - loss: 1.5887 - regression_loss: 1.3244 - classification_loss: 0.2643 457/500 [==========================>...] - ETA: 10s - loss: 1.5890 - regression_loss: 1.3247 - classification_loss: 0.2643 458/500 [==========================>...] - ETA: 10s - loss: 1.5868 - regression_loss: 1.3229 - classification_loss: 0.2640 459/500 [==========================>...] - ETA: 10s - loss: 1.5876 - regression_loss: 1.3236 - classification_loss: 0.2640 460/500 [==========================>...] - ETA: 10s - loss: 1.5881 - regression_loss: 1.3242 - classification_loss: 0.2639 461/500 [==========================>...] - ETA: 9s - loss: 1.5877 - regression_loss: 1.3239 - classification_loss: 0.2638  462/500 [==========================>...] 
- ETA: 9s - loss: 1.5865 - regression_loss: 1.3229 - classification_loss: 0.2636 463/500 [==========================>...] - ETA: 9s - loss: 1.5856 - regression_loss: 1.3223 - classification_loss: 0.2634 464/500 [==========================>...] - ETA: 9s - loss: 1.5840 - regression_loss: 1.3210 - classification_loss: 0.2630 465/500 [==========================>...] - ETA: 8s - loss: 1.5852 - regression_loss: 1.3220 - classification_loss: 0.2632 466/500 [==========================>...] - ETA: 8s - loss: 1.5853 - regression_loss: 1.3221 - classification_loss: 0.2632 467/500 [===========================>..] - ETA: 8s - loss: 1.5842 - regression_loss: 1.3212 - classification_loss: 0.2630 468/500 [===========================>..] - ETA: 8s - loss: 1.5844 - regression_loss: 1.3213 - classification_loss: 0.2631 469/500 [===========================>..] - ETA: 7s - loss: 1.5827 - regression_loss: 1.3199 - classification_loss: 0.2628 470/500 [===========================>..] - ETA: 7s - loss: 1.5833 - regression_loss: 1.3204 - classification_loss: 0.2628 471/500 [===========================>..] - ETA: 7s - loss: 1.5825 - regression_loss: 1.3200 - classification_loss: 0.2626 472/500 [===========================>..] - ETA: 7s - loss: 1.5831 - regression_loss: 1.3205 - classification_loss: 0.2626 473/500 [===========================>..] - ETA: 6s - loss: 1.5829 - regression_loss: 1.3205 - classification_loss: 0.2625 474/500 [===========================>..] - ETA: 6s - loss: 1.5817 - regression_loss: 1.3195 - classification_loss: 0.2622 475/500 [===========================>..] - ETA: 6s - loss: 1.5814 - regression_loss: 1.3192 - classification_loss: 0.2622 476/500 [===========================>..] - ETA: 6s - loss: 1.5814 - regression_loss: 1.3193 - classification_loss: 0.2621 477/500 [===========================>..] - ETA: 5s - loss: 1.5818 - regression_loss: 1.3196 - classification_loss: 0.2621 478/500 [===========================>..] 
- ETA: 5s - loss: 1.5824 - regression_loss: 1.3200 - classification_loss: 0.2623 479/500 [===========================>..] - ETA: 5s - loss: 1.5814 - regression_loss: 1.3192 - classification_loss: 0.2621 480/500 [===========================>..] - ETA: 5s - loss: 1.5819 - regression_loss: 1.3197 - classification_loss: 0.2622 481/500 [===========================>..] - ETA: 4s - loss: 1.5832 - regression_loss: 1.3207 - classification_loss: 0.2624 482/500 [===========================>..] - ETA: 4s - loss: 1.5836 - regression_loss: 1.3212 - classification_loss: 0.2624 483/500 [===========================>..] - ETA: 4s - loss: 1.5819 - regression_loss: 1.3199 - classification_loss: 0.2619 484/500 [============================>.] - ETA: 4s - loss: 1.5816 - regression_loss: 1.3198 - classification_loss: 0.2618 485/500 [============================>.] - ETA: 3s - loss: 1.5815 - regression_loss: 1.3196 - classification_loss: 0.2619 486/500 [============================>.] - ETA: 3s - loss: 1.5821 - regression_loss: 1.3201 - classification_loss: 0.2620 487/500 [============================>.] - ETA: 3s - loss: 1.5824 - regression_loss: 1.3205 - classification_loss: 0.2620 488/500 [============================>.] - ETA: 3s - loss: 1.5820 - regression_loss: 1.3202 - classification_loss: 0.2618 489/500 [============================>.] - ETA: 2s - loss: 1.5831 - regression_loss: 1.3212 - classification_loss: 0.2619 490/500 [============================>.] - ETA: 2s - loss: 1.5840 - regression_loss: 1.3219 - classification_loss: 0.2621 491/500 [============================>.] - ETA: 2s - loss: 1.5842 - regression_loss: 1.3222 - classification_loss: 0.2620 492/500 [============================>.] - ETA: 2s - loss: 1.5828 - regression_loss: 1.3211 - classification_loss: 0.2617 493/500 [============================>.] - ETA: 1s - loss: 1.5832 - regression_loss: 1.3214 - classification_loss: 0.2618 494/500 [============================>.] 
500/500 [==============================] - 126s 251ms/step - loss: 1.5847 - regression_loss: 1.3226 - classification_loss: 0.2621
1172 instances of class plum with average precision: 0.6507
mAP: 0.6507
Epoch 00085: saving model to ./training/snapshots/resnet50_pascal_85.h5
Epoch 86/150
- ETA: 1:50 - loss: 1.6358 - regression_loss: 1.3483 - classification_loss: 0.2875 58/500 [==>...........................] - ETA: 1:50 - loss: 1.6412 - regression_loss: 1.3534 - classification_loss: 0.2878 59/500 [==>...........................] - ETA: 1:49 - loss: 1.6464 - regression_loss: 1.3584 - classification_loss: 0.2880 60/500 [==>...........................] - ETA: 1:49 - loss: 1.6531 - regression_loss: 1.3650 - classification_loss: 0.2881 61/500 [==>...........................] - ETA: 1:49 - loss: 1.6391 - regression_loss: 1.3529 - classification_loss: 0.2862 62/500 [==>...........................] - ETA: 1:49 - loss: 1.6316 - regression_loss: 1.3476 - classification_loss: 0.2840 63/500 [==>...........................] - ETA: 1:49 - loss: 1.6351 - regression_loss: 1.3511 - classification_loss: 0.2840 64/500 [==>...........................] - ETA: 1:48 - loss: 1.6286 - regression_loss: 1.3464 - classification_loss: 0.2822 65/500 [==>...........................] - ETA: 1:48 - loss: 1.6192 - regression_loss: 1.3403 - classification_loss: 0.2789 66/500 [==>...........................] - ETA: 1:48 - loss: 1.6115 - regression_loss: 1.3343 - classification_loss: 0.2772 67/500 [===>..........................] - ETA: 1:48 - loss: 1.6110 - regression_loss: 1.3346 - classification_loss: 0.2764 68/500 [===>..........................] - ETA: 1:47 - loss: 1.6110 - regression_loss: 1.3352 - classification_loss: 0.2759 69/500 [===>..........................] - ETA: 1:47 - loss: 1.6162 - regression_loss: 1.3391 - classification_loss: 0.2771 70/500 [===>..........................] - ETA: 1:47 - loss: 1.6180 - regression_loss: 1.3397 - classification_loss: 0.2782 71/500 [===>..........................] - ETA: 1:47 - loss: 1.6221 - regression_loss: 1.3426 - classification_loss: 0.2795 72/500 [===>..........................] - ETA: 1:47 - loss: 1.6206 - regression_loss: 1.3410 - classification_loss: 0.2795 73/500 [===>..........................] 
- ETA: 1:46 - loss: 1.6245 - regression_loss: 1.3443 - classification_loss: 0.2802 74/500 [===>..........................] - ETA: 1:46 - loss: 1.6367 - regression_loss: 1.3573 - classification_loss: 0.2794 75/500 [===>..........................] - ETA: 1:46 - loss: 1.6387 - regression_loss: 1.3584 - classification_loss: 0.2804 76/500 [===>..........................] - ETA: 1:46 - loss: 1.6372 - regression_loss: 1.3575 - classification_loss: 0.2797 77/500 [===>..........................] - ETA: 1:45 - loss: 1.6413 - regression_loss: 1.3597 - classification_loss: 0.2816 78/500 [===>..........................] - ETA: 1:45 - loss: 1.6430 - regression_loss: 1.3611 - classification_loss: 0.2819 79/500 [===>..........................] - ETA: 1:45 - loss: 1.6428 - regression_loss: 1.3607 - classification_loss: 0.2820 80/500 [===>..........................] - ETA: 1:45 - loss: 1.6435 - regression_loss: 1.3618 - classification_loss: 0.2817 81/500 [===>..........................] - ETA: 1:44 - loss: 1.6459 - regression_loss: 1.3646 - classification_loss: 0.2813 82/500 [===>..........................] - ETA: 1:44 - loss: 1.6429 - regression_loss: 1.3627 - classification_loss: 0.2803 83/500 [===>..........................] - ETA: 1:44 - loss: 1.6460 - regression_loss: 1.3649 - classification_loss: 0.2812 84/500 [====>.........................] - ETA: 1:43 - loss: 1.6431 - regression_loss: 1.3632 - classification_loss: 0.2799 85/500 [====>.........................] - ETA: 1:43 - loss: 1.6443 - regression_loss: 1.3636 - classification_loss: 0.2807 86/500 [====>.........................] - ETA: 1:43 - loss: 1.6548 - regression_loss: 1.3717 - classification_loss: 0.2832 87/500 [====>.........................] - ETA: 1:43 - loss: 1.6550 - regression_loss: 1.3724 - classification_loss: 0.2825 88/500 [====>.........................] - ETA: 1:42 - loss: 1.6533 - regression_loss: 1.3704 - classification_loss: 0.2828 89/500 [====>.........................] 
- ETA: 1:42 - loss: 1.6552 - regression_loss: 1.3721 - classification_loss: 0.2832 90/500 [====>.........................] - ETA: 1:42 - loss: 1.6571 - regression_loss: 1.3740 - classification_loss: 0.2830 91/500 [====>.........................] - ETA: 1:41 - loss: 1.6531 - regression_loss: 1.3715 - classification_loss: 0.2817 92/500 [====>.........................] - ETA: 1:41 - loss: 1.6495 - regression_loss: 1.3682 - classification_loss: 0.2813 93/500 [====>.........................] - ETA: 1:41 - loss: 1.6492 - regression_loss: 1.3691 - classification_loss: 0.2801 94/500 [====>.........................] - ETA: 1:40 - loss: 1.6483 - regression_loss: 1.3685 - classification_loss: 0.2798 95/500 [====>.........................] - ETA: 1:40 - loss: 1.6433 - regression_loss: 1.3643 - classification_loss: 0.2790 96/500 [====>.........................] - ETA: 1:40 - loss: 1.6422 - regression_loss: 1.3633 - classification_loss: 0.2789 97/500 [====>.........................] - ETA: 1:39 - loss: 1.6420 - regression_loss: 1.3627 - classification_loss: 0.2793 98/500 [====>.........................] - ETA: 1:39 - loss: 1.6449 - regression_loss: 1.3652 - classification_loss: 0.2797 99/500 [====>.........................] - ETA: 1:39 - loss: 1.6505 - regression_loss: 1.3699 - classification_loss: 0.2806 100/500 [=====>........................] - ETA: 1:39 - loss: 1.6464 - regression_loss: 1.3668 - classification_loss: 0.2797 101/500 [=====>........................] - ETA: 1:38 - loss: 1.6418 - regression_loss: 1.3632 - classification_loss: 0.2786 102/500 [=====>........................] - ETA: 1:38 - loss: 1.6360 - regression_loss: 1.3587 - classification_loss: 0.2773 103/500 [=====>........................] - ETA: 1:38 - loss: 1.6373 - regression_loss: 1.3597 - classification_loss: 0.2776 104/500 [=====>........................] - ETA: 1:38 - loss: 1.6331 - regression_loss: 1.3563 - classification_loss: 0.2768 105/500 [=====>........................] 
- ETA: 1:38 - loss: 1.6355 - regression_loss: 1.3586 - classification_loss: 0.2769 106/500 [=====>........................] - ETA: 1:37 - loss: 1.6293 - regression_loss: 1.3533 - classification_loss: 0.2760 107/500 [=====>........................] - ETA: 1:37 - loss: 1.6272 - regression_loss: 1.3519 - classification_loss: 0.2753 108/500 [=====>........................] - ETA: 1:37 - loss: 1.6279 - regression_loss: 1.3529 - classification_loss: 0.2751 109/500 [=====>........................] - ETA: 1:36 - loss: 1.6274 - regression_loss: 1.3517 - classification_loss: 0.2757 110/500 [=====>........................] - ETA: 1:36 - loss: 1.6231 - regression_loss: 1.3483 - classification_loss: 0.2748 111/500 [=====>........................] - ETA: 1:36 - loss: 1.6239 - regression_loss: 1.3491 - classification_loss: 0.2748 112/500 [=====>........................] - ETA: 1:35 - loss: 1.6224 - regression_loss: 1.3481 - classification_loss: 0.2743 113/500 [=====>........................] - ETA: 1:35 - loss: 1.6215 - regression_loss: 1.3479 - classification_loss: 0.2735 114/500 [=====>........................] - ETA: 1:35 - loss: 1.6271 - regression_loss: 1.3520 - classification_loss: 0.2751 115/500 [=====>........................] - ETA: 1:35 - loss: 1.6315 - regression_loss: 1.3560 - classification_loss: 0.2754 116/500 [=====>........................] - ETA: 1:35 - loss: 1.6324 - regression_loss: 1.3570 - classification_loss: 0.2754 117/500 [======>.......................] - ETA: 1:34 - loss: 1.6337 - regression_loss: 1.3580 - classification_loss: 0.2757 118/500 [======>.......................] - ETA: 1:34 - loss: 1.6302 - regression_loss: 1.3559 - classification_loss: 0.2743 119/500 [======>.......................] - ETA: 1:34 - loss: 1.6308 - regression_loss: 1.3566 - classification_loss: 0.2742 120/500 [======>.......................] - ETA: 1:34 - loss: 1.6326 - regression_loss: 1.3580 - classification_loss: 0.2746 121/500 [======>.......................] 
- ETA: 1:33 - loss: 1.6287 - regression_loss: 1.3554 - classification_loss: 0.2733 122/500 [======>.......................] - ETA: 1:33 - loss: 1.6291 - regression_loss: 1.3563 - classification_loss: 0.2729 123/500 [======>.......................] - ETA: 1:33 - loss: 1.6250 - regression_loss: 1.3530 - classification_loss: 0.2720 124/500 [======>.......................] - ETA: 1:33 - loss: 1.6197 - regression_loss: 1.3486 - classification_loss: 0.2712 125/500 [======>.......................] - ETA: 1:32 - loss: 1.6229 - regression_loss: 1.3512 - classification_loss: 0.2717 126/500 [======>.......................] - ETA: 1:32 - loss: 1.6151 - regression_loss: 1.3450 - classification_loss: 0.2702 127/500 [======>.......................] - ETA: 1:32 - loss: 1.6071 - regression_loss: 1.3383 - classification_loss: 0.2688 128/500 [======>.......................] - ETA: 1:32 - loss: 1.6052 - regression_loss: 1.3370 - classification_loss: 0.2681 129/500 [======>.......................] - ETA: 1:31 - loss: 1.6041 - regression_loss: 1.3362 - classification_loss: 0.2679 130/500 [======>.......................] - ETA: 1:31 - loss: 1.6046 - regression_loss: 1.3365 - classification_loss: 0.2681 131/500 [======>.......................] - ETA: 1:31 - loss: 1.6042 - regression_loss: 1.3362 - classification_loss: 0.2680 132/500 [======>.......................] - ETA: 1:31 - loss: 1.6041 - regression_loss: 1.3361 - classification_loss: 0.2680 133/500 [======>.......................] - ETA: 1:31 - loss: 1.6086 - regression_loss: 1.3402 - classification_loss: 0.2684 134/500 [=======>......................] - ETA: 1:30 - loss: 1.6119 - regression_loss: 1.3425 - classification_loss: 0.2693 135/500 [=======>......................] - ETA: 1:30 - loss: 1.6138 - regression_loss: 1.3439 - classification_loss: 0.2699 136/500 [=======>......................] - ETA: 1:30 - loss: 1.6146 - regression_loss: 1.3438 - classification_loss: 0.2708 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.6156 - regression_loss: 1.3449 - classification_loss: 0.2707 138/500 [=======>......................] - ETA: 1:29 - loss: 1.6211 - regression_loss: 1.3491 - classification_loss: 0.2720 139/500 [=======>......................] - ETA: 1:29 - loss: 1.6182 - regression_loss: 1.3465 - classification_loss: 0.2718 140/500 [=======>......................] - ETA: 1:29 - loss: 1.6169 - regression_loss: 1.3455 - classification_loss: 0.2714 141/500 [=======>......................] - ETA: 1:29 - loss: 1.6124 - regression_loss: 1.3417 - classification_loss: 0.2707 142/500 [=======>......................] - ETA: 1:28 - loss: 1.6151 - regression_loss: 1.3440 - classification_loss: 0.2711 143/500 [=======>......................] - ETA: 1:28 - loss: 1.6137 - regression_loss: 1.3427 - classification_loss: 0.2711 144/500 [=======>......................] - ETA: 1:28 - loss: 1.6153 - regression_loss: 1.3439 - classification_loss: 0.2714 145/500 [=======>......................] - ETA: 1:28 - loss: 1.6155 - regression_loss: 1.3440 - classification_loss: 0.2714 146/500 [=======>......................] - ETA: 1:27 - loss: 1.6105 - regression_loss: 1.3402 - classification_loss: 0.2703 147/500 [=======>......................] - ETA: 1:27 - loss: 1.6107 - regression_loss: 1.3403 - classification_loss: 0.2703 148/500 [=======>......................] - ETA: 1:27 - loss: 1.6066 - regression_loss: 1.3372 - classification_loss: 0.2694 149/500 [=======>......................] - ETA: 1:27 - loss: 1.6072 - regression_loss: 1.3377 - classification_loss: 0.2695 150/500 [========>.....................] - ETA: 1:26 - loss: 1.6078 - regression_loss: 1.3389 - classification_loss: 0.2690 151/500 [========>.....................] - ETA: 1:26 - loss: 1.6090 - regression_loss: 1.3397 - classification_loss: 0.2693 152/500 [========>.....................] - ETA: 1:26 - loss: 1.6107 - regression_loss: 1.3412 - classification_loss: 0.2695 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.6110 - regression_loss: 1.3416 - classification_loss: 0.2694 154/500 [========>.....................] - ETA: 1:25 - loss: 1.6117 - regression_loss: 1.3423 - classification_loss: 0.2694 155/500 [========>.....................] - ETA: 1:25 - loss: 1.6132 - regression_loss: 1.3437 - classification_loss: 0.2695 156/500 [========>.....................] - ETA: 1:25 - loss: 1.6169 - regression_loss: 1.3470 - classification_loss: 0.2699 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6108 - regression_loss: 1.3425 - classification_loss: 0.2684 158/500 [========>.....................] - ETA: 1:25 - loss: 1.6059 - regression_loss: 1.3382 - classification_loss: 0.2676 159/500 [========>.....................] - ETA: 1:24 - loss: 1.5995 - regression_loss: 1.3331 - classification_loss: 0.2664 160/500 [========>.....................] - ETA: 1:24 - loss: 1.5957 - regression_loss: 1.3301 - classification_loss: 0.2657 161/500 [========>.....................] - ETA: 1:24 - loss: 1.5968 - regression_loss: 1.3313 - classification_loss: 0.2655 162/500 [========>.....................] - ETA: 1:24 - loss: 1.5981 - regression_loss: 1.3324 - classification_loss: 0.2656 163/500 [========>.....................] - ETA: 1:23 - loss: 1.5966 - regression_loss: 1.3313 - classification_loss: 0.2653 164/500 [========>.....................] - ETA: 1:23 - loss: 1.5974 - regression_loss: 1.3318 - classification_loss: 0.2656 165/500 [========>.....................] - ETA: 1:23 - loss: 1.5985 - regression_loss: 1.3327 - classification_loss: 0.2659 166/500 [========>.....................] - ETA: 1:23 - loss: 1.5945 - regression_loss: 1.3296 - classification_loss: 0.2650 167/500 [=========>....................] - ETA: 1:22 - loss: 1.5905 - regression_loss: 1.3261 - classification_loss: 0.2644 168/500 [=========>....................] - ETA: 1:22 - loss: 1.5906 - regression_loss: 1.3265 - classification_loss: 0.2641 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.5897 - regression_loss: 1.3258 - classification_loss: 0.2639 170/500 [=========>....................] - ETA: 1:22 - loss: 1.5877 - regression_loss: 1.3244 - classification_loss: 0.2633 171/500 [=========>....................] - ETA: 1:21 - loss: 1.5885 - regression_loss: 1.3251 - classification_loss: 0.2634 172/500 [=========>....................] - ETA: 1:21 - loss: 1.5884 - regression_loss: 1.3252 - classification_loss: 0.2632 173/500 [=========>....................] - ETA: 1:21 - loss: 1.5929 - regression_loss: 1.3288 - classification_loss: 0.2641 174/500 [=========>....................] - ETA: 1:21 - loss: 1.5919 - regression_loss: 1.3281 - classification_loss: 0.2638 175/500 [=========>....................] - ETA: 1:20 - loss: 1.5910 - regression_loss: 1.3276 - classification_loss: 0.2634 176/500 [=========>....................] - ETA: 1:20 - loss: 1.5888 - regression_loss: 1.3261 - classification_loss: 0.2628 177/500 [=========>....................] - ETA: 1:20 - loss: 1.5910 - regression_loss: 1.3280 - classification_loss: 0.2630 178/500 [=========>....................] - ETA: 1:20 - loss: 1.5924 - regression_loss: 1.3291 - classification_loss: 0.2633 179/500 [=========>....................] - ETA: 1:19 - loss: 1.5926 - regression_loss: 1.3294 - classification_loss: 0.2632 180/500 [=========>....................] - ETA: 1:19 - loss: 1.5946 - regression_loss: 1.3310 - classification_loss: 0.2637 181/500 [=========>....................] - ETA: 1:19 - loss: 1.5963 - regression_loss: 1.3325 - classification_loss: 0.2639 182/500 [=========>....................] - ETA: 1:19 - loss: 1.5989 - regression_loss: 1.3350 - classification_loss: 0.2639 183/500 [=========>....................] - ETA: 1:18 - loss: 1.5973 - regression_loss: 1.3333 - classification_loss: 0.2640 184/500 [==========>...................] - ETA: 1:18 - loss: 1.6041 - regression_loss: 1.3384 - classification_loss: 0.2656 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.5993 - regression_loss: 1.3346 - classification_loss: 0.2647 186/500 [==========>...................] - ETA: 1:18 - loss: 1.5995 - regression_loss: 1.3348 - classification_loss: 0.2647 187/500 [==========>...................] - ETA: 1:17 - loss: 1.6029 - regression_loss: 1.3377 - classification_loss: 0.2652 188/500 [==========>...................] - ETA: 1:17 - loss: 1.6061 - regression_loss: 1.3398 - classification_loss: 0.2662 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6087 - regression_loss: 1.3417 - classification_loss: 0.2670 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6091 - regression_loss: 1.3419 - classification_loss: 0.2672 191/500 [==========>...................] - ETA: 1:16 - loss: 1.6067 - regression_loss: 1.3404 - classification_loss: 0.2663 192/500 [==========>...................] - ETA: 1:16 - loss: 1.6061 - regression_loss: 1.3400 - classification_loss: 0.2661 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6064 - regression_loss: 1.3403 - classification_loss: 0.2661 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6070 - regression_loss: 1.3410 - classification_loss: 0.2661 195/500 [==========>...................] - ETA: 1:15 - loss: 1.6068 - regression_loss: 1.3412 - classification_loss: 0.2656 196/500 [==========>...................] - ETA: 1:15 - loss: 1.6060 - regression_loss: 1.3406 - classification_loss: 0.2654 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6086 - regression_loss: 1.3424 - classification_loss: 0.2662 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6111 - regression_loss: 1.3443 - classification_loss: 0.2668 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6082 - regression_loss: 1.3418 - classification_loss: 0.2664 200/500 [===========>..................] - ETA: 1:14 - loss: 1.6079 - regression_loss: 1.3415 - classification_loss: 0.2664 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.6087 - regression_loss: 1.3424 - classification_loss: 0.2663 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6094 - regression_loss: 1.3432 - classification_loss: 0.2662 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6103 - regression_loss: 1.3435 - classification_loss: 0.2669 204/500 [===========>..................] - ETA: 1:13 - loss: 1.6137 - regression_loss: 1.3456 - classification_loss: 0.2681 205/500 [===========>..................] - ETA: 1:13 - loss: 1.6157 - regression_loss: 1.3467 - classification_loss: 0.2690 206/500 [===========>..................] - ETA: 1:13 - loss: 1.6119 - regression_loss: 1.3432 - classification_loss: 0.2686 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6143 - regression_loss: 1.3456 - classification_loss: 0.2688 208/500 [===========>..................] - ETA: 1:12 - loss: 1.6164 - regression_loss: 1.3476 - classification_loss: 0.2687 209/500 [===========>..................] - ETA: 1:12 - loss: 1.6153 - regression_loss: 1.3471 - classification_loss: 0.2682 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6169 - regression_loss: 1.3477 - classification_loss: 0.2692 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6179 - regression_loss: 1.3486 - classification_loss: 0.2693 212/500 [===========>..................] - ETA: 1:11 - loss: 1.6193 - regression_loss: 1.3496 - classification_loss: 0.2697 213/500 [===========>..................] - ETA: 1:11 - loss: 1.6197 - regression_loss: 1.3497 - classification_loss: 0.2700 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6196 - regression_loss: 1.3495 - classification_loss: 0.2701 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6186 - regression_loss: 1.3486 - classification_loss: 0.2700 216/500 [===========>..................] - ETA: 1:10 - loss: 1.6208 - regression_loss: 1.3502 - classification_loss: 0.2706 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.6188 - regression_loss: 1.3487 - classification_loss: 0.2701 218/500 [============>.................] - ETA: 1:10 - loss: 1.6205 - regression_loss: 1.3500 - classification_loss: 0.2704 219/500 [============>.................] - ETA: 1:10 - loss: 1.6192 - regression_loss: 1.3495 - classification_loss: 0.2697 220/500 [============>.................] - ETA: 1:09 - loss: 1.6201 - regression_loss: 1.3505 - classification_loss: 0.2696 221/500 [============>.................] - ETA: 1:09 - loss: 1.6206 - regression_loss: 1.3511 - classification_loss: 0.2694 222/500 [============>.................] - ETA: 1:09 - loss: 1.6191 - regression_loss: 1.3501 - classification_loss: 0.2690 223/500 [============>.................] - ETA: 1:09 - loss: 1.6193 - regression_loss: 1.3501 - classification_loss: 0.2692 224/500 [============>.................] - ETA: 1:08 - loss: 1.6197 - regression_loss: 1.3502 - classification_loss: 0.2695 225/500 [============>.................] - ETA: 1:08 - loss: 1.6201 - regression_loss: 1.3505 - classification_loss: 0.2696 226/500 [============>.................] - ETA: 1:08 - loss: 1.6238 - regression_loss: 1.3526 - classification_loss: 0.2711 227/500 [============>.................] - ETA: 1:08 - loss: 1.6246 - regression_loss: 1.3533 - classification_loss: 0.2714 228/500 [============>.................] - ETA: 1:07 - loss: 1.6241 - regression_loss: 1.3529 - classification_loss: 0.2712 229/500 [============>.................] - ETA: 1:07 - loss: 1.6238 - regression_loss: 1.3527 - classification_loss: 0.2710 230/500 [============>.................] - ETA: 1:07 - loss: 1.6243 - regression_loss: 1.3532 - classification_loss: 0.2711 231/500 [============>.................] - ETA: 1:07 - loss: 1.6258 - regression_loss: 1.3548 - classification_loss: 0.2711 232/500 [============>.................] - ETA: 1:06 - loss: 1.6240 - regression_loss: 1.3532 - classification_loss: 0.2709 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.6206 - regression_loss: 1.3505 - classification_loss: 0.2701 234/500 [=============>................] - ETA: 1:06 - loss: 1.6201 - regression_loss: 1.3503 - classification_loss: 0.2698 235/500 [=============>................] - ETA: 1:06 - loss: 1.6170 - regression_loss: 1.3479 - classification_loss: 0.2690 236/500 [=============>................] - ETA: 1:05 - loss: 1.6135 - regression_loss: 1.3451 - classification_loss: 0.2685 237/500 [=============>................] - ETA: 1:05 - loss: 1.6116 - regression_loss: 1.3433 - classification_loss: 0.2684 238/500 [=============>................] - ETA: 1:05 - loss: 1.6129 - regression_loss: 1.3440 - classification_loss: 0.2688 239/500 [=============>................] - ETA: 1:05 - loss: 1.6148 - regression_loss: 1.3456 - classification_loss: 0.2692 240/500 [=============>................] - ETA: 1:04 - loss: 1.6148 - regression_loss: 1.3455 - classification_loss: 0.2693 241/500 [=============>................] - ETA: 1:04 - loss: 1.6150 - regression_loss: 1.3454 - classification_loss: 0.2696 242/500 [=============>................] - ETA: 1:04 - loss: 1.6163 - regression_loss: 1.3467 - classification_loss: 0.2696 243/500 [=============>................] - ETA: 1:04 - loss: 1.6160 - regression_loss: 1.3460 - classification_loss: 0.2700 244/500 [=============>................] - ETA: 1:03 - loss: 1.6176 - regression_loss: 1.3472 - classification_loss: 0.2704 245/500 [=============>................] - ETA: 1:03 - loss: 1.6171 - regression_loss: 1.3465 - classification_loss: 0.2706 246/500 [=============>................] - ETA: 1:03 - loss: 1.6190 - regression_loss: 1.3477 - classification_loss: 0.2713 247/500 [=============>................] - ETA: 1:03 - loss: 1.6174 - regression_loss: 1.3463 - classification_loss: 0.2711 248/500 [=============>................] - ETA: 1:02 - loss: 1.6165 - regression_loss: 1.3461 - classification_loss: 0.2704 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.6183 - regression_loss: 1.3479 - classification_loss: 0.2705 250/500 [==============>...............] - ETA: 1:02 - loss: 1.6197 - regression_loss: 1.3492 - classification_loss: 0.2706 251/500 [==============>...............] - ETA: 1:02 - loss: 1.6204 - regression_loss: 1.3498 - classification_loss: 0.2706 252/500 [==============>...............] - ETA: 1:01 - loss: 1.6218 - regression_loss: 1.3511 - classification_loss: 0.2706 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6209 - regression_loss: 1.3505 - classification_loss: 0.2705 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6181 - regression_loss: 1.3481 - classification_loss: 0.2700 255/500 [==============>...............] - ETA: 1:01 - loss: 1.6153 - regression_loss: 1.3456 - classification_loss: 0.2698 256/500 [==============>...............] - ETA: 1:00 - loss: 1.6178 - regression_loss: 1.3475 - classification_loss: 0.2703 257/500 [==============>...............] - ETA: 1:00 - loss: 1.6193 - regression_loss: 1.3486 - classification_loss: 0.2706 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6159 - regression_loss: 1.3461 - classification_loss: 0.2698 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6155 - regression_loss: 1.3460 - classification_loss: 0.2696 260/500 [==============>...............] - ETA: 59s - loss: 1.6144 - regression_loss: 1.3450 - classification_loss: 0.2695  261/500 [==============>...............] - ETA: 59s - loss: 1.6146 - regression_loss: 1.3451 - classification_loss: 0.2696 262/500 [==============>...............] - ETA: 59s - loss: 1.6156 - regression_loss: 1.3458 - classification_loss: 0.2698 263/500 [==============>...............] - ETA: 59s - loss: 1.6161 - regression_loss: 1.3463 - classification_loss: 0.2698 264/500 [==============>...............] - ETA: 58s - loss: 1.6154 - regression_loss: 1.3459 - classification_loss: 0.2695 265/500 [==============>...............] 
[epoch 86: per-batch progress updates for steps 266-499 elided; running loss held near 1.60 throughout]
500/500 [==============================] - 125s 249ms/step - loss: 1.5990 - regression_loss: 1.3342 - classification_loss: 0.2649
1172 instances of class plum with average precision: 0.6538
mAP: 0.6538
Epoch 00086: saving model to ./training/snapshots/resnet50_pascal_86.h5
Epoch 87/150
[epoch 87: per-batch progress updates for steps 1-100 elided; running loss settled near 1.60 after the first few batches]
- ETA: 1:40 - loss: 1.6092 - regression_loss: 1.3394 - classification_loss: 0.2698 101/500 [=====>........................] - ETA: 1:39 - loss: 1.6117 - regression_loss: 1.3418 - classification_loss: 0.2699 102/500 [=====>........................] - ETA: 1:39 - loss: 1.6164 - regression_loss: 1.3460 - classification_loss: 0.2703 103/500 [=====>........................] - ETA: 1:39 - loss: 1.6161 - regression_loss: 1.3462 - classification_loss: 0.2698 104/500 [=====>........................] - ETA: 1:39 - loss: 1.6138 - regression_loss: 1.3442 - classification_loss: 0.2696 105/500 [=====>........................] - ETA: 1:38 - loss: 1.6156 - regression_loss: 1.3457 - classification_loss: 0.2699 106/500 [=====>........................] - ETA: 1:38 - loss: 1.6119 - regression_loss: 1.3425 - classification_loss: 0.2694 107/500 [=====>........................] - ETA: 1:38 - loss: 1.6090 - regression_loss: 1.3397 - classification_loss: 0.2693 108/500 [=====>........................] - ETA: 1:38 - loss: 1.6138 - regression_loss: 1.3427 - classification_loss: 0.2710 109/500 [=====>........................] - ETA: 1:37 - loss: 1.6196 - regression_loss: 1.3475 - classification_loss: 0.2720 110/500 [=====>........................] - ETA: 1:37 - loss: 1.6146 - regression_loss: 1.3437 - classification_loss: 0.2708 111/500 [=====>........................] - ETA: 1:37 - loss: 1.6143 - regression_loss: 1.3433 - classification_loss: 0.2710 112/500 [=====>........................] - ETA: 1:37 - loss: 1.6079 - regression_loss: 1.3384 - classification_loss: 0.2695 113/500 [=====>........................] - ETA: 1:36 - loss: 1.6096 - regression_loss: 1.3397 - classification_loss: 0.2699 114/500 [=====>........................] - ETA: 1:36 - loss: 1.6115 - regression_loss: 1.3414 - classification_loss: 0.2701 115/500 [=====>........................] - ETA: 1:36 - loss: 1.6138 - regression_loss: 1.3432 - classification_loss: 0.2705 116/500 [=====>........................] 
- ETA: 1:36 - loss: 1.6137 - regression_loss: 1.3433 - classification_loss: 0.2704 117/500 [======>.......................] - ETA: 1:35 - loss: 1.6176 - regression_loss: 1.3470 - classification_loss: 0.2706 118/500 [======>.......................] - ETA: 1:35 - loss: 1.6108 - regression_loss: 1.3408 - classification_loss: 0.2699 119/500 [======>.......................] - ETA: 1:35 - loss: 1.6108 - regression_loss: 1.3416 - classification_loss: 0.2692 120/500 [======>.......................] - ETA: 1:35 - loss: 1.6102 - regression_loss: 1.3414 - classification_loss: 0.2688 121/500 [======>.......................] - ETA: 1:34 - loss: 1.6100 - regression_loss: 1.3409 - classification_loss: 0.2692 122/500 [======>.......................] - ETA: 1:34 - loss: 1.6115 - regression_loss: 1.3422 - classification_loss: 0.2693 123/500 [======>.......................] - ETA: 1:34 - loss: 1.6036 - regression_loss: 1.3359 - classification_loss: 0.2677 124/500 [======>.......................] - ETA: 1:33 - loss: 1.6046 - regression_loss: 1.3370 - classification_loss: 0.2676 125/500 [======>.......................] - ETA: 1:33 - loss: 1.6046 - regression_loss: 1.3371 - classification_loss: 0.2675 126/500 [======>.......................] - ETA: 1:33 - loss: 1.6021 - regression_loss: 1.3356 - classification_loss: 0.2665 127/500 [======>.......................] - ETA: 1:32 - loss: 1.5975 - regression_loss: 1.3320 - classification_loss: 0.2655 128/500 [======>.......................] - ETA: 1:32 - loss: 1.5948 - regression_loss: 1.3301 - classification_loss: 0.2647 129/500 [======>.......................] - ETA: 1:32 - loss: 1.5905 - regression_loss: 1.3268 - classification_loss: 0.2637 130/500 [======>.......................] - ETA: 1:32 - loss: 1.5923 - regression_loss: 1.3282 - classification_loss: 0.2642 131/500 [======>.......................] - ETA: 1:31 - loss: 1.5956 - regression_loss: 1.3308 - classification_loss: 0.2647 132/500 [======>.......................] 
- ETA: 1:31 - loss: 1.5935 - regression_loss: 1.3292 - classification_loss: 0.2644 133/500 [======>.......................] - ETA: 1:31 - loss: 1.5971 - regression_loss: 1.3318 - classification_loss: 0.2653 134/500 [=======>......................] - ETA: 1:31 - loss: 1.5954 - regression_loss: 1.3301 - classification_loss: 0.2653 135/500 [=======>......................] - ETA: 1:30 - loss: 1.5964 - regression_loss: 1.3311 - classification_loss: 0.2653 136/500 [=======>......................] - ETA: 1:30 - loss: 1.5999 - regression_loss: 1.3334 - classification_loss: 0.2665 137/500 [=======>......................] - ETA: 1:30 - loss: 1.6008 - regression_loss: 1.3337 - classification_loss: 0.2671 138/500 [=======>......................] - ETA: 1:30 - loss: 1.6026 - regression_loss: 1.3354 - classification_loss: 0.2672 139/500 [=======>......................] - ETA: 1:29 - loss: 1.6018 - regression_loss: 1.3349 - classification_loss: 0.2669 140/500 [=======>......................] - ETA: 1:29 - loss: 1.5983 - regression_loss: 1.3316 - classification_loss: 0.2667 141/500 [=======>......................] - ETA: 1:29 - loss: 1.5993 - regression_loss: 1.3327 - classification_loss: 0.2666 142/500 [=======>......................] - ETA: 1:29 - loss: 1.6026 - regression_loss: 1.3354 - classification_loss: 0.2672 143/500 [=======>......................] - ETA: 1:28 - loss: 1.6016 - regression_loss: 1.3350 - classification_loss: 0.2666 144/500 [=======>......................] - ETA: 1:28 - loss: 1.5969 - regression_loss: 1.3315 - classification_loss: 0.2654 145/500 [=======>......................] - ETA: 1:28 - loss: 1.5909 - regression_loss: 1.3266 - classification_loss: 0.2643 146/500 [=======>......................] - ETA: 1:28 - loss: 1.5953 - regression_loss: 1.3296 - classification_loss: 0.2657 147/500 [=======>......................] - ETA: 1:28 - loss: 1.5978 - regression_loss: 1.3318 - classification_loss: 0.2660 148/500 [=======>......................] 
- ETA: 1:27 - loss: 1.5944 - regression_loss: 1.3290 - classification_loss: 0.2654 149/500 [=======>......................] - ETA: 1:27 - loss: 1.5959 - regression_loss: 1.3301 - classification_loss: 0.2658 150/500 [========>.....................] - ETA: 1:27 - loss: 1.5971 - regression_loss: 1.3310 - classification_loss: 0.2662 151/500 [========>.....................] - ETA: 1:27 - loss: 1.5988 - regression_loss: 1.3324 - classification_loss: 0.2664 152/500 [========>.....................] - ETA: 1:26 - loss: 1.6000 - regression_loss: 1.3334 - classification_loss: 0.2667 153/500 [========>.....................] - ETA: 1:26 - loss: 1.5982 - regression_loss: 1.3320 - classification_loss: 0.2661 154/500 [========>.....................] - ETA: 1:26 - loss: 1.5961 - regression_loss: 1.3306 - classification_loss: 0.2655 155/500 [========>.....................] - ETA: 1:26 - loss: 1.5890 - regression_loss: 1.3248 - classification_loss: 0.2642 156/500 [========>.....................] - ETA: 1:25 - loss: 1.5872 - regression_loss: 1.3232 - classification_loss: 0.2640 157/500 [========>.....................] - ETA: 1:25 - loss: 1.5867 - regression_loss: 1.3229 - classification_loss: 0.2638 158/500 [========>.....................] - ETA: 1:25 - loss: 1.5873 - regression_loss: 1.3236 - classification_loss: 0.2637 159/500 [========>.....................] - ETA: 1:25 - loss: 1.5864 - regression_loss: 1.3223 - classification_loss: 0.2642 160/500 [========>.....................] - ETA: 1:24 - loss: 1.5868 - regression_loss: 1.3227 - classification_loss: 0.2641 161/500 [========>.....................] - ETA: 1:24 - loss: 1.5871 - regression_loss: 1.3237 - classification_loss: 0.2634 162/500 [========>.....................] - ETA: 1:24 - loss: 1.5885 - regression_loss: 1.3248 - classification_loss: 0.2636 163/500 [========>.....................] - ETA: 1:24 - loss: 1.5887 - regression_loss: 1.3253 - classification_loss: 0.2635 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.5969 - regression_loss: 1.3323 - classification_loss: 0.2647 165/500 [========>.....................] - ETA: 1:23 - loss: 1.5963 - regression_loss: 1.3319 - classification_loss: 0.2643 166/500 [========>.....................] - ETA: 1:23 - loss: 1.5994 - regression_loss: 1.3340 - classification_loss: 0.2654 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6033 - regression_loss: 1.3340 - classification_loss: 0.2693 168/500 [=========>....................] - ETA: 1:22 - loss: 1.6034 - regression_loss: 1.3342 - classification_loss: 0.2692 169/500 [=========>....................] - ETA: 1:22 - loss: 1.6031 - regression_loss: 1.3337 - classification_loss: 0.2693 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6095 - regression_loss: 1.3390 - classification_loss: 0.2704 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6072 - regression_loss: 1.3373 - classification_loss: 0.2699 172/500 [=========>....................] - ETA: 1:21 - loss: 1.6082 - regression_loss: 1.3380 - classification_loss: 0.2702 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6034 - regression_loss: 1.3344 - classification_loss: 0.2690 174/500 [=========>....................] - ETA: 1:21 - loss: 1.6057 - regression_loss: 1.3358 - classification_loss: 0.2699 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6060 - regression_loss: 1.3363 - classification_loss: 0.2697 176/500 [=========>....................] - ETA: 1:20 - loss: 1.6090 - regression_loss: 1.3388 - classification_loss: 0.2702 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6097 - regression_loss: 1.3392 - classification_loss: 0.2706 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6112 - regression_loss: 1.3399 - classification_loss: 0.2713 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6077 - regression_loss: 1.3372 - classification_loss: 0.2705 180/500 [=========>....................] 
- ETA: 1:19 - loss: 1.6104 - regression_loss: 1.3398 - classification_loss: 0.2706 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6093 - regression_loss: 1.3386 - classification_loss: 0.2706 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6105 - regression_loss: 1.3400 - classification_loss: 0.2705 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6055 - regression_loss: 1.3362 - classification_loss: 0.2693 184/500 [==========>...................] - ETA: 1:18 - loss: 1.6041 - regression_loss: 1.3353 - classification_loss: 0.2688 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6043 - regression_loss: 1.3355 - classification_loss: 0.2688 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6044 - regression_loss: 1.3355 - classification_loss: 0.2688 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6063 - regression_loss: 1.3374 - classification_loss: 0.2689 188/500 [==========>...................] - ETA: 1:17 - loss: 1.6019 - regression_loss: 1.3335 - classification_loss: 0.2684 189/500 [==========>...................] - ETA: 1:17 - loss: 1.5976 - regression_loss: 1.3297 - classification_loss: 0.2679 190/500 [==========>...................] - ETA: 1:17 - loss: 1.6007 - regression_loss: 1.3320 - classification_loss: 0.2687 191/500 [==========>...................] - ETA: 1:17 - loss: 1.5998 - regression_loss: 1.3312 - classification_loss: 0.2685 192/500 [==========>...................] - ETA: 1:16 - loss: 1.6034 - regression_loss: 1.3338 - classification_loss: 0.2696 193/500 [==========>...................] - ETA: 1:16 - loss: 1.6047 - regression_loss: 1.3352 - classification_loss: 0.2696 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6049 - regression_loss: 1.3356 - classification_loss: 0.2694 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6054 - regression_loss: 1.3359 - classification_loss: 0.2695 196/500 [==========>...................] 
- ETA: 1:15 - loss: 1.6081 - regression_loss: 1.3381 - classification_loss: 0.2701 197/500 [==========>...................] - ETA: 1:15 - loss: 1.6046 - regression_loss: 1.3352 - classification_loss: 0.2693 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5999 - regression_loss: 1.3313 - classification_loss: 0.2686 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6057 - regression_loss: 1.3356 - classification_loss: 0.2701 200/500 [===========>..................] - ETA: 1:14 - loss: 1.6062 - regression_loss: 1.3362 - classification_loss: 0.2701 201/500 [===========>..................] - ETA: 1:14 - loss: 1.6050 - regression_loss: 1.3352 - classification_loss: 0.2697 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5988 - regression_loss: 1.3300 - classification_loss: 0.2688 203/500 [===========>..................] - ETA: 1:14 - loss: 1.5980 - regression_loss: 1.3293 - classification_loss: 0.2687 204/500 [===========>..................] - ETA: 1:14 - loss: 1.5983 - regression_loss: 1.3295 - classification_loss: 0.2688 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5960 - regression_loss: 1.3275 - classification_loss: 0.2685 206/500 [===========>..................] - ETA: 1:13 - loss: 1.5966 - regression_loss: 1.3280 - classification_loss: 0.2686 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5948 - regression_loss: 1.3265 - classification_loss: 0.2683 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5918 - regression_loss: 1.3243 - classification_loss: 0.2675 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5929 - regression_loss: 1.3252 - classification_loss: 0.2676 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5952 - regression_loss: 1.3268 - classification_loss: 0.2684 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5939 - regression_loss: 1.3259 - classification_loss: 0.2679 212/500 [===========>..................] 
- ETA: 1:11 - loss: 1.5948 - regression_loss: 1.3268 - classification_loss: 0.2680 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5947 - regression_loss: 1.3268 - classification_loss: 0.2679 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5965 - regression_loss: 1.3282 - classification_loss: 0.2683 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5915 - regression_loss: 1.3242 - classification_loss: 0.2673 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5933 - regression_loss: 1.3257 - classification_loss: 0.2676 217/500 [============>.................] - ETA: 1:10 - loss: 1.5934 - regression_loss: 1.3258 - classification_loss: 0.2676 218/500 [============>.................] - ETA: 1:10 - loss: 1.5938 - regression_loss: 1.3263 - classification_loss: 0.2674 219/500 [============>.................] - ETA: 1:10 - loss: 1.5928 - regression_loss: 1.3256 - classification_loss: 0.2672 220/500 [============>.................] - ETA: 1:10 - loss: 1.5908 - regression_loss: 1.3242 - classification_loss: 0.2666 221/500 [============>.................] - ETA: 1:09 - loss: 1.5908 - regression_loss: 1.3242 - classification_loss: 0.2666 222/500 [============>.................] - ETA: 1:09 - loss: 1.5905 - regression_loss: 1.3239 - classification_loss: 0.2665 223/500 [============>.................] - ETA: 1:09 - loss: 1.5923 - regression_loss: 1.3253 - classification_loss: 0.2669 224/500 [============>.................] - ETA: 1:09 - loss: 1.5912 - regression_loss: 1.3245 - classification_loss: 0.2667 225/500 [============>.................] - ETA: 1:08 - loss: 1.5939 - regression_loss: 1.3265 - classification_loss: 0.2674 226/500 [============>.................] - ETA: 1:08 - loss: 1.5931 - regression_loss: 1.3257 - classification_loss: 0.2673 227/500 [============>.................] - ETA: 1:08 - loss: 1.5935 - regression_loss: 1.3255 - classification_loss: 0.2680 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.5941 - regression_loss: 1.3259 - classification_loss: 0.2682 229/500 [============>.................] - ETA: 1:07 - loss: 1.5945 - regression_loss: 1.3259 - classification_loss: 0.2686 230/500 [============>.................] - ETA: 1:07 - loss: 1.5947 - regression_loss: 1.3262 - classification_loss: 0.2685 231/500 [============>.................] - ETA: 1:07 - loss: 1.5937 - regression_loss: 1.3253 - classification_loss: 0.2684 232/500 [============>.................] - ETA: 1:07 - loss: 1.5916 - regression_loss: 1.3236 - classification_loss: 0.2680 233/500 [============>.................] - ETA: 1:06 - loss: 1.5910 - regression_loss: 1.3230 - classification_loss: 0.2680 234/500 [=============>................] - ETA: 1:06 - loss: 1.5922 - regression_loss: 1.3240 - classification_loss: 0.2682 235/500 [=============>................] - ETA: 1:06 - loss: 1.5918 - regression_loss: 1.3233 - classification_loss: 0.2685 236/500 [=============>................] - ETA: 1:06 - loss: 1.5912 - regression_loss: 1.3228 - classification_loss: 0.2684 237/500 [=============>................] - ETA: 1:05 - loss: 1.5912 - regression_loss: 1.3230 - classification_loss: 0.2682 238/500 [=============>................] - ETA: 1:05 - loss: 1.5913 - regression_loss: 1.3233 - classification_loss: 0.2681 239/500 [=============>................] - ETA: 1:05 - loss: 1.5928 - regression_loss: 1.3246 - classification_loss: 0.2682 240/500 [=============>................] - ETA: 1:05 - loss: 1.5955 - regression_loss: 1.3267 - classification_loss: 0.2688 241/500 [=============>................] - ETA: 1:04 - loss: 1.5968 - regression_loss: 1.3274 - classification_loss: 0.2694 242/500 [=============>................] - ETA: 1:04 - loss: 1.5942 - regression_loss: 1.3253 - classification_loss: 0.2689 243/500 [=============>................] - ETA: 1:04 - loss: 1.5962 - regression_loss: 1.3264 - classification_loss: 0.2699 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.5975 - regression_loss: 1.3275 - classification_loss: 0.2700 245/500 [=============>................] - ETA: 1:03 - loss: 1.5965 - regression_loss: 1.3265 - classification_loss: 0.2700 246/500 [=============>................] - ETA: 1:03 - loss: 1.5965 - regression_loss: 1.3267 - classification_loss: 0.2699 247/500 [=============>................] - ETA: 1:03 - loss: 1.5971 - regression_loss: 1.3270 - classification_loss: 0.2701 248/500 [=============>................] - ETA: 1:03 - loss: 1.5955 - regression_loss: 1.3259 - classification_loss: 0.2696 249/500 [=============>................] - ETA: 1:02 - loss: 1.5971 - regression_loss: 1.3271 - classification_loss: 0.2700 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5994 - regression_loss: 1.3290 - classification_loss: 0.2705 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5989 - regression_loss: 1.3285 - classification_loss: 0.2704 252/500 [==============>...............] - ETA: 1:02 - loss: 1.6014 - regression_loss: 1.3307 - classification_loss: 0.2706 253/500 [==============>...............] - ETA: 1:01 - loss: 1.6015 - regression_loss: 1.3309 - classification_loss: 0.2706 254/500 [==============>...............] - ETA: 1:01 - loss: 1.6022 - regression_loss: 1.3314 - classification_loss: 0.2708 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5981 - regression_loss: 1.3280 - classification_loss: 0.2701 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5981 - regression_loss: 1.3281 - classification_loss: 0.2701 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5969 - regression_loss: 1.3268 - classification_loss: 0.2700 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5977 - regression_loss: 1.3278 - classification_loss: 0.2700 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5963 - regression_loss: 1.3266 - classification_loss: 0.2697 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.5950 - regression_loss: 1.3255 - classification_loss: 0.2696 261/500 [==============>...............] - ETA: 59s - loss: 1.5936 - regression_loss: 1.3247 - classification_loss: 0.2689  262/500 [==============>...............] - ETA: 59s - loss: 1.5913 - regression_loss: 1.3228 - classification_loss: 0.2685 263/500 [==============>...............] - ETA: 59s - loss: 1.5905 - regression_loss: 1.3223 - classification_loss: 0.2682 264/500 [==============>...............] - ETA: 59s - loss: 1.5923 - regression_loss: 1.3237 - classification_loss: 0.2686 265/500 [==============>...............] - ETA: 58s - loss: 1.5922 - regression_loss: 1.3238 - classification_loss: 0.2684 266/500 [==============>...............] - ETA: 58s - loss: 1.5922 - regression_loss: 1.3239 - classification_loss: 0.2683 267/500 [===============>..............] - ETA: 58s - loss: 1.5923 - regression_loss: 1.3242 - classification_loss: 0.2681 268/500 [===============>..............] - ETA: 58s - loss: 1.5928 - regression_loss: 1.3248 - classification_loss: 0.2680 269/500 [===============>..............] - ETA: 57s - loss: 1.5905 - regression_loss: 1.3228 - classification_loss: 0.2677 270/500 [===============>..............] - ETA: 57s - loss: 1.5891 - regression_loss: 1.3215 - classification_loss: 0.2676 271/500 [===============>..............] - ETA: 57s - loss: 1.5899 - regression_loss: 1.3224 - classification_loss: 0.2675 272/500 [===============>..............] - ETA: 57s - loss: 1.5909 - regression_loss: 1.3232 - classification_loss: 0.2677 273/500 [===============>..............] - ETA: 56s - loss: 1.5906 - regression_loss: 1.3232 - classification_loss: 0.2674 274/500 [===============>..............] - ETA: 56s - loss: 1.5912 - regression_loss: 1.3238 - classification_loss: 0.2675 275/500 [===============>..............] - ETA: 56s - loss: 1.5931 - regression_loss: 1.3256 - classification_loss: 0.2676 276/500 [===============>..............] 
- ETA: 56s - loss: 1.5922 - regression_loss: 1.3243 - classification_loss: 0.2678 277/500 [===============>..............] - ETA: 55s - loss: 1.5927 - regression_loss: 1.3249 - classification_loss: 0.2678 278/500 [===============>..............] - ETA: 55s - loss: 1.5928 - regression_loss: 1.3250 - classification_loss: 0.2679 279/500 [===============>..............] - ETA: 55s - loss: 1.5926 - regression_loss: 1.3245 - classification_loss: 0.2680 280/500 [===============>..............] - ETA: 55s - loss: 1.5925 - regression_loss: 1.3246 - classification_loss: 0.2679 281/500 [===============>..............] - ETA: 54s - loss: 1.5916 - regression_loss: 1.3241 - classification_loss: 0.2675 282/500 [===============>..............] - ETA: 54s - loss: 1.5918 - regression_loss: 1.3243 - classification_loss: 0.2675 283/500 [===============>..............] - ETA: 54s - loss: 1.5922 - regression_loss: 1.3239 - classification_loss: 0.2683 284/500 [================>.............] - ETA: 54s - loss: 1.5933 - regression_loss: 1.3252 - classification_loss: 0.2681 285/500 [================>.............] - ETA: 53s - loss: 1.5923 - regression_loss: 1.3246 - classification_loss: 0.2678 286/500 [================>.............] - ETA: 53s - loss: 1.5921 - regression_loss: 1.3242 - classification_loss: 0.2679 287/500 [================>.............] - ETA: 53s - loss: 1.5948 - regression_loss: 1.3265 - classification_loss: 0.2683 288/500 [================>.............] - ETA: 53s - loss: 1.5966 - regression_loss: 1.3280 - classification_loss: 0.2686 289/500 [================>.............] - ETA: 52s - loss: 1.5966 - regression_loss: 1.3281 - classification_loss: 0.2685 290/500 [================>.............] - ETA: 52s - loss: 1.5954 - regression_loss: 1.3273 - classification_loss: 0.2681 291/500 [================>.............] - ETA: 52s - loss: 1.5981 - regression_loss: 1.3290 - classification_loss: 0.2691 292/500 [================>.............] 
- ETA: 52s - loss: 1.5996 - regression_loss: 1.3301 - classification_loss: 0.2694 293/500 [================>.............] - ETA: 51s - loss: 1.5996 - regression_loss: 1.3297 - classification_loss: 0.2699 294/500 [================>.............] - ETA: 51s - loss: 1.6015 - regression_loss: 1.3306 - classification_loss: 0.2709 295/500 [================>.............] - ETA: 51s - loss: 1.6017 - regression_loss: 1.3309 - classification_loss: 0.2708 296/500 [================>.............] - ETA: 51s - loss: 1.6021 - regression_loss: 1.3312 - classification_loss: 0.2709 297/500 [================>.............] - ETA: 50s - loss: 1.5994 - regression_loss: 1.3290 - classification_loss: 0.2704 298/500 [================>.............] - ETA: 50s - loss: 1.6002 - regression_loss: 1.3294 - classification_loss: 0.2708 299/500 [================>.............] - ETA: 50s - loss: 1.5991 - regression_loss: 1.3286 - classification_loss: 0.2705 300/500 [=================>............] - ETA: 50s - loss: 1.6003 - regression_loss: 1.3297 - classification_loss: 0.2706 301/500 [=================>............] - ETA: 49s - loss: 1.5990 - regression_loss: 1.3287 - classification_loss: 0.2703 302/500 [=================>............] - ETA: 49s - loss: 1.5990 - regression_loss: 1.3286 - classification_loss: 0.2704 303/500 [=================>............] - ETA: 49s - loss: 1.5971 - regression_loss: 1.3267 - classification_loss: 0.2704 304/500 [=================>............] - ETA: 49s - loss: 1.5983 - regression_loss: 1.3276 - classification_loss: 0.2708 305/500 [=================>............] - ETA: 48s - loss: 1.5950 - regression_loss: 1.3248 - classification_loss: 0.2702 306/500 [=================>............] - ETA: 48s - loss: 1.5953 - regression_loss: 1.3248 - classification_loss: 0.2704 307/500 [=================>............] - ETA: 48s - loss: 1.5959 - regression_loss: 1.3254 - classification_loss: 0.2705 308/500 [=================>............] 
500/500 [==============================] - 125s 251ms/step - loss: 1.5954 - regression_loss: 1.3267 - classification_loss: 0.2687
1172 instances of class plum with average precision: 0.6650
mAP: 0.6650
Epoch 00087: saving model to ./training/snapshots/resnet50_pascal_87.h5
Epoch 88/150
[per-batch progress output omitted]
- ETA: 1:29 - loss: 1.6212 - regression_loss: 1.3207 - classification_loss: 0.3004 143/500 [=======>......................] - ETA: 1:29 - loss: 1.6192 - regression_loss: 1.3197 - classification_loss: 0.2995 144/500 [=======>......................] - ETA: 1:29 - loss: 1.6238 - regression_loss: 1.3233 - classification_loss: 0.3005 145/500 [=======>......................] - ETA: 1:29 - loss: 1.6207 - regression_loss: 1.3220 - classification_loss: 0.2987 146/500 [=======>......................] - ETA: 1:28 - loss: 1.6205 - regression_loss: 1.3222 - classification_loss: 0.2983 147/500 [=======>......................] - ETA: 1:28 - loss: 1.6204 - regression_loss: 1.3222 - classification_loss: 0.2982 148/500 [=======>......................] - ETA: 1:28 - loss: 1.6193 - regression_loss: 1.3215 - classification_loss: 0.2978 149/500 [=======>......................] - ETA: 1:28 - loss: 1.6197 - regression_loss: 1.3222 - classification_loss: 0.2976 150/500 [========>.....................] - ETA: 1:27 - loss: 1.6193 - regression_loss: 1.3214 - classification_loss: 0.2979 151/500 [========>.....................] - ETA: 1:27 - loss: 1.6165 - regression_loss: 1.3195 - classification_loss: 0.2970 152/500 [========>.....................] - ETA: 1:27 - loss: 1.6217 - regression_loss: 1.3243 - classification_loss: 0.2974 153/500 [========>.....................] - ETA: 1:26 - loss: 1.6201 - regression_loss: 1.3230 - classification_loss: 0.2972 154/500 [========>.....................] - ETA: 1:26 - loss: 1.6243 - regression_loss: 1.3267 - classification_loss: 0.2977 155/500 [========>.....................] - ETA: 1:26 - loss: 1.6299 - regression_loss: 1.3313 - classification_loss: 0.2986 156/500 [========>.....................] - ETA: 1:25 - loss: 1.6313 - regression_loss: 1.3327 - classification_loss: 0.2986 157/500 [========>.....................] - ETA: 1:25 - loss: 1.6312 - regression_loss: 1.3328 - classification_loss: 0.2984 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.6330 - regression_loss: 1.3346 - classification_loss: 0.2984 159/500 [========>.....................] - ETA: 1:25 - loss: 1.6314 - regression_loss: 1.3335 - classification_loss: 0.2979 160/500 [========>.....................] - ETA: 1:24 - loss: 1.6302 - regression_loss: 1.3329 - classification_loss: 0.2974 161/500 [========>.....................] - ETA: 1:24 - loss: 1.6276 - regression_loss: 1.3310 - classification_loss: 0.2966 162/500 [========>.....................] - ETA: 1:24 - loss: 1.6251 - regression_loss: 1.3288 - classification_loss: 0.2962 163/500 [========>.....................] - ETA: 1:24 - loss: 1.6220 - regression_loss: 1.3264 - classification_loss: 0.2956 164/500 [========>.....................] - ETA: 1:23 - loss: 1.6248 - regression_loss: 1.3289 - classification_loss: 0.2959 165/500 [========>.....................] - ETA: 1:23 - loss: 1.6243 - regression_loss: 1.3288 - classification_loss: 0.2954 166/500 [========>.....................] - ETA: 1:23 - loss: 1.6251 - regression_loss: 1.3305 - classification_loss: 0.2946 167/500 [=========>....................] - ETA: 1:23 - loss: 1.6268 - regression_loss: 1.3320 - classification_loss: 0.2948 168/500 [=========>....................] - ETA: 1:22 - loss: 1.6244 - regression_loss: 1.3302 - classification_loss: 0.2942 169/500 [=========>....................] - ETA: 1:22 - loss: 1.6255 - regression_loss: 1.3315 - classification_loss: 0.2941 170/500 [=========>....................] - ETA: 1:22 - loss: 1.6243 - regression_loss: 1.3310 - classification_loss: 0.2933 171/500 [=========>....................] - ETA: 1:22 - loss: 1.6259 - regression_loss: 1.3322 - classification_loss: 0.2936 172/500 [=========>....................] - ETA: 1:22 - loss: 1.6257 - regression_loss: 1.3323 - classification_loss: 0.2934 173/500 [=========>....................] - ETA: 1:21 - loss: 1.6258 - regression_loss: 1.3327 - classification_loss: 0.2931 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.6244 - regression_loss: 1.3319 - classification_loss: 0.2925 175/500 [=========>....................] - ETA: 1:21 - loss: 1.6264 - regression_loss: 1.3334 - classification_loss: 0.2929 176/500 [=========>....................] - ETA: 1:21 - loss: 1.6225 - regression_loss: 1.3301 - classification_loss: 0.2924 177/500 [=========>....................] - ETA: 1:20 - loss: 1.6253 - regression_loss: 1.3327 - classification_loss: 0.2925 178/500 [=========>....................] - ETA: 1:20 - loss: 1.6241 - regression_loss: 1.3317 - classification_loss: 0.2924 179/500 [=========>....................] - ETA: 1:20 - loss: 1.6215 - regression_loss: 1.3296 - classification_loss: 0.2919 180/500 [=========>....................] - ETA: 1:20 - loss: 1.6224 - regression_loss: 1.3305 - classification_loss: 0.2920 181/500 [=========>....................] - ETA: 1:19 - loss: 1.6224 - regression_loss: 1.3307 - classification_loss: 0.2917 182/500 [=========>....................] - ETA: 1:19 - loss: 1.6204 - regression_loss: 1.3293 - classification_loss: 0.2911 183/500 [=========>....................] - ETA: 1:19 - loss: 1.6203 - regression_loss: 1.3292 - classification_loss: 0.2912 184/500 [==========>...................] - ETA: 1:19 - loss: 1.6155 - regression_loss: 1.3255 - classification_loss: 0.2900 185/500 [==========>...................] - ETA: 1:18 - loss: 1.6141 - regression_loss: 1.3246 - classification_loss: 0.2895 186/500 [==========>...................] - ETA: 1:18 - loss: 1.6139 - regression_loss: 1.3247 - classification_loss: 0.2892 187/500 [==========>...................] - ETA: 1:18 - loss: 1.6138 - regression_loss: 1.3253 - classification_loss: 0.2885 188/500 [==========>...................] - ETA: 1:18 - loss: 1.6140 - regression_loss: 1.3257 - classification_loss: 0.2883 189/500 [==========>...................] - ETA: 1:17 - loss: 1.6144 - regression_loss: 1.3264 - classification_loss: 0.2880 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.6150 - regression_loss: 1.3268 - classification_loss: 0.2882 191/500 [==========>...................] - ETA: 1:17 - loss: 1.6144 - regression_loss: 1.3265 - classification_loss: 0.2879 192/500 [==========>...................] - ETA: 1:17 - loss: 1.6162 - regression_loss: 1.3280 - classification_loss: 0.2881 193/500 [==========>...................] - ETA: 1:17 - loss: 1.6168 - regression_loss: 1.3287 - classification_loss: 0.2881 194/500 [==========>...................] - ETA: 1:16 - loss: 1.6176 - regression_loss: 1.3296 - classification_loss: 0.2881 195/500 [==========>...................] - ETA: 1:16 - loss: 1.6190 - regression_loss: 1.3308 - classification_loss: 0.2882 196/500 [==========>...................] - ETA: 1:16 - loss: 1.6183 - regression_loss: 1.3304 - classification_loss: 0.2879 197/500 [==========>...................] - ETA: 1:16 - loss: 1.6156 - regression_loss: 1.3284 - classification_loss: 0.2873 198/500 [==========>...................] - ETA: 1:15 - loss: 1.6158 - regression_loss: 1.3286 - classification_loss: 0.2872 199/500 [==========>...................] - ETA: 1:15 - loss: 1.6166 - regression_loss: 1.3294 - classification_loss: 0.2872 200/500 [===========>..................] - ETA: 1:15 - loss: 1.6149 - regression_loss: 1.3281 - classification_loss: 0.2867 201/500 [===========>..................] - ETA: 1:15 - loss: 1.6146 - regression_loss: 1.3282 - classification_loss: 0.2863 202/500 [===========>..................] - ETA: 1:14 - loss: 1.6117 - regression_loss: 1.3259 - classification_loss: 0.2858 203/500 [===========>..................] - ETA: 1:14 - loss: 1.6072 - regression_loss: 1.3223 - classification_loss: 0.2849 204/500 [===========>..................] - ETA: 1:14 - loss: 1.6046 - regression_loss: 1.3205 - classification_loss: 0.2841 205/500 [===========>..................] - ETA: 1:14 - loss: 1.6059 - regression_loss: 1.3218 - classification_loss: 0.2841 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.6056 - regression_loss: 1.3217 - classification_loss: 0.2839 207/500 [===========>..................] - ETA: 1:13 - loss: 1.6088 - regression_loss: 1.3244 - classification_loss: 0.2844 208/500 [===========>..................] - ETA: 1:13 - loss: 1.6103 - regression_loss: 1.3249 - classification_loss: 0.2854 209/500 [===========>..................] - ETA: 1:13 - loss: 1.6114 - regression_loss: 1.3260 - classification_loss: 0.2854 210/500 [===========>..................] - ETA: 1:12 - loss: 1.6121 - regression_loss: 1.3265 - classification_loss: 0.2856 211/500 [===========>..................] - ETA: 1:12 - loss: 1.6134 - regression_loss: 1.3273 - classification_loss: 0.2860 212/500 [===========>..................] - ETA: 1:12 - loss: 1.6124 - regression_loss: 1.3266 - classification_loss: 0.2858 213/500 [===========>..................] - ETA: 1:12 - loss: 1.6135 - regression_loss: 1.3278 - classification_loss: 0.2857 214/500 [===========>..................] - ETA: 1:11 - loss: 1.6112 - regression_loss: 1.3261 - classification_loss: 0.2851 215/500 [===========>..................] - ETA: 1:11 - loss: 1.6093 - regression_loss: 1.3249 - classification_loss: 0.2844 216/500 [===========>..................] - ETA: 1:11 - loss: 1.6072 - regression_loss: 1.3231 - classification_loss: 0.2840 217/500 [============>.................] - ETA: 1:11 - loss: 1.6087 - regression_loss: 1.3243 - classification_loss: 0.2844 218/500 [============>.................] - ETA: 1:10 - loss: 1.6069 - regression_loss: 1.3232 - classification_loss: 0.2838 219/500 [============>.................] - ETA: 1:10 - loss: 1.6076 - regression_loss: 1.3238 - classification_loss: 0.2838 220/500 [============>.................] - ETA: 1:10 - loss: 1.6077 - regression_loss: 1.3240 - classification_loss: 0.2838 221/500 [============>.................] - ETA: 1:10 - loss: 1.6080 - regression_loss: 1.3244 - classification_loss: 0.2836 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.6092 - regression_loss: 1.3255 - classification_loss: 0.2837 223/500 [============>.................] - ETA: 1:09 - loss: 1.6096 - regression_loss: 1.3260 - classification_loss: 0.2836 224/500 [============>.................] - ETA: 1:09 - loss: 1.6119 - regression_loss: 1.3279 - classification_loss: 0.2840 225/500 [============>.................] - ETA: 1:09 - loss: 1.6132 - regression_loss: 1.3292 - classification_loss: 0.2840 226/500 [============>.................] - ETA: 1:08 - loss: 1.6145 - regression_loss: 1.3284 - classification_loss: 0.2861 227/500 [============>.................] - ETA: 1:08 - loss: 1.6140 - regression_loss: 1.3277 - classification_loss: 0.2864 228/500 [============>.................] - ETA: 1:08 - loss: 1.6136 - regression_loss: 1.3274 - classification_loss: 0.2861 229/500 [============>.................] - ETA: 1:08 - loss: 1.6146 - regression_loss: 1.3284 - classification_loss: 0.2862 230/500 [============>.................] - ETA: 1:07 - loss: 1.6118 - regression_loss: 1.3264 - classification_loss: 0.2854 231/500 [============>.................] - ETA: 1:07 - loss: 1.6092 - regression_loss: 1.3243 - classification_loss: 0.2849 232/500 [============>.................] - ETA: 1:07 - loss: 1.6071 - regression_loss: 1.3223 - classification_loss: 0.2849 233/500 [============>.................] - ETA: 1:07 - loss: 1.6058 - regression_loss: 1.3213 - classification_loss: 0.2845 234/500 [=============>................] - ETA: 1:06 - loss: 1.6065 - regression_loss: 1.3224 - classification_loss: 0.2842 235/500 [=============>................] - ETA: 1:06 - loss: 1.6030 - regression_loss: 1.3197 - classification_loss: 0.2833 236/500 [=============>................] - ETA: 1:06 - loss: 1.6038 - regression_loss: 1.3206 - classification_loss: 0.2832 237/500 [=============>................] - ETA: 1:06 - loss: 1.6029 - regression_loss: 1.3198 - classification_loss: 0.2830 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.6007 - regression_loss: 1.3185 - classification_loss: 0.2823 239/500 [=============>................] - ETA: 1:05 - loss: 1.5986 - regression_loss: 1.3168 - classification_loss: 0.2819 240/500 [=============>................] - ETA: 1:05 - loss: 1.6009 - regression_loss: 1.3187 - classification_loss: 0.2821 241/500 [=============>................] - ETA: 1:05 - loss: 1.6013 - regression_loss: 1.3192 - classification_loss: 0.2822 242/500 [=============>................] - ETA: 1:04 - loss: 1.6038 - regression_loss: 1.3216 - classification_loss: 0.2822 243/500 [=============>................] - ETA: 1:04 - loss: 1.6025 - regression_loss: 1.3204 - classification_loss: 0.2821 244/500 [=============>................] - ETA: 1:04 - loss: 1.6006 - regression_loss: 1.3191 - classification_loss: 0.2815 245/500 [=============>................] - ETA: 1:04 - loss: 1.6015 - regression_loss: 1.3197 - classification_loss: 0.2818 246/500 [=============>................] - ETA: 1:03 - loss: 1.6006 - regression_loss: 1.3191 - classification_loss: 0.2814 247/500 [=============>................] - ETA: 1:03 - loss: 1.6004 - regression_loss: 1.3190 - classification_loss: 0.2814 248/500 [=============>................] - ETA: 1:03 - loss: 1.6005 - regression_loss: 1.3192 - classification_loss: 0.2813 249/500 [=============>................] - ETA: 1:03 - loss: 1.5966 - regression_loss: 1.3162 - classification_loss: 0.2804 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5966 - regression_loss: 1.3162 - classification_loss: 0.2804 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5982 - regression_loss: 1.3176 - classification_loss: 0.2807 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5950 - regression_loss: 1.3151 - classification_loss: 0.2799 253/500 [==============>...............] - ETA: 1:02 - loss: 1.5970 - regression_loss: 1.3167 - classification_loss: 0.2802 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.5975 - regression_loss: 1.3174 - classification_loss: 0.2801 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5965 - regression_loss: 1.3166 - classification_loss: 0.2799 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5992 - regression_loss: 1.3191 - classification_loss: 0.2801 257/500 [==============>...............] - ETA: 1:01 - loss: 1.5992 - regression_loss: 1.3193 - classification_loss: 0.2799 258/500 [==============>...............] - ETA: 1:00 - loss: 1.6006 - regression_loss: 1.3208 - classification_loss: 0.2798 259/500 [==============>...............] - ETA: 1:00 - loss: 1.6015 - regression_loss: 1.3216 - classification_loss: 0.2799 260/500 [==============>...............] - ETA: 1:00 - loss: 1.6012 - regression_loss: 1.3216 - classification_loss: 0.2796 261/500 [==============>...............] - ETA: 1:00 - loss: 1.5998 - regression_loss: 1.3207 - classification_loss: 0.2791 262/500 [==============>...............] - ETA: 59s - loss: 1.6010 - regression_loss: 1.3213 - classification_loss: 0.2797  263/500 [==============>...............] - ETA: 59s - loss: 1.6022 - regression_loss: 1.3222 - classification_loss: 0.2800 264/500 [==============>...............] - ETA: 59s - loss: 1.6019 - regression_loss: 1.3221 - classification_loss: 0.2798 265/500 [==============>...............] - ETA: 59s - loss: 1.6032 - regression_loss: 1.3227 - classification_loss: 0.2804 266/500 [==============>...............] - ETA: 58s - loss: 1.6008 - regression_loss: 1.3209 - classification_loss: 0.2799 267/500 [===============>..............] - ETA: 58s - loss: 1.5968 - regression_loss: 1.3177 - classification_loss: 0.2792 268/500 [===============>..............] - ETA: 58s - loss: 1.5948 - regression_loss: 1.3162 - classification_loss: 0.2786 269/500 [===============>..............] - ETA: 58s - loss: 1.5962 - regression_loss: 1.3173 - classification_loss: 0.2789 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5940 - regression_loss: 1.3156 - classification_loss: 0.2784 271/500 [===============>..............] - ETA: 57s - loss: 1.5944 - regression_loss: 1.3161 - classification_loss: 0.2783 272/500 [===============>..............] - ETA: 57s - loss: 1.5951 - regression_loss: 1.3170 - classification_loss: 0.2781 273/500 [===============>..............] - ETA: 57s - loss: 1.5930 - regression_loss: 1.3154 - classification_loss: 0.2775 274/500 [===============>..............] - ETA: 56s - loss: 1.5923 - regression_loss: 1.3151 - classification_loss: 0.2772 275/500 [===============>..............] - ETA: 56s - loss: 1.5923 - regression_loss: 1.3152 - classification_loss: 0.2771 276/500 [===============>..............] - ETA: 56s - loss: 1.5909 - regression_loss: 1.3143 - classification_loss: 0.2766 277/500 [===============>..............] - ETA: 56s - loss: 1.5909 - regression_loss: 1.3142 - classification_loss: 0.2767 278/500 [===============>..............] - ETA: 55s - loss: 1.5884 - regression_loss: 1.3123 - classification_loss: 0.2760 279/500 [===============>..............] - ETA: 55s - loss: 1.5885 - regression_loss: 1.3126 - classification_loss: 0.2759 280/500 [===============>..............] - ETA: 55s - loss: 1.5877 - regression_loss: 1.3123 - classification_loss: 0.2755 281/500 [===============>..............] - ETA: 55s - loss: 1.5853 - regression_loss: 1.3105 - classification_loss: 0.2748 282/500 [===============>..............] - ETA: 54s - loss: 1.5843 - regression_loss: 1.3099 - classification_loss: 0.2744 283/500 [===============>..............] - ETA: 54s - loss: 1.5860 - regression_loss: 1.3113 - classification_loss: 0.2748 284/500 [================>.............] - ETA: 54s - loss: 1.5854 - regression_loss: 1.3110 - classification_loss: 0.2745 285/500 [================>.............] - ETA: 54s - loss: 1.5832 - regression_loss: 1.3093 - classification_loss: 0.2739 286/500 [================>.............] 
- ETA: 53s - loss: 1.5831 - regression_loss: 1.3090 - classification_loss: 0.2741 287/500 [================>.............] - ETA: 53s - loss: 1.5835 - regression_loss: 1.3095 - classification_loss: 0.2740 288/500 [================>.............] - ETA: 53s - loss: 1.5849 - regression_loss: 1.3105 - classification_loss: 0.2744 289/500 [================>.............] - ETA: 53s - loss: 1.5849 - regression_loss: 1.3107 - classification_loss: 0.2742 290/500 [================>.............] - ETA: 52s - loss: 1.5822 - regression_loss: 1.3086 - classification_loss: 0.2736 291/500 [================>.............] - ETA: 52s - loss: 1.5833 - regression_loss: 1.3095 - classification_loss: 0.2738 292/500 [================>.............] - ETA: 52s - loss: 1.5817 - regression_loss: 1.3082 - classification_loss: 0.2735 293/500 [================>.............] - ETA: 52s - loss: 1.5821 - regression_loss: 1.3086 - classification_loss: 0.2736 294/500 [================>.............] - ETA: 51s - loss: 1.5830 - regression_loss: 1.3094 - classification_loss: 0.2736 295/500 [================>.............] - ETA: 51s - loss: 1.5835 - regression_loss: 1.3099 - classification_loss: 0.2736 296/500 [================>.............] - ETA: 51s - loss: 1.5819 - regression_loss: 1.3087 - classification_loss: 0.2733 297/500 [================>.............] - ETA: 51s - loss: 1.5820 - regression_loss: 1.3088 - classification_loss: 0.2732 298/500 [================>.............] - ETA: 50s - loss: 1.5823 - regression_loss: 1.3089 - classification_loss: 0.2734 299/500 [================>.............] - ETA: 50s - loss: 1.5827 - regression_loss: 1.3095 - classification_loss: 0.2733 300/500 [=================>............] - ETA: 50s - loss: 1.5846 - regression_loss: 1.3111 - classification_loss: 0.2735 301/500 [=================>............] - ETA: 50s - loss: 1.5850 - regression_loss: 1.3116 - classification_loss: 0.2734 302/500 [=================>............] 
- ETA: 49s - loss: 1.5873 - regression_loss: 1.3133 - classification_loss: 0.2740 303/500 [=================>............] - ETA: 49s - loss: 1.5886 - regression_loss: 1.3141 - classification_loss: 0.2744 304/500 [=================>............] - ETA: 49s - loss: 1.5880 - regression_loss: 1.3138 - classification_loss: 0.2743 305/500 [=================>............] - ETA: 49s - loss: 1.5874 - regression_loss: 1.3135 - classification_loss: 0.2739 306/500 [=================>............] - ETA: 48s - loss: 1.5878 - regression_loss: 1.3139 - classification_loss: 0.2740 307/500 [=================>............] - ETA: 48s - loss: 1.5886 - regression_loss: 1.3147 - classification_loss: 0.2739 308/500 [=================>............] - ETA: 48s - loss: 1.5886 - regression_loss: 1.3147 - classification_loss: 0.2738 309/500 [=================>............] - ETA: 48s - loss: 1.5885 - regression_loss: 1.3148 - classification_loss: 0.2736 310/500 [=================>............] - ETA: 47s - loss: 1.5885 - regression_loss: 1.3150 - classification_loss: 0.2735 311/500 [=================>............] - ETA: 47s - loss: 1.5868 - regression_loss: 1.3135 - classification_loss: 0.2733 312/500 [=================>............] - ETA: 47s - loss: 1.5876 - regression_loss: 1.3140 - classification_loss: 0.2736 313/500 [=================>............] - ETA: 47s - loss: 1.5885 - regression_loss: 1.3144 - classification_loss: 0.2741 314/500 [=================>............] - ETA: 46s - loss: 1.5899 - regression_loss: 1.3151 - classification_loss: 0.2748 315/500 [=================>............] - ETA: 46s - loss: 1.5902 - regression_loss: 1.3156 - classification_loss: 0.2746 316/500 [=================>............] - ETA: 46s - loss: 1.5896 - regression_loss: 1.3154 - classification_loss: 0.2742 317/500 [==================>...........] - ETA: 46s - loss: 1.5901 - regression_loss: 1.3158 - classification_loss: 0.2743 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5899 - regression_loss: 1.3157 - classification_loss: 0.2742 319/500 [==================>...........] - ETA: 45s - loss: 1.5897 - regression_loss: 1.3157 - classification_loss: 0.2740 320/500 [==================>...........] - ETA: 45s - loss: 1.5905 - regression_loss: 1.3161 - classification_loss: 0.2744 321/500 [==================>...........] - ETA: 45s - loss: 1.5905 - regression_loss: 1.3162 - classification_loss: 0.2743 322/500 [==================>...........] - ETA: 44s - loss: 1.5895 - regression_loss: 1.3154 - classification_loss: 0.2741 323/500 [==================>...........] - ETA: 44s - loss: 1.5867 - regression_loss: 1.3131 - classification_loss: 0.2736 324/500 [==================>...........] - ETA: 44s - loss: 1.5851 - regression_loss: 1.3116 - classification_loss: 0.2735 325/500 [==================>...........] - ETA: 44s - loss: 1.5849 - regression_loss: 1.3115 - classification_loss: 0.2734 326/500 [==================>...........] - ETA: 43s - loss: 1.5844 - regression_loss: 1.3110 - classification_loss: 0.2734 327/500 [==================>...........] - ETA: 43s - loss: 1.5823 - regression_loss: 1.3093 - classification_loss: 0.2730 328/500 [==================>...........] - ETA: 43s - loss: 1.5835 - regression_loss: 1.3102 - classification_loss: 0.2733 329/500 [==================>...........] - ETA: 42s - loss: 1.5856 - regression_loss: 1.3116 - classification_loss: 0.2739 330/500 [==================>...........] - ETA: 42s - loss: 1.5880 - regression_loss: 1.3138 - classification_loss: 0.2741 331/500 [==================>...........] - ETA: 42s - loss: 1.5890 - regression_loss: 1.3148 - classification_loss: 0.2742 332/500 [==================>...........] - ETA: 42s - loss: 1.5903 - regression_loss: 1.3158 - classification_loss: 0.2745 333/500 [==================>...........] - ETA: 41s - loss: 1.5884 - regression_loss: 1.3141 - classification_loss: 0.2743 334/500 [===================>..........] 
- ETA: 41s - loss: 1.5885 - regression_loss: 1.3143 - classification_loss: 0.2742 335/500 [===================>..........] - ETA: 41s - loss: 1.5890 - regression_loss: 1.3147 - classification_loss: 0.2743 336/500 [===================>..........] - ETA: 41s - loss: 1.5864 - regression_loss: 1.3127 - classification_loss: 0.2737 337/500 [===================>..........] - ETA: 40s - loss: 1.5865 - regression_loss: 1.3129 - classification_loss: 0.2736 338/500 [===================>..........] - ETA: 40s - loss: 1.5870 - regression_loss: 1.3135 - classification_loss: 0.2735 339/500 [===================>..........] - ETA: 40s - loss: 1.5842 - regression_loss: 1.3112 - classification_loss: 0.2730 340/500 [===================>..........] - ETA: 40s - loss: 1.5842 - regression_loss: 1.3112 - classification_loss: 0.2730 341/500 [===================>..........] - ETA: 39s - loss: 1.5866 - regression_loss: 1.3137 - classification_loss: 0.2730 342/500 [===================>..........] - ETA: 39s - loss: 1.5872 - regression_loss: 1.3142 - classification_loss: 0.2730 343/500 [===================>..........] - ETA: 39s - loss: 1.5868 - regression_loss: 1.3141 - classification_loss: 0.2727 344/500 [===================>..........] - ETA: 39s - loss: 1.5882 - regression_loss: 1.3153 - classification_loss: 0.2730 345/500 [===================>..........] - ETA: 38s - loss: 1.5876 - regression_loss: 1.3148 - classification_loss: 0.2729 346/500 [===================>..........] - ETA: 38s - loss: 1.5844 - regression_loss: 1.3120 - classification_loss: 0.2724 347/500 [===================>..........] - ETA: 38s - loss: 1.5850 - regression_loss: 1.3126 - classification_loss: 0.2725 348/500 [===================>..........] - ETA: 38s - loss: 1.5847 - regression_loss: 1.3125 - classification_loss: 0.2723 349/500 [===================>..........] - ETA: 37s - loss: 1.5843 - regression_loss: 1.3122 - classification_loss: 0.2722 350/500 [====================>.........] 
- ETA: 37s - loss: 1.5822 - regression_loss: 1.3105 - classification_loss: 0.2717
[per-batch progress for steps 351–499 of epoch 88 omitted; loss held near 1.58 (regression ≈ 1.31, classification ≈ 0.27)]
500/500 [==============================] - 126s 251ms/step - loss: 1.5800 - regression_loss: 1.3119 - classification_loss: 0.2682
1172 instances of class plum with average precision: 0.6507
mAP: 0.6507
Epoch 00088: saving model to ./training/snapshots/resnet50_pascal_88.h5
Epoch 89/150
1/500 [..............................] - ETA: 1:55 - loss: 1.5007 - regression_loss: 1.2616 - classification_loss: 0.2391
[per-batch progress for steps 2–8 of epoch 89 omitted]
9/500 [..............................]
- ETA: 2:01 - loss: 1.5658 - regression_loss: 1.2974 - classification_loss: 0.2684
[per-batch progress for steps 10–184 of epoch 89 omitted; loss settled near 1.58 (regression ≈ 1.32, classification ≈ 0.26)]
185/500 [==========>...................]
- ETA: 1:18 - loss: 1.5750 - regression_loss: 1.3130 - classification_loss: 0.2620 186/500 [==========>...................] - ETA: 1:18 - loss: 1.5725 - regression_loss: 1.3114 - classification_loss: 0.2611 187/500 [==========>...................] - ETA: 1:17 - loss: 1.5740 - regression_loss: 1.3125 - classification_loss: 0.2615 188/500 [==========>...................] - ETA: 1:17 - loss: 1.5707 - regression_loss: 1.3096 - classification_loss: 0.2611 189/500 [==========>...................] - ETA: 1:17 - loss: 1.5701 - regression_loss: 1.3090 - classification_loss: 0.2611 190/500 [==========>...................] - ETA: 1:17 - loss: 1.5697 - regression_loss: 1.3088 - classification_loss: 0.2609 191/500 [==========>...................] - ETA: 1:16 - loss: 1.5769 - regression_loss: 1.3143 - classification_loss: 0.2626 192/500 [==========>...................] - ETA: 1:16 - loss: 1.5784 - regression_loss: 1.3155 - classification_loss: 0.2629 193/500 [==========>...................] - ETA: 1:16 - loss: 1.5791 - regression_loss: 1.3158 - classification_loss: 0.2632 194/500 [==========>...................] - ETA: 1:16 - loss: 1.5800 - regression_loss: 1.3165 - classification_loss: 0.2635 195/500 [==========>...................] - ETA: 1:15 - loss: 1.5803 - regression_loss: 1.3165 - classification_loss: 0.2638 196/500 [==========>...................] - ETA: 1:15 - loss: 1.5799 - regression_loss: 1.3162 - classification_loss: 0.2637 197/500 [==========>...................] - ETA: 1:15 - loss: 1.5804 - regression_loss: 1.3161 - classification_loss: 0.2643 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5775 - regression_loss: 1.3134 - classification_loss: 0.2641 199/500 [==========>...................] - ETA: 1:14 - loss: 1.5796 - regression_loss: 1.3153 - classification_loss: 0.2642 200/500 [===========>..................] - ETA: 1:14 - loss: 1.5779 - regression_loss: 1.3132 - classification_loss: 0.2647 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.5780 - regression_loss: 1.3131 - classification_loss: 0.2649 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5806 - regression_loss: 1.3149 - classification_loss: 0.2657 203/500 [===========>..................] - ETA: 1:13 - loss: 1.5799 - regression_loss: 1.3145 - classification_loss: 0.2654 204/500 [===========>..................] - ETA: 1:13 - loss: 1.5810 - regression_loss: 1.3154 - classification_loss: 0.2656 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5770 - regression_loss: 1.3120 - classification_loss: 0.2650 206/500 [===========>..................] - ETA: 1:13 - loss: 1.5776 - regression_loss: 1.3127 - classification_loss: 0.2649 207/500 [===========>..................] - ETA: 1:12 - loss: 1.5827 - regression_loss: 1.3162 - classification_loss: 0.2666 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5792 - regression_loss: 1.3135 - classification_loss: 0.2657 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5767 - regression_loss: 1.3115 - classification_loss: 0.2652 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5757 - regression_loss: 1.3112 - classification_loss: 0.2644 211/500 [===========>..................] - ETA: 1:11 - loss: 1.5756 - regression_loss: 1.3112 - classification_loss: 0.2644 212/500 [===========>..................] - ETA: 1:11 - loss: 1.5781 - regression_loss: 1.3134 - classification_loss: 0.2647 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5795 - regression_loss: 1.3144 - classification_loss: 0.2651 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5792 - regression_loss: 1.3142 - classification_loss: 0.2650 215/500 [===========>..................] - ETA: 1:10 - loss: 1.5756 - regression_loss: 1.3111 - classification_loss: 0.2644 216/500 [===========>..................] - ETA: 1:10 - loss: 1.5747 - regression_loss: 1.3106 - classification_loss: 0.2641 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.5741 - regression_loss: 1.3103 - classification_loss: 0.2638 218/500 [============>.................] - ETA: 1:10 - loss: 1.5740 - regression_loss: 1.3103 - classification_loss: 0.2638 219/500 [============>.................] - ETA: 1:09 - loss: 1.5741 - regression_loss: 1.3106 - classification_loss: 0.2635 220/500 [============>.................] - ETA: 1:09 - loss: 1.5733 - regression_loss: 1.3104 - classification_loss: 0.2630 221/500 [============>.................] - ETA: 1:09 - loss: 1.5749 - regression_loss: 1.3115 - classification_loss: 0.2634 222/500 [============>.................] - ETA: 1:09 - loss: 1.5772 - regression_loss: 1.3133 - classification_loss: 0.2639 223/500 [============>.................] - ETA: 1:08 - loss: 1.5774 - regression_loss: 1.3134 - classification_loss: 0.2639 224/500 [============>.................] - ETA: 1:08 - loss: 1.5778 - regression_loss: 1.3138 - classification_loss: 0.2640 225/500 [============>.................] - ETA: 1:08 - loss: 1.5766 - regression_loss: 1.3127 - classification_loss: 0.2639 226/500 [============>.................] - ETA: 1:08 - loss: 1.5732 - regression_loss: 1.3101 - classification_loss: 0.2630 227/500 [============>.................] - ETA: 1:07 - loss: 1.5702 - regression_loss: 1.3075 - classification_loss: 0.2627 228/500 [============>.................] - ETA: 1:07 - loss: 1.5711 - regression_loss: 1.3082 - classification_loss: 0.2629 229/500 [============>.................] - ETA: 1:07 - loss: 1.5723 - regression_loss: 1.3091 - classification_loss: 0.2632 230/500 [============>.................] - ETA: 1:07 - loss: 1.5737 - regression_loss: 1.3100 - classification_loss: 0.2636 231/500 [============>.................] - ETA: 1:06 - loss: 1.5706 - regression_loss: 1.3072 - classification_loss: 0.2633 232/500 [============>.................] - ETA: 1:06 - loss: 1.5706 - regression_loss: 1.3072 - classification_loss: 0.2635 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.5712 - regression_loss: 1.3078 - classification_loss: 0.2634 234/500 [=============>................] - ETA: 1:06 - loss: 1.5737 - regression_loss: 1.3096 - classification_loss: 0.2641 235/500 [=============>................] - ETA: 1:05 - loss: 1.5707 - regression_loss: 1.3071 - classification_loss: 0.2636 236/500 [=============>................] - ETA: 1:05 - loss: 1.5739 - regression_loss: 1.3095 - classification_loss: 0.2644 237/500 [=============>................] - ETA: 1:05 - loss: 1.5749 - regression_loss: 1.3102 - classification_loss: 0.2646 238/500 [=============>................] - ETA: 1:05 - loss: 1.5713 - regression_loss: 1.3073 - classification_loss: 0.2640 239/500 [=============>................] - ETA: 1:04 - loss: 1.5709 - regression_loss: 1.3070 - classification_loss: 0.2639 240/500 [=============>................] - ETA: 1:04 - loss: 1.5716 - regression_loss: 1.3076 - classification_loss: 0.2640 241/500 [=============>................] - ETA: 1:04 - loss: 1.5719 - regression_loss: 1.3081 - classification_loss: 0.2639 242/500 [=============>................] - ETA: 1:04 - loss: 1.5734 - regression_loss: 1.3096 - classification_loss: 0.2638 243/500 [=============>................] - ETA: 1:03 - loss: 1.5739 - regression_loss: 1.3105 - classification_loss: 0.2634 244/500 [=============>................] - ETA: 1:03 - loss: 1.5755 - regression_loss: 1.3115 - classification_loss: 0.2640 245/500 [=============>................] - ETA: 1:03 - loss: 1.5719 - regression_loss: 1.3088 - classification_loss: 0.2631 246/500 [=============>................] - ETA: 1:03 - loss: 1.5720 - regression_loss: 1.3090 - classification_loss: 0.2630 247/500 [=============>................] - ETA: 1:02 - loss: 1.5739 - regression_loss: 1.3105 - classification_loss: 0.2634 248/500 [=============>................] - ETA: 1:02 - loss: 1.5746 - regression_loss: 1.3108 - classification_loss: 0.2638 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.5751 - regression_loss: 1.3113 - classification_loss: 0.2638 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5717 - regression_loss: 1.3084 - classification_loss: 0.2634 251/500 [==============>...............] - ETA: 1:01 - loss: 1.5697 - regression_loss: 1.3067 - classification_loss: 0.2630 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5677 - regression_loss: 1.3052 - classification_loss: 0.2625 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5638 - regression_loss: 1.3023 - classification_loss: 0.2616 254/500 [==============>...............] - ETA: 1:01 - loss: 1.5646 - regression_loss: 1.3029 - classification_loss: 0.2617 255/500 [==============>...............] - ETA: 1:00 - loss: 1.5655 - regression_loss: 1.3036 - classification_loss: 0.2620 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5674 - regression_loss: 1.3050 - classification_loss: 0.2624 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5676 - regression_loss: 1.3052 - classification_loss: 0.2624 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5694 - regression_loss: 1.3069 - classification_loss: 0.2625 259/500 [==============>...............] - ETA: 59s - loss: 1.5702 - regression_loss: 1.3077 - classification_loss: 0.2626  260/500 [==============>...............] - ETA: 59s - loss: 1.5724 - regression_loss: 1.3095 - classification_loss: 0.2629 261/500 [==============>...............] - ETA: 59s - loss: 1.5735 - regression_loss: 1.3102 - classification_loss: 0.2633 262/500 [==============>...............] - ETA: 59s - loss: 1.5740 - regression_loss: 1.3106 - classification_loss: 0.2634 263/500 [==============>...............] - ETA: 58s - loss: 1.5707 - regression_loss: 1.3079 - classification_loss: 0.2628 264/500 [==============>...............] - ETA: 58s - loss: 1.5704 - regression_loss: 1.3076 - classification_loss: 0.2627 265/500 [==============>...............] 
- ETA: 58s - loss: 1.5664 - regression_loss: 1.3044 - classification_loss: 0.2620 266/500 [==============>...............] - ETA: 58s - loss: 1.5666 - regression_loss: 1.3045 - classification_loss: 0.2621 267/500 [===============>..............] - ETA: 58s - loss: 1.5686 - regression_loss: 1.3061 - classification_loss: 0.2625 268/500 [===============>..............] - ETA: 57s - loss: 1.5686 - regression_loss: 1.3062 - classification_loss: 0.2624 269/500 [===============>..............] - ETA: 57s - loss: 1.5678 - regression_loss: 1.3058 - classification_loss: 0.2620 270/500 [===============>..............] - ETA: 57s - loss: 1.5684 - regression_loss: 1.3063 - classification_loss: 0.2621 271/500 [===============>..............] - ETA: 57s - loss: 1.5674 - regression_loss: 1.3055 - classification_loss: 0.2619 272/500 [===============>..............] - ETA: 56s - loss: 1.5655 - regression_loss: 1.3040 - classification_loss: 0.2615 273/500 [===============>..............] - ETA: 56s - loss: 1.5657 - regression_loss: 1.3042 - classification_loss: 0.2615 274/500 [===============>..............] - ETA: 56s - loss: 1.5662 - regression_loss: 1.3046 - classification_loss: 0.2616 275/500 [===============>..............] - ETA: 56s - loss: 1.5658 - regression_loss: 1.3045 - classification_loss: 0.2613 276/500 [===============>..............] - ETA: 55s - loss: 1.5667 - regression_loss: 1.3052 - classification_loss: 0.2615 277/500 [===============>..............] - ETA: 55s - loss: 1.5686 - regression_loss: 1.3069 - classification_loss: 0.2616 278/500 [===============>..............] - ETA: 55s - loss: 1.5685 - regression_loss: 1.3063 - classification_loss: 0.2622 279/500 [===============>..............] - ETA: 55s - loss: 1.5674 - regression_loss: 1.3054 - classification_loss: 0.2620 280/500 [===============>..............] - ETA: 54s - loss: 1.5681 - regression_loss: 1.3059 - classification_loss: 0.2622 281/500 [===============>..............] 
- ETA: 54s - loss: 1.5708 - regression_loss: 1.3081 - classification_loss: 0.2627 282/500 [===============>..............] - ETA: 54s - loss: 1.5744 - regression_loss: 1.3110 - classification_loss: 0.2635 283/500 [===============>..............] - ETA: 54s - loss: 1.5753 - regression_loss: 1.3120 - classification_loss: 0.2633 284/500 [================>.............] - ETA: 53s - loss: 1.5744 - regression_loss: 1.3114 - classification_loss: 0.2630 285/500 [================>.............] - ETA: 53s - loss: 1.5738 - regression_loss: 1.3107 - classification_loss: 0.2630 286/500 [================>.............] - ETA: 53s - loss: 1.5738 - regression_loss: 1.3108 - classification_loss: 0.2630 287/500 [================>.............] - ETA: 53s - loss: 1.5722 - regression_loss: 1.3093 - classification_loss: 0.2629 288/500 [================>.............] - ETA: 52s - loss: 1.5705 - regression_loss: 1.3076 - classification_loss: 0.2630 289/500 [================>.............] - ETA: 52s - loss: 1.5729 - regression_loss: 1.3082 - classification_loss: 0.2648 290/500 [================>.............] - ETA: 52s - loss: 1.5719 - regression_loss: 1.3075 - classification_loss: 0.2644 291/500 [================>.............] - ETA: 52s - loss: 1.5711 - regression_loss: 1.3070 - classification_loss: 0.2641 292/500 [================>.............] - ETA: 51s - loss: 1.5709 - regression_loss: 1.3070 - classification_loss: 0.2639 293/500 [================>.............] - ETA: 51s - loss: 1.5705 - regression_loss: 1.3070 - classification_loss: 0.2635 294/500 [================>.............] - ETA: 51s - loss: 1.5713 - regression_loss: 1.3077 - classification_loss: 0.2635 295/500 [================>.............] - ETA: 51s - loss: 1.5713 - regression_loss: 1.3078 - classification_loss: 0.2635 296/500 [================>.............] - ETA: 50s - loss: 1.5696 - regression_loss: 1.3065 - classification_loss: 0.2631 297/500 [================>.............] 
- ETA: 50s - loss: 1.5669 - regression_loss: 1.3043 - classification_loss: 0.2626 298/500 [================>.............] - ETA: 50s - loss: 1.5688 - regression_loss: 1.3060 - classification_loss: 0.2628 299/500 [================>.............] - ETA: 50s - loss: 1.5701 - regression_loss: 1.3072 - classification_loss: 0.2629 300/500 [=================>............] - ETA: 49s - loss: 1.5715 - regression_loss: 1.3082 - classification_loss: 0.2634 301/500 [=================>............] - ETA: 49s - loss: 1.5713 - regression_loss: 1.3077 - classification_loss: 0.2636 302/500 [=================>............] - ETA: 49s - loss: 1.5714 - regression_loss: 1.3079 - classification_loss: 0.2634 303/500 [=================>............] - ETA: 49s - loss: 1.5717 - regression_loss: 1.3083 - classification_loss: 0.2634 304/500 [=================>............] - ETA: 48s - loss: 1.5745 - regression_loss: 1.3106 - classification_loss: 0.2639 305/500 [=================>............] - ETA: 48s - loss: 1.5736 - regression_loss: 1.3099 - classification_loss: 0.2637 306/500 [=================>............] - ETA: 48s - loss: 1.5721 - regression_loss: 1.3085 - classification_loss: 0.2635 307/500 [=================>............] - ETA: 48s - loss: 1.5721 - regression_loss: 1.3086 - classification_loss: 0.2635 308/500 [=================>............] - ETA: 47s - loss: 1.5727 - regression_loss: 1.3092 - classification_loss: 0.2635 309/500 [=================>............] - ETA: 47s - loss: 1.5731 - regression_loss: 1.3097 - classification_loss: 0.2635 310/500 [=================>............] - ETA: 47s - loss: 1.5731 - regression_loss: 1.3096 - classification_loss: 0.2635 311/500 [=================>............] - ETA: 47s - loss: 1.5707 - regression_loss: 1.3078 - classification_loss: 0.2629 312/500 [=================>............] - ETA: 46s - loss: 1.5714 - regression_loss: 1.3086 - classification_loss: 0.2628 313/500 [=================>............] 
- ETA: 46s - loss: 1.5711 - regression_loss: 1.3084 - classification_loss: 0.2627 314/500 [=================>............] - ETA: 46s - loss: 1.5708 - regression_loss: 1.3080 - classification_loss: 0.2628 315/500 [=================>............] - ETA: 46s - loss: 1.5676 - regression_loss: 1.3053 - classification_loss: 0.2623 316/500 [=================>............] - ETA: 45s - loss: 1.5675 - regression_loss: 1.3054 - classification_loss: 0.2621 317/500 [==================>...........] - ETA: 45s - loss: 1.5680 - regression_loss: 1.3056 - classification_loss: 0.2624 318/500 [==================>...........] - ETA: 45s - loss: 1.5688 - regression_loss: 1.3064 - classification_loss: 0.2624 319/500 [==================>...........] - ETA: 45s - loss: 1.5686 - regression_loss: 1.3063 - classification_loss: 0.2623 320/500 [==================>...........] - ETA: 44s - loss: 1.5685 - regression_loss: 1.3062 - classification_loss: 0.2624 321/500 [==================>...........] - ETA: 44s - loss: 1.5692 - regression_loss: 1.3068 - classification_loss: 0.2624 322/500 [==================>...........] - ETA: 44s - loss: 1.5708 - regression_loss: 1.3082 - classification_loss: 0.2626 323/500 [==================>...........] - ETA: 44s - loss: 1.5682 - regression_loss: 1.3061 - classification_loss: 0.2621 324/500 [==================>...........] - ETA: 43s - loss: 1.5711 - regression_loss: 1.3087 - classification_loss: 0.2624 325/500 [==================>...........] - ETA: 43s - loss: 1.5730 - regression_loss: 1.3105 - classification_loss: 0.2626 326/500 [==================>...........] - ETA: 43s - loss: 1.5738 - regression_loss: 1.3108 - classification_loss: 0.2629 327/500 [==================>...........] - ETA: 43s - loss: 1.5748 - regression_loss: 1.3118 - classification_loss: 0.2629 328/500 [==================>...........] - ETA: 42s - loss: 1.5730 - regression_loss: 1.3103 - classification_loss: 0.2627 329/500 [==================>...........] 
- ETA: 42s - loss: 1.5722 - regression_loss: 1.3098 - classification_loss: 0.2624 330/500 [==================>...........] - ETA: 42s - loss: 1.5719 - regression_loss: 1.3096 - classification_loss: 0.2623 331/500 [==================>...........] - ETA: 42s - loss: 1.5730 - regression_loss: 1.3104 - classification_loss: 0.2626 332/500 [==================>...........] - ETA: 41s - loss: 1.5755 - regression_loss: 1.3123 - classification_loss: 0.2633 333/500 [==================>...........] - ETA: 41s - loss: 1.5757 - regression_loss: 1.3124 - classification_loss: 0.2632 334/500 [===================>..........] - ETA: 41s - loss: 1.5753 - regression_loss: 1.3122 - classification_loss: 0.2631 335/500 [===================>..........] - ETA: 41s - loss: 1.5753 - regression_loss: 1.3122 - classification_loss: 0.2631 336/500 [===================>..........] - ETA: 40s - loss: 1.5757 - regression_loss: 1.3125 - classification_loss: 0.2632 337/500 [===================>..........] - ETA: 40s - loss: 1.5759 - regression_loss: 1.3127 - classification_loss: 0.2632 338/500 [===================>..........] - ETA: 40s - loss: 1.5756 - regression_loss: 1.3126 - classification_loss: 0.2631 339/500 [===================>..........] - ETA: 40s - loss: 1.5752 - regression_loss: 1.3123 - classification_loss: 0.2630 340/500 [===================>..........] - ETA: 39s - loss: 1.5741 - regression_loss: 1.3114 - classification_loss: 0.2627 341/500 [===================>..........] - ETA: 39s - loss: 1.5722 - regression_loss: 1.3097 - classification_loss: 0.2625 342/500 [===================>..........] - ETA: 39s - loss: 1.5710 - regression_loss: 1.3090 - classification_loss: 0.2620 343/500 [===================>..........] - ETA: 39s - loss: 1.5722 - regression_loss: 1.3099 - classification_loss: 0.2624 344/500 [===================>..........] - ETA: 38s - loss: 1.5728 - regression_loss: 1.3102 - classification_loss: 0.2625 345/500 [===================>..........] 
- ETA: 38s - loss: 1.5740 - regression_loss: 1.3114 - classification_loss: 0.2626 346/500 [===================>..........] - ETA: 38s - loss: 1.5731 - regression_loss: 1.3107 - classification_loss: 0.2624 347/500 [===================>..........] - ETA: 38s - loss: 1.5746 - regression_loss: 1.3118 - classification_loss: 0.2628 348/500 [===================>..........] - ETA: 37s - loss: 1.5733 - regression_loss: 1.3108 - classification_loss: 0.2625 349/500 [===================>..........] - ETA: 37s - loss: 1.5733 - regression_loss: 1.3108 - classification_loss: 0.2625 350/500 [====================>.........] - ETA: 37s - loss: 1.5739 - regression_loss: 1.3113 - classification_loss: 0.2626 351/500 [====================>.........] - ETA: 37s - loss: 1.5743 - regression_loss: 1.3115 - classification_loss: 0.2628 352/500 [====================>.........] - ETA: 36s - loss: 1.5751 - regression_loss: 1.3123 - classification_loss: 0.2629 353/500 [====================>.........] - ETA: 36s - loss: 1.5766 - regression_loss: 1.3134 - classification_loss: 0.2632 354/500 [====================>.........] - ETA: 36s - loss: 1.5770 - regression_loss: 1.3137 - classification_loss: 0.2633 355/500 [====================>.........] - ETA: 36s - loss: 1.5751 - regression_loss: 1.3121 - classification_loss: 0.2629 356/500 [====================>.........] - ETA: 35s - loss: 1.5742 - regression_loss: 1.3114 - classification_loss: 0.2628 357/500 [====================>.........] - ETA: 35s - loss: 1.5732 - regression_loss: 1.3106 - classification_loss: 0.2627 358/500 [====================>.........] - ETA: 35s - loss: 1.5730 - regression_loss: 1.3104 - classification_loss: 0.2626 359/500 [====================>.........] - ETA: 35s - loss: 1.5738 - regression_loss: 1.3110 - classification_loss: 0.2629 360/500 [====================>.........] - ETA: 34s - loss: 1.5715 - regression_loss: 1.3093 - classification_loss: 0.2623 361/500 [====================>.........] 
- ETA: 34s - loss: 1.5699 - regression_loss: 1.3077 - classification_loss: 0.2622 362/500 [====================>.........] - ETA: 34s - loss: 1.5697 - regression_loss: 1.3075 - classification_loss: 0.2622 363/500 [====================>.........] - ETA: 34s - loss: 1.5692 - regression_loss: 1.3069 - classification_loss: 0.2622 364/500 [====================>.........] - ETA: 33s - loss: 1.5680 - regression_loss: 1.3060 - classification_loss: 0.2620 365/500 [====================>.........] - ETA: 33s - loss: 1.5669 - regression_loss: 1.3052 - classification_loss: 0.2617 366/500 [====================>.........] - ETA: 33s - loss: 1.5672 - regression_loss: 1.3055 - classification_loss: 0.2618 367/500 [=====================>........] - ETA: 33s - loss: 1.5669 - regression_loss: 1.3052 - classification_loss: 0.2617 368/500 [=====================>........] - ETA: 32s - loss: 1.5666 - regression_loss: 1.3051 - classification_loss: 0.2615 369/500 [=====================>........] - ETA: 32s - loss: 1.5654 - regression_loss: 1.3041 - classification_loss: 0.2613 370/500 [=====================>........] - ETA: 32s - loss: 1.5652 - regression_loss: 1.3040 - classification_loss: 0.2612 371/500 [=====================>........] - ETA: 32s - loss: 1.5655 - regression_loss: 1.3043 - classification_loss: 0.2612 372/500 [=====================>........] - ETA: 31s - loss: 1.5643 - regression_loss: 1.3033 - classification_loss: 0.2610 373/500 [=====================>........] - ETA: 31s - loss: 1.5642 - regression_loss: 1.3033 - classification_loss: 0.2609 374/500 [=====================>........] - ETA: 31s - loss: 1.5631 - regression_loss: 1.3023 - classification_loss: 0.2608 375/500 [=====================>........] - ETA: 31s - loss: 1.5626 - regression_loss: 1.3020 - classification_loss: 0.2606 376/500 [=====================>........] - ETA: 30s - loss: 1.5612 - regression_loss: 1.3010 - classification_loss: 0.2601 377/500 [=====================>........] 
- ETA: 30s - loss: 1.5595 - regression_loss: 1.2997 - classification_loss: 0.2598 378/500 [=====================>........] - ETA: 30s - loss: 1.5607 - regression_loss: 1.3006 - classification_loss: 0.2601 379/500 [=====================>........] - ETA: 30s - loss: 1.5616 - regression_loss: 1.3015 - classification_loss: 0.2602 380/500 [=====================>........] - ETA: 29s - loss: 1.5596 - regression_loss: 1.2997 - classification_loss: 0.2599 381/500 [=====================>........] - ETA: 29s - loss: 1.5591 - regression_loss: 1.2995 - classification_loss: 0.2596 382/500 [=====================>........] - ETA: 29s - loss: 1.5574 - regression_loss: 1.2981 - classification_loss: 0.2593 383/500 [=====================>........] - ETA: 29s - loss: 1.5574 - regression_loss: 1.2981 - classification_loss: 0.2593 384/500 [======================>.......] - ETA: 28s - loss: 1.5582 - regression_loss: 1.2989 - classification_loss: 0.2593 385/500 [======================>.......] - ETA: 28s - loss: 1.5579 - regression_loss: 1.2988 - classification_loss: 0.2591 386/500 [======================>.......] - ETA: 28s - loss: 1.5557 - regression_loss: 1.2969 - classification_loss: 0.2588 387/500 [======================>.......] - ETA: 28s - loss: 1.5567 - regression_loss: 1.2976 - classification_loss: 0.2590 388/500 [======================>.......] - ETA: 27s - loss: 1.5568 - regression_loss: 1.2977 - classification_loss: 0.2591 389/500 [======================>.......] - ETA: 27s - loss: 1.5550 - regression_loss: 1.2962 - classification_loss: 0.2587 390/500 [======================>.......] - ETA: 27s - loss: 1.5569 - regression_loss: 1.2981 - classification_loss: 0.2588 391/500 [======================>.......] - ETA: 27s - loss: 1.5600 - regression_loss: 1.3006 - classification_loss: 0.2594 392/500 [======================>.......] - ETA: 26s - loss: 1.5604 - regression_loss: 1.3010 - classification_loss: 0.2594 393/500 [======================>.......] 
[per-batch progress updates for epoch 89, batches 394-499, condensed; loss held near 1.56 throughout]
500/500 [==============================] - 125s 250ms/step - loss: 1.5593 - regression_loss: 1.3008 - classification_loss: 0.2585
1172 instances of class plum with average precision: 0.6624
mAP: 0.6624
Epoch 00089: saving model to ./training/snapshots/resnet50_pascal_89.h5
Epoch 90/150
[per-batch progress updates for epoch 90, batches 1-227, condensed; loss settling from ~1.80 to ~1.56]
228/500 [============>.................] 
- ETA: 1:07 - loss: 1.5626 - regression_loss: 1.3059 - classification_loss: 0.2567 229/500 [============>.................] - ETA: 1:07 - loss: 1.5615 - regression_loss: 1.3051 - classification_loss: 0.2564 230/500 [============>.................] - ETA: 1:07 - loss: 1.5573 - regression_loss: 1.3016 - classification_loss: 0.2557 231/500 [============>.................] - ETA: 1:06 - loss: 1.5575 - regression_loss: 1.3019 - classification_loss: 0.2555 232/500 [============>.................] - ETA: 1:06 - loss: 1.5580 - regression_loss: 1.3024 - classification_loss: 0.2556 233/500 [============>.................] - ETA: 1:06 - loss: 1.5591 - regression_loss: 1.3029 - classification_loss: 0.2562 234/500 [=============>................] - ETA: 1:06 - loss: 1.5608 - regression_loss: 1.3040 - classification_loss: 0.2568 235/500 [=============>................] - ETA: 1:05 - loss: 1.5614 - regression_loss: 1.3048 - classification_loss: 0.2566 236/500 [=============>................] - ETA: 1:05 - loss: 1.5610 - regression_loss: 1.3047 - classification_loss: 0.2563 237/500 [=============>................] - ETA: 1:05 - loss: 1.5633 - regression_loss: 1.3068 - classification_loss: 0.2565 238/500 [=============>................] - ETA: 1:05 - loss: 1.5626 - regression_loss: 1.3064 - classification_loss: 0.2562 239/500 [=============>................] - ETA: 1:04 - loss: 1.5632 - regression_loss: 1.3065 - classification_loss: 0.2567 240/500 [=============>................] - ETA: 1:04 - loss: 1.5644 - regression_loss: 1.3071 - classification_loss: 0.2573 241/500 [=============>................] - ETA: 1:04 - loss: 1.5655 - regression_loss: 1.3078 - classification_loss: 0.2577 242/500 [=============>................] - ETA: 1:04 - loss: 1.5661 - regression_loss: 1.3085 - classification_loss: 0.2576 243/500 [=============>................] - ETA: 1:03 - loss: 1.5624 - regression_loss: 1.3056 - classification_loss: 0.2568 244/500 [=============>................] 
- ETA: 1:03 - loss: 1.5599 - regression_loss: 1.3037 - classification_loss: 0.2562 245/500 [=============>................] - ETA: 1:03 - loss: 1.5593 - regression_loss: 1.3033 - classification_loss: 0.2560 246/500 [=============>................] - ETA: 1:03 - loss: 1.5605 - regression_loss: 1.3043 - classification_loss: 0.2562 247/500 [=============>................] - ETA: 1:02 - loss: 1.5615 - regression_loss: 1.3049 - classification_loss: 0.2567 248/500 [=============>................] - ETA: 1:02 - loss: 1.5604 - regression_loss: 1.3039 - classification_loss: 0.2565 249/500 [=============>................] - ETA: 1:02 - loss: 1.5572 - regression_loss: 1.3014 - classification_loss: 0.2558 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5569 - regression_loss: 1.3017 - classification_loss: 0.2552 251/500 [==============>...............] - ETA: 1:01 - loss: 1.5581 - regression_loss: 1.3028 - classification_loss: 0.2553 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5616 - regression_loss: 1.3058 - classification_loss: 0.2558 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5627 - regression_loss: 1.3068 - classification_loss: 0.2559 254/500 [==============>...............] - ETA: 1:01 - loss: 1.5617 - regression_loss: 1.3061 - classification_loss: 0.2557 255/500 [==============>...............] - ETA: 1:00 - loss: 1.5592 - regression_loss: 1.3039 - classification_loss: 0.2553 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5622 - regression_loss: 1.3059 - classification_loss: 0.2562 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5628 - regression_loss: 1.3063 - classification_loss: 0.2565 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5632 - regression_loss: 1.3067 - classification_loss: 0.2565 259/500 [==============>...............] - ETA: 59s - loss: 1.5641 - regression_loss: 1.3073 - classification_loss: 0.2567  260/500 [==============>...............] 
- ETA: 59s - loss: 1.5645 - regression_loss: 1.3074 - classification_loss: 0.2571 261/500 [==============>...............] - ETA: 59s - loss: 1.5641 - regression_loss: 1.3072 - classification_loss: 0.2569 262/500 [==============>...............] - ETA: 59s - loss: 1.5628 - regression_loss: 1.3061 - classification_loss: 0.2567 263/500 [==============>...............] - ETA: 58s - loss: 1.5602 - regression_loss: 1.3039 - classification_loss: 0.2563 264/500 [==============>...............] - ETA: 58s - loss: 1.5616 - regression_loss: 1.3050 - classification_loss: 0.2566 265/500 [==============>...............] - ETA: 58s - loss: 1.5619 - regression_loss: 1.3052 - classification_loss: 0.2567 266/500 [==============>...............] - ETA: 58s - loss: 1.5628 - regression_loss: 1.3060 - classification_loss: 0.2568 267/500 [===============>..............] - ETA: 57s - loss: 1.5645 - regression_loss: 1.3073 - classification_loss: 0.2572 268/500 [===============>..............] - ETA: 57s - loss: 1.5661 - regression_loss: 1.3083 - classification_loss: 0.2577 269/500 [===============>..............] - ETA: 57s - loss: 1.5650 - regression_loss: 1.3074 - classification_loss: 0.2576 270/500 [===============>..............] - ETA: 57s - loss: 1.5664 - regression_loss: 1.3086 - classification_loss: 0.2578 271/500 [===============>..............] - ETA: 56s - loss: 1.5687 - regression_loss: 1.3103 - classification_loss: 0.2585 272/500 [===============>..............] - ETA: 56s - loss: 1.5698 - regression_loss: 1.3109 - classification_loss: 0.2589 273/500 [===============>..............] - ETA: 56s - loss: 1.5692 - regression_loss: 1.3104 - classification_loss: 0.2588 274/500 [===============>..............] - ETA: 56s - loss: 1.5700 - regression_loss: 1.3112 - classification_loss: 0.2588 275/500 [===============>..............] - ETA: 55s - loss: 1.5728 - regression_loss: 1.3133 - classification_loss: 0.2594 276/500 [===============>..............] 
- ETA: 55s - loss: 1.5730 - regression_loss: 1.3137 - classification_loss: 0.2593 277/500 [===============>..............] - ETA: 55s - loss: 1.5707 - regression_loss: 1.3120 - classification_loss: 0.2587 278/500 [===============>..............] - ETA: 55s - loss: 1.5707 - regression_loss: 1.3119 - classification_loss: 0.2588 279/500 [===============>..............] - ETA: 55s - loss: 1.5718 - regression_loss: 1.3126 - classification_loss: 0.2591 280/500 [===============>..............] - ETA: 54s - loss: 1.5734 - regression_loss: 1.3140 - classification_loss: 0.2593 281/500 [===============>..............] - ETA: 54s - loss: 1.5731 - regression_loss: 1.3138 - classification_loss: 0.2593 282/500 [===============>..............] - ETA: 54s - loss: 1.5750 - regression_loss: 1.3156 - classification_loss: 0.2594 283/500 [===============>..............] - ETA: 54s - loss: 1.5750 - regression_loss: 1.3156 - classification_loss: 0.2594 284/500 [================>.............] - ETA: 53s - loss: 1.5751 - regression_loss: 1.3156 - classification_loss: 0.2595 285/500 [================>.............] - ETA: 53s - loss: 1.5766 - regression_loss: 1.3168 - classification_loss: 0.2598 286/500 [================>.............] - ETA: 53s - loss: 1.5799 - regression_loss: 1.3190 - classification_loss: 0.2609 287/500 [================>.............] - ETA: 53s - loss: 1.5820 - regression_loss: 1.3205 - classification_loss: 0.2615 288/500 [================>.............] - ETA: 52s - loss: 1.5793 - regression_loss: 1.3182 - classification_loss: 0.2610 289/500 [================>.............] - ETA: 52s - loss: 1.5761 - regression_loss: 1.3155 - classification_loss: 0.2606 290/500 [================>.............] - ETA: 52s - loss: 1.5755 - regression_loss: 1.3148 - classification_loss: 0.2606 291/500 [================>.............] - ETA: 52s - loss: 1.5735 - regression_loss: 1.3131 - classification_loss: 0.2605 292/500 [================>.............] 
- ETA: 51s - loss: 1.5734 - regression_loss: 1.3127 - classification_loss: 0.2607 293/500 [================>.............] - ETA: 51s - loss: 1.5742 - regression_loss: 1.3135 - classification_loss: 0.2608 294/500 [================>.............] - ETA: 51s - loss: 1.5751 - regression_loss: 1.3142 - classification_loss: 0.2609 295/500 [================>.............] - ETA: 51s - loss: 1.5754 - regression_loss: 1.3145 - classification_loss: 0.2609 296/500 [================>.............] - ETA: 50s - loss: 1.5764 - regression_loss: 1.3153 - classification_loss: 0.2611 297/500 [================>.............] - ETA: 50s - loss: 1.5743 - regression_loss: 1.3137 - classification_loss: 0.2605 298/500 [================>.............] - ETA: 50s - loss: 1.5736 - regression_loss: 1.3132 - classification_loss: 0.2604 299/500 [================>.............] - ETA: 50s - loss: 1.5767 - regression_loss: 1.3157 - classification_loss: 0.2610 300/500 [=================>............] - ETA: 49s - loss: 1.5783 - regression_loss: 1.3172 - classification_loss: 0.2611 301/500 [=================>............] - ETA: 49s - loss: 1.5752 - regression_loss: 1.3148 - classification_loss: 0.2604 302/500 [=================>............] - ETA: 49s - loss: 1.5756 - regression_loss: 1.3150 - classification_loss: 0.2606 303/500 [=================>............] - ETA: 49s - loss: 1.5765 - regression_loss: 1.3160 - classification_loss: 0.2605 304/500 [=================>............] - ETA: 48s - loss: 1.5763 - regression_loss: 1.3160 - classification_loss: 0.2603 305/500 [=================>............] - ETA: 48s - loss: 1.5770 - regression_loss: 1.3161 - classification_loss: 0.2609 306/500 [=================>............] - ETA: 48s - loss: 1.5771 - regression_loss: 1.3159 - classification_loss: 0.2612 307/500 [=================>............] - ETA: 48s - loss: 1.5783 - regression_loss: 1.3166 - classification_loss: 0.2617 308/500 [=================>............] 
- ETA: 47s - loss: 1.5779 - regression_loss: 1.3164 - classification_loss: 0.2615 309/500 [=================>............] - ETA: 47s - loss: 1.5772 - regression_loss: 1.3160 - classification_loss: 0.2613 310/500 [=================>............] - ETA: 47s - loss: 1.5735 - regression_loss: 1.3130 - classification_loss: 0.2605 311/500 [=================>............] - ETA: 47s - loss: 1.5735 - regression_loss: 1.3130 - classification_loss: 0.2605 312/500 [=================>............] - ETA: 46s - loss: 1.5729 - regression_loss: 1.3124 - classification_loss: 0.2605 313/500 [=================>............] - ETA: 46s - loss: 1.5697 - regression_loss: 1.3096 - classification_loss: 0.2601 314/500 [=================>............] - ETA: 46s - loss: 1.5702 - regression_loss: 1.3101 - classification_loss: 0.2602 315/500 [=================>............] - ETA: 46s - loss: 1.5717 - regression_loss: 1.3112 - classification_loss: 0.2605 316/500 [=================>............] - ETA: 45s - loss: 1.5684 - regression_loss: 1.3084 - classification_loss: 0.2600 317/500 [==================>...........] - ETA: 45s - loss: 1.5679 - regression_loss: 1.3080 - classification_loss: 0.2598 318/500 [==================>...........] - ETA: 45s - loss: 1.5680 - regression_loss: 1.3083 - classification_loss: 0.2597 319/500 [==================>...........] - ETA: 45s - loss: 1.5670 - regression_loss: 1.3074 - classification_loss: 0.2596 320/500 [==================>...........] - ETA: 44s - loss: 1.5670 - regression_loss: 1.3074 - classification_loss: 0.2596 321/500 [==================>...........] - ETA: 44s - loss: 1.5678 - regression_loss: 1.3079 - classification_loss: 0.2599 322/500 [==================>...........] - ETA: 44s - loss: 1.5674 - regression_loss: 1.3076 - classification_loss: 0.2598 323/500 [==================>...........] - ETA: 44s - loss: 1.5678 - regression_loss: 1.3079 - classification_loss: 0.2598 324/500 [==================>...........] 
- ETA: 43s - loss: 1.5670 - regression_loss: 1.3072 - classification_loss: 0.2598 325/500 [==================>...........] - ETA: 43s - loss: 1.5664 - regression_loss: 1.3069 - classification_loss: 0.2595 326/500 [==================>...........] - ETA: 43s - loss: 1.5660 - regression_loss: 1.3067 - classification_loss: 0.2593 327/500 [==================>...........] - ETA: 43s - loss: 1.5669 - regression_loss: 1.3075 - classification_loss: 0.2594 328/500 [==================>...........] - ETA: 42s - loss: 1.5673 - regression_loss: 1.3077 - classification_loss: 0.2596 329/500 [==================>...........] - ETA: 42s - loss: 1.5684 - regression_loss: 1.3087 - classification_loss: 0.2597 330/500 [==================>...........] - ETA: 42s - loss: 1.5687 - regression_loss: 1.3089 - classification_loss: 0.2598 331/500 [==================>...........] - ETA: 42s - loss: 1.5690 - regression_loss: 1.3090 - classification_loss: 0.2600 332/500 [==================>...........] - ETA: 41s - loss: 1.5668 - regression_loss: 1.3073 - classification_loss: 0.2595 333/500 [==================>...........] - ETA: 41s - loss: 1.5650 - regression_loss: 1.3059 - classification_loss: 0.2591 334/500 [===================>..........] - ETA: 41s - loss: 1.5655 - regression_loss: 1.3062 - classification_loss: 0.2593 335/500 [===================>..........] - ETA: 41s - loss: 1.5658 - regression_loss: 1.3064 - classification_loss: 0.2593 336/500 [===================>..........] - ETA: 40s - loss: 1.5642 - regression_loss: 1.3050 - classification_loss: 0.2592 337/500 [===================>..........] - ETA: 40s - loss: 1.5655 - regression_loss: 1.3060 - classification_loss: 0.2595 338/500 [===================>..........] - ETA: 40s - loss: 1.5665 - regression_loss: 1.3068 - classification_loss: 0.2597 339/500 [===================>..........] - ETA: 40s - loss: 1.5673 - regression_loss: 1.3075 - classification_loss: 0.2598 340/500 [===================>..........] 
- ETA: 39s - loss: 1.5679 - regression_loss: 1.3082 - classification_loss: 0.2597 341/500 [===================>..........] - ETA: 39s - loss: 1.5653 - regression_loss: 1.3061 - classification_loss: 0.2592 342/500 [===================>..........] - ETA: 39s - loss: 1.5656 - regression_loss: 1.3065 - classification_loss: 0.2591 343/500 [===================>..........] - ETA: 39s - loss: 1.5637 - regression_loss: 1.3051 - classification_loss: 0.2587 344/500 [===================>..........] - ETA: 38s - loss: 1.5642 - regression_loss: 1.3052 - classification_loss: 0.2590 345/500 [===================>..........] - ETA: 38s - loss: 1.5648 - regression_loss: 1.3056 - classification_loss: 0.2592 346/500 [===================>..........] - ETA: 38s - loss: 1.5648 - regression_loss: 1.3055 - classification_loss: 0.2593 347/500 [===================>..........] - ETA: 38s - loss: 1.5654 - regression_loss: 1.3060 - classification_loss: 0.2594 348/500 [===================>..........] - ETA: 37s - loss: 1.5668 - regression_loss: 1.3075 - classification_loss: 0.2593 349/500 [===================>..........] - ETA: 37s - loss: 1.5670 - regression_loss: 1.3077 - classification_loss: 0.2593 350/500 [====================>.........] - ETA: 37s - loss: 1.5677 - regression_loss: 1.3082 - classification_loss: 0.2595 351/500 [====================>.........] - ETA: 37s - loss: 1.5678 - regression_loss: 1.3084 - classification_loss: 0.2594 352/500 [====================>.........] - ETA: 36s - loss: 1.5691 - regression_loss: 1.3096 - classification_loss: 0.2595 353/500 [====================>.........] - ETA: 36s - loss: 1.5694 - regression_loss: 1.3099 - classification_loss: 0.2596 354/500 [====================>.........] - ETA: 36s - loss: 1.5704 - regression_loss: 1.3104 - classification_loss: 0.2600 355/500 [====================>.........] - ETA: 36s - loss: 1.5698 - regression_loss: 1.3098 - classification_loss: 0.2600 356/500 [====================>.........] 
- ETA: 35s - loss: 1.5681 - regression_loss: 1.3084 - classification_loss: 0.2596 357/500 [====================>.........] - ETA: 35s - loss: 1.5670 - regression_loss: 1.3076 - classification_loss: 0.2593 358/500 [====================>.........] - ETA: 35s - loss: 1.5671 - regression_loss: 1.3078 - classification_loss: 0.2594 359/500 [====================>.........] - ETA: 35s - loss: 1.5662 - regression_loss: 1.3067 - classification_loss: 0.2594 360/500 [====================>.........] - ETA: 34s - loss: 1.5683 - regression_loss: 1.3085 - classification_loss: 0.2598 361/500 [====================>.........] - ETA: 34s - loss: 1.5695 - regression_loss: 1.3094 - classification_loss: 0.2600 362/500 [====================>.........] - ETA: 34s - loss: 1.5698 - regression_loss: 1.3097 - classification_loss: 0.2602 363/500 [====================>.........] - ETA: 34s - loss: 1.5699 - regression_loss: 1.3097 - classification_loss: 0.2602 364/500 [====================>.........] - ETA: 33s - loss: 1.5696 - regression_loss: 1.3094 - classification_loss: 0.2602 365/500 [====================>.........] - ETA: 33s - loss: 1.5708 - regression_loss: 1.3103 - classification_loss: 0.2605 366/500 [====================>.........] - ETA: 33s - loss: 1.5705 - regression_loss: 1.3102 - classification_loss: 0.2603 367/500 [=====================>........] - ETA: 33s - loss: 1.5680 - regression_loss: 1.3081 - classification_loss: 0.2599 368/500 [=====================>........] - ETA: 32s - loss: 1.5689 - regression_loss: 1.3087 - classification_loss: 0.2603 369/500 [=====================>........] - ETA: 32s - loss: 1.5700 - regression_loss: 1.3095 - classification_loss: 0.2605 370/500 [=====================>........] - ETA: 32s - loss: 1.5679 - regression_loss: 1.3077 - classification_loss: 0.2601 371/500 [=====================>........] - ETA: 32s - loss: 1.5674 - regression_loss: 1.3074 - classification_loss: 0.2600 372/500 [=====================>........] 
- ETA: 31s - loss: 1.5676 - regression_loss: 1.3076 - classification_loss: 0.2600 373/500 [=====================>........] - ETA: 31s - loss: 1.5669 - regression_loss: 1.3072 - classification_loss: 0.2597 374/500 [=====================>........] - ETA: 31s - loss: 1.5661 - regression_loss: 1.3064 - classification_loss: 0.2596 375/500 [=====================>........] - ETA: 31s - loss: 1.5652 - regression_loss: 1.3057 - classification_loss: 0.2595 376/500 [=====================>........] - ETA: 30s - loss: 1.5649 - regression_loss: 1.3054 - classification_loss: 0.2595 377/500 [=====================>........] - ETA: 30s - loss: 1.5660 - regression_loss: 1.3062 - classification_loss: 0.2597 378/500 [=====================>........] - ETA: 30s - loss: 1.5662 - regression_loss: 1.3063 - classification_loss: 0.2599 379/500 [=====================>........] - ETA: 30s - loss: 1.5665 - regression_loss: 1.3066 - classification_loss: 0.2599 380/500 [=====================>........] - ETA: 29s - loss: 1.5680 - regression_loss: 1.3079 - classification_loss: 0.2601 381/500 [=====================>........] - ETA: 29s - loss: 1.5692 - regression_loss: 1.3088 - classification_loss: 0.2604 382/500 [=====================>........] - ETA: 29s - loss: 1.5706 - regression_loss: 1.3102 - classification_loss: 0.2604 383/500 [=====================>........] - ETA: 29s - loss: 1.5717 - regression_loss: 1.3114 - classification_loss: 0.2602 384/500 [======================>.......] - ETA: 28s - loss: 1.5715 - regression_loss: 1.3113 - classification_loss: 0.2602 385/500 [======================>.......] - ETA: 28s - loss: 1.5722 - regression_loss: 1.3121 - classification_loss: 0.2601 386/500 [======================>.......] - ETA: 28s - loss: 1.5723 - regression_loss: 1.3122 - classification_loss: 0.2601 387/500 [======================>.......] - ETA: 28s - loss: 1.5710 - regression_loss: 1.3111 - classification_loss: 0.2598 388/500 [======================>.......] 
- ETA: 27s - loss: 1.5692 - regression_loss: 1.3097 - classification_loss: 0.2595 389/500 [======================>.......] - ETA: 27s - loss: 1.5679 - regression_loss: 1.3087 - classification_loss: 0.2593 390/500 [======================>.......] - ETA: 27s - loss: 1.5658 - regression_loss: 1.3069 - classification_loss: 0.2589 391/500 [======================>.......] - ETA: 27s - loss: 1.5655 - regression_loss: 1.3067 - classification_loss: 0.2588 392/500 [======================>.......] - ETA: 26s - loss: 1.5661 - regression_loss: 1.3071 - classification_loss: 0.2590 393/500 [======================>.......] - ETA: 26s - loss: 1.5667 - regression_loss: 1.3075 - classification_loss: 0.2592 394/500 [======================>.......] - ETA: 26s - loss: 1.5661 - regression_loss: 1.3070 - classification_loss: 0.2591 395/500 [======================>.......] - ETA: 26s - loss: 1.5664 - regression_loss: 1.3073 - classification_loss: 0.2591 396/500 [======================>.......] - ETA: 25s - loss: 1.5663 - regression_loss: 1.3072 - classification_loss: 0.2590 397/500 [======================>.......] - ETA: 25s - loss: 1.5646 - regression_loss: 1.3059 - classification_loss: 0.2587 398/500 [======================>.......] - ETA: 25s - loss: 1.5631 - regression_loss: 1.3046 - classification_loss: 0.2585 399/500 [======================>.......] - ETA: 25s - loss: 1.5624 - regression_loss: 1.3040 - classification_loss: 0.2584 400/500 [=======================>......] - ETA: 24s - loss: 1.5623 - regression_loss: 1.3036 - classification_loss: 0.2587 401/500 [=======================>......] - ETA: 24s - loss: 1.5627 - regression_loss: 1.3039 - classification_loss: 0.2587 402/500 [=======================>......] - ETA: 24s - loss: 1.5631 - regression_loss: 1.3043 - classification_loss: 0.2588 403/500 [=======================>......] - ETA: 24s - loss: 1.5629 - regression_loss: 1.3041 - classification_loss: 0.2588 404/500 [=======================>......] 
- ETA: 23s - loss: 1.5647 - regression_loss: 1.3054 - classification_loss: 0.2592 405/500 [=======================>......] - ETA: 23s - loss: 1.5654 - regression_loss: 1.3059 - classification_loss: 0.2594 406/500 [=======================>......] - ETA: 23s - loss: 1.5687 - regression_loss: 1.3084 - classification_loss: 0.2603 407/500 [=======================>......] - ETA: 23s - loss: 1.5686 - regression_loss: 1.3085 - classification_loss: 0.2601 408/500 [=======================>......] - ETA: 22s - loss: 1.5687 - regression_loss: 1.3085 - classification_loss: 0.2602 409/500 [=======================>......] - ETA: 22s - loss: 1.5697 - regression_loss: 1.3095 - classification_loss: 0.2603 410/500 [=======================>......] - ETA: 22s - loss: 1.5697 - regression_loss: 1.3095 - classification_loss: 0.2601 411/500 [=======================>......] - ETA: 22s - loss: 1.5719 - regression_loss: 1.3110 - classification_loss: 0.2609 412/500 [=======================>......] - ETA: 21s - loss: 1.5717 - regression_loss: 1.3108 - classification_loss: 0.2609 413/500 [=======================>......] - ETA: 21s - loss: 1.5704 - regression_loss: 1.3098 - classification_loss: 0.2606 414/500 [=======================>......] - ETA: 21s - loss: 1.5701 - regression_loss: 1.3094 - classification_loss: 0.2607 415/500 [=======================>......] - ETA: 21s - loss: 1.5695 - regression_loss: 1.3090 - classification_loss: 0.2605 416/500 [=======================>......] - ETA: 20s - loss: 1.5695 - regression_loss: 1.3084 - classification_loss: 0.2611 417/500 [========================>.....] - ETA: 20s - loss: 1.5670 - regression_loss: 1.3063 - classification_loss: 0.2607 418/500 [========================>.....] - ETA: 20s - loss: 1.5653 - regression_loss: 1.3047 - classification_loss: 0.2606 419/500 [========================>.....] - ETA: 20s - loss: 1.5655 - regression_loss: 1.3050 - classification_loss: 0.2606 420/500 [========================>.....] 
- ETA: 19s - loss: 1.5640 - regression_loss: 1.3038 - classification_loss: 0.2603 421/500 [========================>.....] - ETA: 19s - loss: 1.5635 - regression_loss: 1.3033 - classification_loss: 0.2601 422/500 [========================>.....] - ETA: 19s - loss: 1.5640 - regression_loss: 1.3035 - classification_loss: 0.2605 423/500 [========================>.....] - ETA: 19s - loss: 1.5650 - regression_loss: 1.3043 - classification_loss: 0.2607 424/500 [========================>.....] - ETA: 18s - loss: 1.5657 - regression_loss: 1.3050 - classification_loss: 0.2607 425/500 [========================>.....] - ETA: 18s - loss: 1.5655 - regression_loss: 1.3049 - classification_loss: 0.2606 426/500 [========================>.....] - ETA: 18s - loss: 1.5649 - regression_loss: 1.3043 - classification_loss: 0.2606 427/500 [========================>.....] - ETA: 18s - loss: 1.5647 - regression_loss: 1.3041 - classification_loss: 0.2606 428/500 [========================>.....] - ETA: 17s - loss: 1.5657 - regression_loss: 1.3049 - classification_loss: 0.2608 429/500 [========================>.....] - ETA: 17s - loss: 1.5662 - regression_loss: 1.3054 - classification_loss: 0.2608 430/500 [========================>.....] - ETA: 17s - loss: 1.5669 - regression_loss: 1.3057 - classification_loss: 0.2612 431/500 [========================>.....] - ETA: 17s - loss: 1.5677 - regression_loss: 1.3063 - classification_loss: 0.2614 432/500 [========================>.....] - ETA: 16s - loss: 1.5687 - regression_loss: 1.3069 - classification_loss: 0.2618 433/500 [========================>.....] - ETA: 16s - loss: 1.5701 - regression_loss: 1.3079 - classification_loss: 0.2622 434/500 [=========================>....] - ETA: 16s - loss: 1.5700 - regression_loss: 1.3077 - classification_loss: 0.2622 435/500 [=========================>....] - ETA: 16s - loss: 1.5710 - regression_loss: 1.3084 - classification_loss: 0.2626 436/500 [=========================>....] 
- ETA: 15s - loss: 1.5709 - regression_loss: 1.3083 - classification_loss: 0.2625
[per-batch progress updates for steps 437–499 of epoch 90 omitted; loss held between roughly 1.55 and 1.57]
500/500 [==============================] - 125s 250ms/step - loss: 1.5567 - regression_loss: 1.2972 - classification_loss: 0.2595
1172 instances of class plum with average precision: 0.6586
mAP: 0.6586
Epoch 00090: saving model to ./training/snapshots/resnet50_pascal_90.h5
Epoch 91/150
[per-batch progress updates for steps 1–253 of epoch 91 omitted; after settling from the noisy first few steps (1.41–1.88), loss fluctuated between roughly 1.49 and 1.61, ETA counting down from about 2:05 to 1:01]
254/500 [=============>................]
- ETA: 1:01 - loss: 1.5532 - regression_loss: 1.2955 - classification_loss: 0.2578 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5524 - regression_loss: 1.2952 - classification_loss: 0.2571 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5538 - regression_loss: 1.2961 - classification_loss: 0.2577 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5535 - regression_loss: 1.2959 - classification_loss: 0.2576 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5552 - regression_loss: 1.2971 - classification_loss: 0.2581 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5561 - regression_loss: 1.2978 - classification_loss: 0.2583 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5549 - regression_loss: 1.2971 - classification_loss: 0.2578 261/500 [==============>...............] - ETA: 59s - loss: 1.5538 - regression_loss: 1.2962 - classification_loss: 0.2576  262/500 [==============>...............] - ETA: 59s - loss: 1.5533 - regression_loss: 1.2961 - classification_loss: 0.2572 263/500 [==============>...............] - ETA: 59s - loss: 1.5535 - regression_loss: 1.2962 - classification_loss: 0.2573 264/500 [==============>...............] - ETA: 59s - loss: 1.5553 - regression_loss: 1.2977 - classification_loss: 0.2576 265/500 [==============>...............] - ETA: 58s - loss: 1.5563 - regression_loss: 1.2984 - classification_loss: 0.2579 266/500 [==============>...............] - ETA: 58s - loss: 1.5568 - regression_loss: 1.2989 - classification_loss: 0.2578 267/500 [===============>..............] - ETA: 58s - loss: 1.5549 - regression_loss: 1.2976 - classification_loss: 0.2573 268/500 [===============>..............] - ETA: 58s - loss: 1.5527 - regression_loss: 1.2953 - classification_loss: 0.2574 269/500 [===============>..............] - ETA: 57s - loss: 1.5537 - regression_loss: 1.2962 - classification_loss: 0.2576 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5544 - regression_loss: 1.2962 - classification_loss: 0.2582 271/500 [===============>..............] - ETA: 57s - loss: 1.5545 - regression_loss: 1.2960 - classification_loss: 0.2584 272/500 [===============>..............] - ETA: 57s - loss: 1.5542 - regression_loss: 1.2958 - classification_loss: 0.2584 273/500 [===============>..............] - ETA: 56s - loss: 1.5531 - regression_loss: 1.2945 - classification_loss: 0.2586 274/500 [===============>..............] - ETA: 56s - loss: 1.5544 - regression_loss: 1.2956 - classification_loss: 0.2588 275/500 [===============>..............] - ETA: 56s - loss: 1.5545 - regression_loss: 1.2956 - classification_loss: 0.2589 276/500 [===============>..............] - ETA: 56s - loss: 1.5558 - regression_loss: 1.2964 - classification_loss: 0.2595 277/500 [===============>..............] - ETA: 55s - loss: 1.5554 - regression_loss: 1.2961 - classification_loss: 0.2593 278/500 [===============>..............] - ETA: 55s - loss: 1.5573 - regression_loss: 1.2973 - classification_loss: 0.2600 279/500 [===============>..............] - ETA: 55s - loss: 1.5589 - regression_loss: 1.2987 - classification_loss: 0.2602 280/500 [===============>..............] - ETA: 55s - loss: 1.5563 - regression_loss: 1.2967 - classification_loss: 0.2595 281/500 [===============>..............] - ETA: 54s - loss: 1.5548 - regression_loss: 1.2956 - classification_loss: 0.2593 282/500 [===============>..............] - ETA: 54s - loss: 1.5566 - regression_loss: 1.2966 - classification_loss: 0.2600 283/500 [===============>..............] - ETA: 54s - loss: 1.5603 - regression_loss: 1.2996 - classification_loss: 0.2607 284/500 [================>.............] - ETA: 54s - loss: 1.5608 - regression_loss: 1.3002 - classification_loss: 0.2607 285/500 [================>.............] - ETA: 53s - loss: 1.5632 - regression_loss: 1.3022 - classification_loss: 0.2610 286/500 [================>.............] 
- ETA: 53s - loss: 1.5613 - regression_loss: 1.3009 - classification_loss: 0.2604 287/500 [================>.............] - ETA: 53s - loss: 1.5599 - regression_loss: 1.2998 - classification_loss: 0.2601 288/500 [================>.............] - ETA: 53s - loss: 1.5605 - regression_loss: 1.3004 - classification_loss: 0.2601 289/500 [================>.............] - ETA: 52s - loss: 1.5595 - regression_loss: 1.2997 - classification_loss: 0.2599 290/500 [================>.............] - ETA: 52s - loss: 1.5592 - regression_loss: 1.2990 - classification_loss: 0.2602 291/500 [================>.............] - ETA: 52s - loss: 1.5590 - regression_loss: 1.2989 - classification_loss: 0.2601 292/500 [================>.............] - ETA: 52s - loss: 1.5594 - regression_loss: 1.2993 - classification_loss: 0.2602 293/500 [================>.............] - ETA: 51s - loss: 1.5589 - regression_loss: 1.2990 - classification_loss: 0.2599 294/500 [================>.............] - ETA: 51s - loss: 1.5597 - regression_loss: 1.2998 - classification_loss: 0.2599 295/500 [================>.............] - ETA: 51s - loss: 1.5598 - regression_loss: 1.2999 - classification_loss: 0.2598 296/500 [================>.............] - ETA: 51s - loss: 1.5617 - regression_loss: 1.3015 - classification_loss: 0.2602 297/500 [================>.............] - ETA: 50s - loss: 1.5629 - regression_loss: 1.3025 - classification_loss: 0.2604 298/500 [================>.............] - ETA: 50s - loss: 1.5606 - regression_loss: 1.3007 - classification_loss: 0.2600 299/500 [================>.............] - ETA: 50s - loss: 1.5605 - regression_loss: 1.3007 - classification_loss: 0.2598 300/500 [=================>............] - ETA: 50s - loss: 1.5641 - regression_loss: 1.3037 - classification_loss: 0.2604 301/500 [=================>............] - ETA: 49s - loss: 1.5662 - regression_loss: 1.3053 - classification_loss: 0.2609 302/500 [=================>............] 
- ETA: 49s - loss: 1.5642 - regression_loss: 1.3036 - classification_loss: 0.2606 303/500 [=================>............] - ETA: 49s - loss: 1.5649 - regression_loss: 1.3042 - classification_loss: 0.2607 304/500 [=================>............] - ETA: 49s - loss: 1.5656 - regression_loss: 1.3048 - classification_loss: 0.2607 305/500 [=================>............] - ETA: 48s - loss: 1.5672 - regression_loss: 1.3061 - classification_loss: 0.2611 306/500 [=================>............] - ETA: 48s - loss: 1.5680 - regression_loss: 1.3066 - classification_loss: 0.2614 307/500 [=================>............] - ETA: 48s - loss: 1.5639 - regression_loss: 1.3031 - classification_loss: 0.2608 308/500 [=================>............] - ETA: 48s - loss: 1.5606 - regression_loss: 1.3004 - classification_loss: 0.2602 309/500 [=================>............] - ETA: 47s - loss: 1.5602 - regression_loss: 1.3002 - classification_loss: 0.2601 310/500 [=================>............] - ETA: 47s - loss: 1.5608 - regression_loss: 1.3007 - classification_loss: 0.2601 311/500 [=================>............] - ETA: 47s - loss: 1.5624 - regression_loss: 1.3017 - classification_loss: 0.2607 312/500 [=================>............] - ETA: 47s - loss: 1.5626 - regression_loss: 1.3020 - classification_loss: 0.2606 313/500 [=================>............] - ETA: 46s - loss: 1.5629 - regression_loss: 1.3023 - classification_loss: 0.2605 314/500 [=================>............] - ETA: 46s - loss: 1.5635 - regression_loss: 1.3027 - classification_loss: 0.2608 315/500 [=================>............] - ETA: 46s - loss: 1.5640 - regression_loss: 1.3033 - classification_loss: 0.2608 316/500 [=================>............] - ETA: 46s - loss: 1.5646 - regression_loss: 1.3038 - classification_loss: 0.2608 317/500 [==================>...........] - ETA: 45s - loss: 1.5658 - regression_loss: 1.3047 - classification_loss: 0.2611 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5639 - regression_loss: 1.3034 - classification_loss: 0.2605 319/500 [==================>...........] - ETA: 45s - loss: 1.5627 - regression_loss: 1.3023 - classification_loss: 0.2603 320/500 [==================>...........] - ETA: 45s - loss: 1.5642 - regression_loss: 1.3036 - classification_loss: 0.2606 321/500 [==================>...........] - ETA: 44s - loss: 1.5645 - regression_loss: 1.3038 - classification_loss: 0.2607 322/500 [==================>...........] - ETA: 44s - loss: 1.5624 - regression_loss: 1.3018 - classification_loss: 0.2606 323/500 [==================>...........] - ETA: 44s - loss: 1.5619 - regression_loss: 1.3013 - classification_loss: 0.2606 324/500 [==================>...........] - ETA: 44s - loss: 1.5620 - regression_loss: 1.3015 - classification_loss: 0.2605 325/500 [==================>...........] - ETA: 43s - loss: 1.5622 - regression_loss: 1.3016 - classification_loss: 0.2606 326/500 [==================>...........] - ETA: 43s - loss: 1.5639 - regression_loss: 1.3029 - classification_loss: 0.2610 327/500 [==================>...........] - ETA: 43s - loss: 1.5658 - regression_loss: 1.3047 - classification_loss: 0.2611 328/500 [==================>...........] - ETA: 43s - loss: 1.5654 - regression_loss: 1.3040 - classification_loss: 0.2614 329/500 [==================>...........] - ETA: 42s - loss: 1.5661 - regression_loss: 1.3045 - classification_loss: 0.2617 330/500 [==================>...........] - ETA: 42s - loss: 1.5654 - regression_loss: 1.3040 - classification_loss: 0.2614 331/500 [==================>...........] - ETA: 42s - loss: 1.5666 - regression_loss: 1.3049 - classification_loss: 0.2618 332/500 [==================>...........] - ETA: 42s - loss: 1.5677 - regression_loss: 1.3060 - classification_loss: 0.2617 333/500 [==================>...........] - ETA: 41s - loss: 1.5660 - regression_loss: 1.3045 - classification_loss: 0.2615 334/500 [===================>..........] 
- ETA: 41s - loss: 1.5655 - regression_loss: 1.3041 - classification_loss: 0.2614 335/500 [===================>..........] - ETA: 41s - loss: 1.5662 - regression_loss: 1.3047 - classification_loss: 0.2615 336/500 [===================>..........] - ETA: 41s - loss: 1.5671 - regression_loss: 1.3049 - classification_loss: 0.2621 337/500 [===================>..........] - ETA: 40s - loss: 1.5650 - regression_loss: 1.3031 - classification_loss: 0.2619 338/500 [===================>..........] - ETA: 40s - loss: 1.5639 - regression_loss: 1.3024 - classification_loss: 0.2615 339/500 [===================>..........] - ETA: 40s - loss: 1.5631 - regression_loss: 1.3018 - classification_loss: 0.2613 340/500 [===================>..........] - ETA: 40s - loss: 1.5641 - regression_loss: 1.3024 - classification_loss: 0.2617 341/500 [===================>..........] - ETA: 39s - loss: 1.5636 - regression_loss: 1.3021 - classification_loss: 0.2615 342/500 [===================>..........] - ETA: 39s - loss: 1.5639 - regression_loss: 1.3024 - classification_loss: 0.2616 343/500 [===================>..........] - ETA: 39s - loss: 1.5635 - regression_loss: 1.3022 - classification_loss: 0.2612 344/500 [===================>..........] - ETA: 38s - loss: 1.5645 - regression_loss: 1.3031 - classification_loss: 0.2613 345/500 [===================>..........] - ETA: 38s - loss: 1.5664 - regression_loss: 1.3046 - classification_loss: 0.2618 346/500 [===================>..........] - ETA: 38s - loss: 1.5674 - regression_loss: 1.3055 - classification_loss: 0.2619 347/500 [===================>..........] - ETA: 38s - loss: 1.5679 - regression_loss: 1.3062 - classification_loss: 0.2618 348/500 [===================>..........] - ETA: 37s - loss: 1.5681 - regression_loss: 1.3064 - classification_loss: 0.2617 349/500 [===================>..........] - ETA: 37s - loss: 1.5692 - regression_loss: 1.3072 - classification_loss: 0.2620 350/500 [====================>.........] 
- ETA: 37s - loss: 1.5675 - regression_loss: 1.3056 - classification_loss: 0.2619 351/500 [====================>.........] - ETA: 37s - loss: 1.5668 - regression_loss: 1.3051 - classification_loss: 0.2617 352/500 [====================>.........] - ETA: 36s - loss: 1.5667 - regression_loss: 1.3050 - classification_loss: 0.2617 353/500 [====================>.........] - ETA: 36s - loss: 1.5647 - regression_loss: 1.3033 - classification_loss: 0.2613 354/500 [====================>.........] - ETA: 36s - loss: 1.5647 - regression_loss: 1.3034 - classification_loss: 0.2613 355/500 [====================>.........] - ETA: 36s - loss: 1.5644 - regression_loss: 1.3032 - classification_loss: 0.2612 356/500 [====================>.........] - ETA: 35s - loss: 1.5626 - regression_loss: 1.3018 - classification_loss: 0.2608 357/500 [====================>.........] - ETA: 35s - loss: 1.5629 - regression_loss: 1.3021 - classification_loss: 0.2608 358/500 [====================>.........] - ETA: 35s - loss: 1.5667 - regression_loss: 1.3053 - classification_loss: 0.2613 359/500 [====================>.........] - ETA: 35s - loss: 1.5664 - regression_loss: 1.3050 - classification_loss: 0.2614 360/500 [====================>.........] - ETA: 34s - loss: 1.5664 - regression_loss: 1.3049 - classification_loss: 0.2615 361/500 [====================>.........] - ETA: 34s - loss: 1.5678 - regression_loss: 1.3058 - classification_loss: 0.2619 362/500 [====================>.........] - ETA: 34s - loss: 1.5685 - regression_loss: 1.3064 - classification_loss: 0.2621 363/500 [====================>.........] - ETA: 34s - loss: 1.5685 - regression_loss: 1.3063 - classification_loss: 0.2622 364/500 [====================>.........] - ETA: 33s - loss: 1.5693 - regression_loss: 1.3068 - classification_loss: 0.2624 365/500 [====================>.........] - ETA: 33s - loss: 1.5669 - regression_loss: 1.3049 - classification_loss: 0.2620 366/500 [====================>.........] 
- ETA: 33s - loss: 1.5684 - regression_loss: 1.3061 - classification_loss: 0.2622 367/500 [=====================>........] - ETA: 33s - loss: 1.5692 - regression_loss: 1.3070 - classification_loss: 0.2622 368/500 [=====================>........] - ETA: 32s - loss: 1.5701 - regression_loss: 1.3077 - classification_loss: 0.2624 369/500 [=====================>........] - ETA: 32s - loss: 1.5718 - regression_loss: 1.3090 - classification_loss: 0.2628 370/500 [=====================>........] - ETA: 32s - loss: 1.5735 - regression_loss: 1.3100 - classification_loss: 0.2635 371/500 [=====================>........] - ETA: 32s - loss: 1.5713 - regression_loss: 1.3083 - classification_loss: 0.2629 372/500 [=====================>........] - ETA: 31s - loss: 1.5726 - regression_loss: 1.3095 - classification_loss: 0.2631 373/500 [=====================>........] - ETA: 31s - loss: 1.5740 - regression_loss: 1.3105 - classification_loss: 0.2634 374/500 [=====================>........] - ETA: 31s - loss: 1.5726 - regression_loss: 1.3092 - classification_loss: 0.2634 375/500 [=====================>........] - ETA: 31s - loss: 1.5728 - regression_loss: 1.3095 - classification_loss: 0.2633 376/500 [=====================>........] - ETA: 30s - loss: 1.5731 - regression_loss: 1.3098 - classification_loss: 0.2633 377/500 [=====================>........] - ETA: 30s - loss: 1.5745 - regression_loss: 1.3109 - classification_loss: 0.2635 378/500 [=====================>........] - ETA: 30s - loss: 1.5730 - regression_loss: 1.3098 - classification_loss: 0.2632 379/500 [=====================>........] - ETA: 30s - loss: 1.5731 - regression_loss: 1.3098 - classification_loss: 0.2632 380/500 [=====================>........] - ETA: 29s - loss: 1.5715 - regression_loss: 1.3084 - classification_loss: 0.2632 381/500 [=====================>........] - ETA: 29s - loss: 1.5700 - regression_loss: 1.3071 - classification_loss: 0.2629 382/500 [=====================>........] 
- ETA: 29s - loss: 1.5712 - regression_loss: 1.3081 - classification_loss: 0.2631 383/500 [=====================>........] - ETA: 29s - loss: 1.5714 - regression_loss: 1.3083 - classification_loss: 0.2631 384/500 [======================>.......] - ETA: 28s - loss: 1.5722 - regression_loss: 1.3090 - classification_loss: 0.2632 385/500 [======================>.......] - ETA: 28s - loss: 1.5710 - regression_loss: 1.3081 - classification_loss: 0.2629 386/500 [======================>.......] - ETA: 28s - loss: 1.5693 - regression_loss: 1.3069 - classification_loss: 0.2624 387/500 [======================>.......] - ETA: 28s - loss: 1.5704 - regression_loss: 1.3078 - classification_loss: 0.2626 388/500 [======================>.......] - ETA: 27s - loss: 1.5697 - regression_loss: 1.3072 - classification_loss: 0.2625 389/500 [======================>.......] - ETA: 27s - loss: 1.5693 - regression_loss: 1.3068 - classification_loss: 0.2625 390/500 [======================>.......] - ETA: 27s - loss: 1.5706 - regression_loss: 1.3076 - classification_loss: 0.2630 391/500 [======================>.......] - ETA: 27s - loss: 1.5693 - regression_loss: 1.3063 - classification_loss: 0.2629 392/500 [======================>.......] - ETA: 26s - loss: 1.5695 - regression_loss: 1.3066 - classification_loss: 0.2629 393/500 [======================>.......] - ETA: 26s - loss: 1.5695 - regression_loss: 1.3061 - classification_loss: 0.2634 394/500 [======================>.......] - ETA: 26s - loss: 1.5701 - regression_loss: 1.3067 - classification_loss: 0.2634 395/500 [======================>.......] - ETA: 26s - loss: 1.5723 - regression_loss: 1.3086 - classification_loss: 0.2636 396/500 [======================>.......] - ETA: 25s - loss: 1.5721 - regression_loss: 1.3085 - classification_loss: 0.2636 397/500 [======================>.......] - ETA: 25s - loss: 1.5737 - regression_loss: 1.3096 - classification_loss: 0.2640 398/500 [======================>.......] 
- ETA: 25s - loss: 1.5736 - regression_loss: 1.3096 - classification_loss: 0.2640 399/500 [======================>.......] - ETA: 25s - loss: 1.5728 - regression_loss: 1.3090 - classification_loss: 0.2638 400/500 [=======================>......] - ETA: 24s - loss: 1.5707 - regression_loss: 1.3071 - classification_loss: 0.2636 401/500 [=======================>......] - ETA: 24s - loss: 1.5719 - regression_loss: 1.3082 - classification_loss: 0.2638 402/500 [=======================>......] - ETA: 24s - loss: 1.5704 - regression_loss: 1.3069 - classification_loss: 0.2635 403/500 [=======================>......] - ETA: 24s - loss: 1.5709 - regression_loss: 1.3074 - classification_loss: 0.2635 404/500 [=======================>......] - ETA: 23s - loss: 1.5698 - regression_loss: 1.3068 - classification_loss: 0.2630 405/500 [=======================>......] - ETA: 23s - loss: 1.5675 - regression_loss: 1.3050 - classification_loss: 0.2625 406/500 [=======================>......] - ETA: 23s - loss: 1.5677 - regression_loss: 1.3051 - classification_loss: 0.2626 407/500 [=======================>......] - ETA: 23s - loss: 1.5676 - regression_loss: 1.3048 - classification_loss: 0.2627 408/500 [=======================>......] - ETA: 22s - loss: 1.5674 - regression_loss: 1.3049 - classification_loss: 0.2626 409/500 [=======================>......] - ETA: 22s - loss: 1.5660 - regression_loss: 1.3035 - classification_loss: 0.2625 410/500 [=======================>......] - ETA: 22s - loss: 1.5651 - regression_loss: 1.3029 - classification_loss: 0.2623 411/500 [=======================>......] - ETA: 22s - loss: 1.5664 - regression_loss: 1.3037 - classification_loss: 0.2626 412/500 [=======================>......] - ETA: 21s - loss: 1.5677 - regression_loss: 1.3048 - classification_loss: 0.2630 413/500 [=======================>......] - ETA: 21s - loss: 1.5684 - regression_loss: 1.3053 - classification_loss: 0.2631 414/500 [=======================>......] 
- ETA: 21s - loss: 1.5663 - regression_loss: 1.3037 - classification_loss: 0.2626 415/500 [=======================>......] - ETA: 21s - loss: 1.5668 - regression_loss: 1.3040 - classification_loss: 0.2628 416/500 [=======================>......] - ETA: 20s - loss: 1.5667 - regression_loss: 1.3039 - classification_loss: 0.2627 417/500 [========================>.....] - ETA: 20s - loss: 1.5660 - regression_loss: 1.3033 - classification_loss: 0.2626 418/500 [========================>.....] - ETA: 20s - loss: 1.5664 - regression_loss: 1.3034 - classification_loss: 0.2629 419/500 [========================>.....] - ETA: 20s - loss: 1.5655 - regression_loss: 1.3028 - classification_loss: 0.2626 420/500 [========================>.....] - ETA: 19s - loss: 1.5648 - regression_loss: 1.3023 - classification_loss: 0.2625 421/500 [========================>.....] - ETA: 19s - loss: 1.5670 - regression_loss: 1.3041 - classification_loss: 0.2629 422/500 [========================>.....] - ETA: 19s - loss: 1.5682 - regression_loss: 1.3049 - classification_loss: 0.2633 423/500 [========================>.....] - ETA: 19s - loss: 1.5681 - regression_loss: 1.3049 - classification_loss: 0.2632 424/500 [========================>.....] - ETA: 18s - loss: 1.5713 - regression_loss: 1.3072 - classification_loss: 0.2641 425/500 [========================>.....] - ETA: 18s - loss: 1.5723 - regression_loss: 1.3081 - classification_loss: 0.2643 426/500 [========================>.....] - ETA: 18s - loss: 1.5716 - regression_loss: 1.3074 - classification_loss: 0.2642 427/500 [========================>.....] - ETA: 18s - loss: 1.5717 - regression_loss: 1.3076 - classification_loss: 0.2641 428/500 [========================>.....] - ETA: 17s - loss: 1.5718 - regression_loss: 1.3078 - classification_loss: 0.2640 429/500 [========================>.....] - ETA: 17s - loss: 1.5720 - regression_loss: 1.3081 - classification_loss: 0.2638 430/500 [========================>.....] 
- ETA: 17s - loss: 1.5698 - regression_loss: 1.3063 - classification_loss: 0.2634 431/500 [========================>.....] - ETA: 17s - loss: 1.5705 - regression_loss: 1.3068 - classification_loss: 0.2637 432/500 [========================>.....] - ETA: 16s - loss: 1.5713 - regression_loss: 1.3073 - classification_loss: 0.2640 433/500 [========================>.....] - ETA: 16s - loss: 1.5731 - regression_loss: 1.3085 - classification_loss: 0.2646 434/500 [=========================>....] - ETA: 16s - loss: 1.5726 - regression_loss: 1.3082 - classification_loss: 0.2644 435/500 [=========================>....] - ETA: 16s - loss: 1.5706 - regression_loss: 1.3066 - classification_loss: 0.2641 436/500 [=========================>....] - ETA: 15s - loss: 1.5708 - regression_loss: 1.3068 - classification_loss: 0.2640 437/500 [=========================>....] - ETA: 15s - loss: 1.5701 - regression_loss: 1.3062 - classification_loss: 0.2639 438/500 [=========================>....] - ETA: 15s - loss: 1.5705 - regression_loss: 1.3063 - classification_loss: 0.2641 439/500 [=========================>....] - ETA: 15s - loss: 1.5702 - regression_loss: 1.3062 - classification_loss: 0.2640 440/500 [=========================>....] - ETA: 14s - loss: 1.5716 - regression_loss: 1.3074 - classification_loss: 0.2642 441/500 [=========================>....] - ETA: 14s - loss: 1.5723 - regression_loss: 1.3079 - classification_loss: 0.2644 442/500 [=========================>....] - ETA: 14s - loss: 1.5730 - regression_loss: 1.3085 - classification_loss: 0.2645 443/500 [=========================>....] - ETA: 14s - loss: 1.5726 - regression_loss: 1.3082 - classification_loss: 0.2644 444/500 [=========================>....] - ETA: 13s - loss: 1.5715 - regression_loss: 1.3073 - classification_loss: 0.2642 445/500 [=========================>....] - ETA: 13s - loss: 1.5721 - regression_loss: 1.3081 - classification_loss: 0.2640 446/500 [=========================>....] 
- ETA: 13s - loss: 1.5727 - regression_loss: 1.3086 - classification_loss: 0.2641 447/500 [=========================>....] - ETA: 13s - loss: 1.5719 - regression_loss: 1.3079 - classification_loss: 0.2641 448/500 [=========================>....] - ETA: 12s - loss: 1.5714 - regression_loss: 1.3075 - classification_loss: 0.2639 449/500 [=========================>....] - ETA: 12s - loss: 1.5724 - regression_loss: 1.3083 - classification_loss: 0.2642 450/500 [==========================>...] - ETA: 12s - loss: 1.5730 - regression_loss: 1.3087 - classification_loss: 0.2642 451/500 [==========================>...] - ETA: 12s - loss: 1.5727 - regression_loss: 1.3085 - classification_loss: 0.2642 452/500 [==========================>...] - ETA: 11s - loss: 1.5735 - regression_loss: 1.3094 - classification_loss: 0.2641 453/500 [==========================>...] - ETA: 11s - loss: 1.5729 - regression_loss: 1.3087 - classification_loss: 0.2642 454/500 [==========================>...] - ETA: 11s - loss: 1.5741 - regression_loss: 1.3095 - classification_loss: 0.2646 455/500 [==========================>...] - ETA: 11s - loss: 1.5739 - regression_loss: 1.3093 - classification_loss: 0.2645 456/500 [==========================>...] - ETA: 10s - loss: 1.5742 - regression_loss: 1.3096 - classification_loss: 0.2646 457/500 [==========================>...] - ETA: 10s - loss: 1.5741 - regression_loss: 1.3096 - classification_loss: 0.2645 458/500 [==========================>...] - ETA: 10s - loss: 1.5738 - regression_loss: 1.3094 - classification_loss: 0.2644 459/500 [==========================>...] - ETA: 10s - loss: 1.5745 - regression_loss: 1.3100 - classification_loss: 0.2645 460/500 [==========================>...] - ETA: 9s - loss: 1.5737 - regression_loss: 1.3093 - classification_loss: 0.2644  461/500 [==========================>...] - ETA: 9s - loss: 1.5748 - regression_loss: 1.3104 - classification_loss: 0.2645 462/500 [==========================>...] 
- ETA: 9s - loss: 1.5739 - regression_loss: 1.3096 - classification_loss: 0.2643 463/500 [==========================>...] - ETA: 9s - loss: 1.5744 - regression_loss: 1.3102 - classification_loss: 0.2642 464/500 [==========================>...] - ETA: 8s - loss: 1.5748 - regression_loss: 1.3106 - classification_loss: 0.2642 465/500 [==========================>...] - ETA: 8s - loss: 1.5753 - regression_loss: 1.3106 - classification_loss: 0.2647 466/500 [==========================>...] - ETA: 8s - loss: 1.5755 - regression_loss: 1.3108 - classification_loss: 0.2647 467/500 [===========================>..] - ETA: 8s - loss: 1.5764 - regression_loss: 1.3116 - classification_loss: 0.2648 468/500 [===========================>..] - ETA: 7s - loss: 1.5764 - regression_loss: 1.3112 - classification_loss: 0.2652 469/500 [===========================>..] - ETA: 7s - loss: 1.5769 - regression_loss: 1.3118 - classification_loss: 0.2650 470/500 [===========================>..] - ETA: 7s - loss: 1.5770 - regression_loss: 1.3119 - classification_loss: 0.2650 471/500 [===========================>..] - ETA: 7s - loss: 1.5772 - regression_loss: 1.3121 - classification_loss: 0.2651 472/500 [===========================>..] - ETA: 6s - loss: 1.5792 - regression_loss: 1.3141 - classification_loss: 0.2651 473/500 [===========================>..] - ETA: 6s - loss: 1.5793 - regression_loss: 1.3142 - classification_loss: 0.2650 474/500 [===========================>..] - ETA: 6s - loss: 1.5788 - regression_loss: 1.3139 - classification_loss: 0.2649 475/500 [===========================>..] - ETA: 6s - loss: 1.5783 - regression_loss: 1.3135 - classification_loss: 0.2648 476/500 [===========================>..] - ETA: 5s - loss: 1.5786 - regression_loss: 1.3139 - classification_loss: 0.2647 477/500 [===========================>..] - ETA: 5s - loss: 1.5785 - regression_loss: 1.3138 - classification_loss: 0.2647 478/500 [===========================>..] 
- ETA: 5s - loss: 1.5779 - regression_loss: 1.3133 - classification_loss: 0.2646
[per-batch progress updates for steps 479–499 of epoch 91 omitted; the carriage-return progress bar was flattened into repeated lines. Final step below.]
500/500 [==============================] - 125s 250ms/step - loss: 1.5831 - regression_loss: 1.3173 - classification_loss: 0.2658
1172 instances of class plum with average precision: 0.6613
mAP: 0.6613
Epoch 00091: saving model to ./training/snapshots/resnet50_pascal_91.h5
Epoch 92/150
[per-batch progress updates for steps 1–312 omitted; the running loss moved from 1.9706 (regression_loss: 1.6609, classification_loss: 0.3097) at step 1/500 to 1.5292 (regression_loss: 1.2758, classification_loss: 0.2534) at step 312/500, ETA 46s.]
313/500 [=================>............]
- ETA: 46s - loss: 1.5268 - regression_loss: 1.2738 - classification_loss: 0.2530 314/500 [=================>............] - ETA: 46s - loss: 1.5236 - regression_loss: 1.2711 - classification_loss: 0.2525 315/500 [=================>............] - ETA: 46s - loss: 1.5232 - regression_loss: 1.2707 - classification_loss: 0.2525 316/500 [=================>............] - ETA: 45s - loss: 1.5240 - regression_loss: 1.2714 - classification_loss: 0.2526 317/500 [==================>...........] - ETA: 45s - loss: 1.5251 - regression_loss: 1.2721 - classification_loss: 0.2530 318/500 [==================>...........] - ETA: 45s - loss: 1.5237 - regression_loss: 1.2709 - classification_loss: 0.2528 319/500 [==================>...........] - ETA: 45s - loss: 1.5237 - regression_loss: 1.2710 - classification_loss: 0.2527 320/500 [==================>...........] - ETA: 44s - loss: 1.5240 - regression_loss: 1.2714 - classification_loss: 0.2526 321/500 [==================>...........] - ETA: 44s - loss: 1.5236 - regression_loss: 1.2713 - classification_loss: 0.2523 322/500 [==================>...........] - ETA: 44s - loss: 1.5228 - regression_loss: 1.2705 - classification_loss: 0.2523 323/500 [==================>...........] - ETA: 44s - loss: 1.5200 - regression_loss: 1.2682 - classification_loss: 0.2518 324/500 [==================>...........] - ETA: 43s - loss: 1.5200 - regression_loss: 1.2683 - classification_loss: 0.2517 325/500 [==================>...........] - ETA: 43s - loss: 1.5196 - regression_loss: 1.2679 - classification_loss: 0.2517 326/500 [==================>...........] - ETA: 43s - loss: 1.5166 - regression_loss: 1.2655 - classification_loss: 0.2511 327/500 [==================>...........] - ETA: 43s - loss: 1.5167 - regression_loss: 1.2656 - classification_loss: 0.2511 328/500 [==================>...........] - ETA: 42s - loss: 1.5140 - regression_loss: 1.2634 - classification_loss: 0.2506 329/500 [==================>...........] 
- ETA: 42s - loss: 1.5147 - regression_loss: 1.2640 - classification_loss: 0.2507 330/500 [==================>...........] - ETA: 42s - loss: 1.5149 - regression_loss: 1.2643 - classification_loss: 0.2506 331/500 [==================>...........] - ETA: 42s - loss: 1.5154 - regression_loss: 1.2647 - classification_loss: 0.2507 332/500 [==================>...........] - ETA: 41s - loss: 1.5159 - regression_loss: 1.2652 - classification_loss: 0.2507 333/500 [==================>...........] - ETA: 41s - loss: 1.5172 - regression_loss: 1.2664 - classification_loss: 0.2508 334/500 [===================>..........] - ETA: 41s - loss: 1.5156 - regression_loss: 1.2651 - classification_loss: 0.2505 335/500 [===================>..........] - ETA: 41s - loss: 1.5162 - regression_loss: 1.2658 - classification_loss: 0.2504 336/500 [===================>..........] - ETA: 40s - loss: 1.5170 - regression_loss: 1.2664 - classification_loss: 0.2506 337/500 [===================>..........] - ETA: 40s - loss: 1.5165 - regression_loss: 1.2660 - classification_loss: 0.2505 338/500 [===================>..........] - ETA: 40s - loss: 1.5136 - regression_loss: 1.2636 - classification_loss: 0.2500 339/500 [===================>..........] - ETA: 40s - loss: 1.5136 - regression_loss: 1.2641 - classification_loss: 0.2496 340/500 [===================>..........] - ETA: 39s - loss: 1.5134 - regression_loss: 1.2636 - classification_loss: 0.2497 341/500 [===================>..........] - ETA: 39s - loss: 1.5112 - regression_loss: 1.2618 - classification_loss: 0.2494 342/500 [===================>..........] - ETA: 39s - loss: 1.5136 - regression_loss: 1.2635 - classification_loss: 0.2501 343/500 [===================>..........] - ETA: 39s - loss: 1.5139 - regression_loss: 1.2637 - classification_loss: 0.2501 344/500 [===================>..........] - ETA: 38s - loss: 1.5132 - regression_loss: 1.2633 - classification_loss: 0.2500 345/500 [===================>..........] 
- ETA: 38s - loss: 1.5133 - regression_loss: 1.2635 - classification_loss: 0.2498 346/500 [===================>..........] - ETA: 38s - loss: 1.5135 - regression_loss: 1.2637 - classification_loss: 0.2498 347/500 [===================>..........] - ETA: 38s - loss: 1.5136 - regression_loss: 1.2638 - classification_loss: 0.2498 348/500 [===================>..........] - ETA: 37s - loss: 1.5143 - regression_loss: 1.2644 - classification_loss: 0.2499 349/500 [===================>..........] - ETA: 37s - loss: 1.5136 - regression_loss: 1.2638 - classification_loss: 0.2497 350/500 [====================>.........] - ETA: 37s - loss: 1.5141 - regression_loss: 1.2643 - classification_loss: 0.2498 351/500 [====================>.........] - ETA: 37s - loss: 1.5147 - regression_loss: 1.2649 - classification_loss: 0.2498 352/500 [====================>.........] - ETA: 36s - loss: 1.5132 - regression_loss: 1.2637 - classification_loss: 0.2495 353/500 [====================>.........] - ETA: 36s - loss: 1.5138 - regression_loss: 1.2642 - classification_loss: 0.2495 354/500 [====================>.........] - ETA: 36s - loss: 1.5124 - regression_loss: 1.2631 - classification_loss: 0.2493 355/500 [====================>.........] - ETA: 36s - loss: 1.5107 - regression_loss: 1.2618 - classification_loss: 0.2489 356/500 [====================>.........] - ETA: 35s - loss: 1.5107 - regression_loss: 1.2619 - classification_loss: 0.2488 357/500 [====================>.........] - ETA: 35s - loss: 1.5105 - regression_loss: 1.2618 - classification_loss: 0.2486 358/500 [====================>.........] - ETA: 35s - loss: 1.5118 - regression_loss: 1.2630 - classification_loss: 0.2488 359/500 [====================>.........] - ETA: 35s - loss: 1.5139 - regression_loss: 1.2651 - classification_loss: 0.2488 360/500 [====================>.........] - ETA: 34s - loss: 1.5136 - regression_loss: 1.2648 - classification_loss: 0.2488 361/500 [====================>.........] 
- ETA: 34s - loss: 1.5125 - regression_loss: 1.2638 - classification_loss: 0.2487 362/500 [====================>.........] - ETA: 34s - loss: 1.5137 - regression_loss: 1.2643 - classification_loss: 0.2493 363/500 [====================>.........] - ETA: 34s - loss: 1.5149 - regression_loss: 1.2653 - classification_loss: 0.2496 364/500 [====================>.........] - ETA: 33s - loss: 1.5150 - regression_loss: 1.2653 - classification_loss: 0.2497 365/500 [====================>.........] - ETA: 33s - loss: 1.5153 - regression_loss: 1.2657 - classification_loss: 0.2496 366/500 [====================>.........] - ETA: 33s - loss: 1.5161 - regression_loss: 1.2665 - classification_loss: 0.2496 367/500 [=====================>........] - ETA: 33s - loss: 1.5183 - regression_loss: 1.2685 - classification_loss: 0.2498 368/500 [=====================>........] - ETA: 32s - loss: 1.5190 - regression_loss: 1.2691 - classification_loss: 0.2499 369/500 [=====================>........] - ETA: 32s - loss: 1.5194 - regression_loss: 1.2694 - classification_loss: 0.2500 370/500 [=====================>........] - ETA: 32s - loss: 1.5181 - regression_loss: 1.2679 - classification_loss: 0.2502 371/500 [=====================>........] - ETA: 32s - loss: 1.5185 - regression_loss: 1.2682 - classification_loss: 0.2502 372/500 [=====================>........] - ETA: 31s - loss: 1.5193 - regression_loss: 1.2690 - classification_loss: 0.2504 373/500 [=====================>........] - ETA: 31s - loss: 1.5201 - regression_loss: 1.2696 - classification_loss: 0.2505 374/500 [=====================>........] - ETA: 31s - loss: 1.5216 - regression_loss: 1.2706 - classification_loss: 0.2509 375/500 [=====================>........] - ETA: 31s - loss: 1.5217 - regression_loss: 1.2708 - classification_loss: 0.2509 376/500 [=====================>........] - ETA: 30s - loss: 1.5191 - regression_loss: 1.2686 - classification_loss: 0.2505 377/500 [=====================>........] 
- ETA: 30s - loss: 1.5193 - regression_loss: 1.2689 - classification_loss: 0.2504 378/500 [=====================>........] - ETA: 30s - loss: 1.5188 - regression_loss: 1.2686 - classification_loss: 0.2502 379/500 [=====================>........] - ETA: 30s - loss: 1.5192 - regression_loss: 1.2692 - classification_loss: 0.2500 380/500 [=====================>........] - ETA: 29s - loss: 1.5203 - regression_loss: 1.2702 - classification_loss: 0.2501 381/500 [=====================>........] - ETA: 29s - loss: 1.5212 - regression_loss: 1.2704 - classification_loss: 0.2508 382/500 [=====================>........] - ETA: 29s - loss: 1.5220 - regression_loss: 1.2711 - classification_loss: 0.2510 383/500 [=====================>........] - ETA: 29s - loss: 1.5227 - regression_loss: 1.2715 - classification_loss: 0.2511 384/500 [======================>.......] - ETA: 28s - loss: 1.5221 - regression_loss: 1.2711 - classification_loss: 0.2510 385/500 [======================>.......] - ETA: 28s - loss: 1.5210 - regression_loss: 1.2703 - classification_loss: 0.2507 386/500 [======================>.......] - ETA: 28s - loss: 1.5192 - regression_loss: 1.2689 - classification_loss: 0.2503 387/500 [======================>.......] - ETA: 28s - loss: 1.5193 - regression_loss: 1.2691 - classification_loss: 0.2502 388/500 [======================>.......] - ETA: 27s - loss: 1.5174 - regression_loss: 1.2676 - classification_loss: 0.2498 389/500 [======================>.......] - ETA: 27s - loss: 1.5172 - regression_loss: 1.2673 - classification_loss: 0.2499 390/500 [======================>.......] - ETA: 27s - loss: 1.5193 - regression_loss: 1.2691 - classification_loss: 0.2502 391/500 [======================>.......] - ETA: 27s - loss: 1.5207 - regression_loss: 1.2703 - classification_loss: 0.2504 392/500 [======================>.......] - ETA: 26s - loss: 1.5180 - regression_loss: 1.2680 - classification_loss: 0.2500 393/500 [======================>.......] 
- ETA: 26s - loss: 1.5181 - regression_loss: 1.2681 - classification_loss: 0.2500 394/500 [======================>.......] - ETA: 26s - loss: 1.5183 - regression_loss: 1.2682 - classification_loss: 0.2501 395/500 [======================>.......] - ETA: 26s - loss: 1.5192 - regression_loss: 1.2692 - classification_loss: 0.2500 396/500 [======================>.......] - ETA: 25s - loss: 1.5201 - regression_loss: 1.2700 - classification_loss: 0.2502 397/500 [======================>.......] - ETA: 25s - loss: 1.5190 - regression_loss: 1.2692 - classification_loss: 0.2498 398/500 [======================>.......] - ETA: 25s - loss: 1.5179 - regression_loss: 1.2684 - classification_loss: 0.2495 399/500 [======================>.......] - ETA: 25s - loss: 1.5186 - regression_loss: 1.2688 - classification_loss: 0.2497 400/500 [=======================>......] - ETA: 24s - loss: 1.5198 - regression_loss: 1.2697 - classification_loss: 0.2501 401/500 [=======================>......] - ETA: 24s - loss: 1.5202 - regression_loss: 1.2701 - classification_loss: 0.2502 402/500 [=======================>......] - ETA: 24s - loss: 1.5181 - regression_loss: 1.2683 - classification_loss: 0.2498 403/500 [=======================>......] - ETA: 24s - loss: 1.5191 - regression_loss: 1.2693 - classification_loss: 0.2498 404/500 [=======================>......] - ETA: 23s - loss: 1.5190 - regression_loss: 1.2693 - classification_loss: 0.2497 405/500 [=======================>......] - ETA: 23s - loss: 1.5192 - regression_loss: 1.2695 - classification_loss: 0.2497 406/500 [=======================>......] - ETA: 23s - loss: 1.5183 - regression_loss: 1.2689 - classification_loss: 0.2494 407/500 [=======================>......] - ETA: 23s - loss: 1.5180 - regression_loss: 1.2687 - classification_loss: 0.2493 408/500 [=======================>......] - ETA: 22s - loss: 1.5188 - regression_loss: 1.2692 - classification_loss: 0.2495 409/500 [=======================>......] 
- ETA: 22s - loss: 1.5202 - regression_loss: 1.2702 - classification_loss: 0.2501 410/500 [=======================>......] - ETA: 22s - loss: 1.5202 - regression_loss: 1.2699 - classification_loss: 0.2503 411/500 [=======================>......] - ETA: 22s - loss: 1.5208 - regression_loss: 1.2706 - classification_loss: 0.2502 412/500 [=======================>......] - ETA: 21s - loss: 1.5214 - regression_loss: 1.2709 - classification_loss: 0.2505 413/500 [=======================>......] - ETA: 21s - loss: 1.5203 - regression_loss: 1.2698 - classification_loss: 0.2504 414/500 [=======================>......] - ETA: 21s - loss: 1.5216 - regression_loss: 1.2705 - classification_loss: 0.2511 415/500 [=======================>......] - ETA: 21s - loss: 1.5206 - regression_loss: 1.2698 - classification_loss: 0.2508 416/500 [=======================>......] - ETA: 20s - loss: 1.5198 - regression_loss: 1.2692 - classification_loss: 0.2506 417/500 [========================>.....] - ETA: 20s - loss: 1.5196 - regression_loss: 1.2691 - classification_loss: 0.2505 418/500 [========================>.....] - ETA: 20s - loss: 1.5212 - regression_loss: 1.2705 - classification_loss: 0.2506 419/500 [========================>.....] - ETA: 20s - loss: 1.5219 - regression_loss: 1.2711 - classification_loss: 0.2508 420/500 [========================>.....] - ETA: 19s - loss: 1.5210 - regression_loss: 1.2705 - classification_loss: 0.2504 421/500 [========================>.....] - ETA: 19s - loss: 1.5193 - regression_loss: 1.2691 - classification_loss: 0.2502 422/500 [========================>.....] - ETA: 19s - loss: 1.5178 - regression_loss: 1.2679 - classification_loss: 0.2499 423/500 [========================>.....] - ETA: 19s - loss: 1.5196 - regression_loss: 1.2691 - classification_loss: 0.2504 424/500 [========================>.....] - ETA: 18s - loss: 1.5199 - regression_loss: 1.2694 - classification_loss: 0.2505 425/500 [========================>.....] 
- ETA: 18s - loss: 1.5198 - regression_loss: 1.2693 - classification_loss: 0.2506 426/500 [========================>.....] - ETA: 18s - loss: 1.5200 - regression_loss: 1.2695 - classification_loss: 0.2505 427/500 [========================>.....] - ETA: 18s - loss: 1.5221 - regression_loss: 1.2712 - classification_loss: 0.2509 428/500 [========================>.....] - ETA: 17s - loss: 1.5222 - regression_loss: 1.2712 - classification_loss: 0.2509 429/500 [========================>.....] - ETA: 17s - loss: 1.5227 - regression_loss: 1.2717 - classification_loss: 0.2510 430/500 [========================>.....] - ETA: 17s - loss: 1.5235 - regression_loss: 1.2724 - classification_loss: 0.2511 431/500 [========================>.....] - ETA: 17s - loss: 1.5241 - regression_loss: 1.2729 - classification_loss: 0.2512 432/500 [========================>.....] - ETA: 16s - loss: 1.5243 - regression_loss: 1.2729 - classification_loss: 0.2514 433/500 [========================>.....] - ETA: 16s - loss: 1.5251 - regression_loss: 1.2738 - classification_loss: 0.2514 434/500 [=========================>....] - ETA: 16s - loss: 1.5249 - regression_loss: 1.2735 - classification_loss: 0.2514 435/500 [=========================>....] - ETA: 16s - loss: 1.5269 - regression_loss: 1.2751 - classification_loss: 0.2518 436/500 [=========================>....] - ETA: 15s - loss: 1.5271 - regression_loss: 1.2750 - classification_loss: 0.2522 437/500 [=========================>....] - ETA: 15s - loss: 1.5289 - regression_loss: 1.2760 - classification_loss: 0.2529 438/500 [=========================>....] - ETA: 15s - loss: 1.5274 - regression_loss: 1.2748 - classification_loss: 0.2526 439/500 [=========================>....] - ETA: 15s - loss: 1.5288 - regression_loss: 1.2762 - classification_loss: 0.2526 440/500 [=========================>....] - ETA: 14s - loss: 1.5282 - regression_loss: 1.2754 - classification_loss: 0.2528 441/500 [=========================>....] 
- ETA: 14s - loss: 1.5290 - regression_loss: 1.2762 - classification_loss: 0.2528 442/500 [=========================>....] - ETA: 14s - loss: 1.5286 - regression_loss: 1.2759 - classification_loss: 0.2527 443/500 [=========================>....] - ETA: 14s - loss: 1.5282 - regression_loss: 1.2757 - classification_loss: 0.2526 444/500 [=========================>....] - ETA: 13s - loss: 1.5263 - regression_loss: 1.2742 - classification_loss: 0.2522 445/500 [=========================>....] - ETA: 13s - loss: 1.5245 - regression_loss: 1.2727 - classification_loss: 0.2518 446/500 [=========================>....] - ETA: 13s - loss: 1.5261 - regression_loss: 1.2738 - classification_loss: 0.2523 447/500 [=========================>....] - ETA: 13s - loss: 1.5248 - regression_loss: 1.2726 - classification_loss: 0.2521 448/500 [=========================>....] - ETA: 12s - loss: 1.5232 - regression_loss: 1.2713 - classification_loss: 0.2519 449/500 [=========================>....] - ETA: 12s - loss: 1.5218 - regression_loss: 1.2702 - classification_loss: 0.2516 450/500 [==========================>...] - ETA: 12s - loss: 1.5203 - regression_loss: 1.2689 - classification_loss: 0.2514 451/500 [==========================>...] - ETA: 12s - loss: 1.5210 - regression_loss: 1.2695 - classification_loss: 0.2515 452/500 [==========================>...] - ETA: 11s - loss: 1.5210 - regression_loss: 1.2696 - classification_loss: 0.2514 453/500 [==========================>...] - ETA: 11s - loss: 1.5193 - regression_loss: 1.2683 - classification_loss: 0.2510 454/500 [==========================>...] - ETA: 11s - loss: 1.5198 - regression_loss: 1.2687 - classification_loss: 0.2511 455/500 [==========================>...] - ETA: 11s - loss: 1.5208 - regression_loss: 1.2697 - classification_loss: 0.2511 456/500 [==========================>...] - ETA: 10s - loss: 1.5204 - regression_loss: 1.2692 - classification_loss: 0.2512 457/500 [==========================>...] 
- ETA: 10s - loss: 1.5210 - regression_loss: 1.2697 - classification_loss: 0.2513 458/500 [==========================>...] - ETA: 10s - loss: 1.5214 - regression_loss: 1.2702 - classification_loss: 0.2512 459/500 [==========================>...] - ETA: 10s - loss: 1.5236 - regression_loss: 1.2721 - classification_loss: 0.2516 460/500 [==========================>...] - ETA: 9s - loss: 1.5230 - regression_loss: 1.2716 - classification_loss: 0.2513  461/500 [==========================>...] - ETA: 9s - loss: 1.5236 - regression_loss: 1.2719 - classification_loss: 0.2517 462/500 [==========================>...] - ETA: 9s - loss: 1.5233 - regression_loss: 1.2718 - classification_loss: 0.2515 463/500 [==========================>...] - ETA: 9s - loss: 1.5251 - regression_loss: 1.2733 - classification_loss: 0.2518 464/500 [==========================>...] - ETA: 8s - loss: 1.5243 - regression_loss: 1.2725 - classification_loss: 0.2517 465/500 [==========================>...] - ETA: 8s - loss: 1.5233 - regression_loss: 1.2717 - classification_loss: 0.2516 466/500 [==========================>...] - ETA: 8s - loss: 1.5240 - regression_loss: 1.2724 - classification_loss: 0.2516 467/500 [===========================>..] - ETA: 8s - loss: 1.5244 - regression_loss: 1.2728 - classification_loss: 0.2516 468/500 [===========================>..] - ETA: 7s - loss: 1.5245 - regression_loss: 1.2729 - classification_loss: 0.2516 469/500 [===========================>..] - ETA: 7s - loss: 1.5249 - regression_loss: 1.2735 - classification_loss: 0.2514 470/500 [===========================>..] - ETA: 7s - loss: 1.5251 - regression_loss: 1.2738 - classification_loss: 0.2513 471/500 [===========================>..] - ETA: 7s - loss: 1.5250 - regression_loss: 1.2737 - classification_loss: 0.2513 472/500 [===========================>..] - ETA: 6s - loss: 1.5250 - regression_loss: 1.2737 - classification_loss: 0.2513 473/500 [===========================>..] 
- ETA: 6s - loss: 1.5262 - regression_loss: 1.2746 - classification_loss: 0.2517 474/500 [===========================>..] - ETA: 6s - loss: 1.5263 - regression_loss: 1.2747 - classification_loss: 0.2516 475/500 [===========================>..] - ETA: 6s - loss: 1.5261 - regression_loss: 1.2746 - classification_loss: 0.2515 476/500 [===========================>..] - ETA: 5s - loss: 1.5265 - regression_loss: 1.2749 - classification_loss: 0.2516 477/500 [===========================>..] - ETA: 5s - loss: 1.5262 - regression_loss: 1.2746 - classification_loss: 0.2516 478/500 [===========================>..] - ETA: 5s - loss: 1.5260 - regression_loss: 1.2745 - classification_loss: 0.2515 479/500 [===========================>..] - ETA: 5s - loss: 1.5260 - regression_loss: 1.2745 - classification_loss: 0.2515 480/500 [===========================>..] - ETA: 4s - loss: 1.5263 - regression_loss: 1.2747 - classification_loss: 0.2516 481/500 [===========================>..] - ETA: 4s - loss: 1.5270 - regression_loss: 1.2754 - classification_loss: 0.2516 482/500 [===========================>..] - ETA: 4s - loss: 1.5281 - regression_loss: 1.2764 - classification_loss: 0.2518 483/500 [===========================>..] - ETA: 4s - loss: 1.5281 - regression_loss: 1.2764 - classification_loss: 0.2517 484/500 [============================>.] - ETA: 3s - loss: 1.5289 - regression_loss: 1.2771 - classification_loss: 0.2519 485/500 [============================>.] - ETA: 3s - loss: 1.5307 - regression_loss: 1.2787 - classification_loss: 0.2520 486/500 [============================>.] - ETA: 3s - loss: 1.5306 - regression_loss: 1.2787 - classification_loss: 0.2519 487/500 [============================>.] - ETA: 3s - loss: 1.5312 - regression_loss: 1.2793 - classification_loss: 0.2520 488/500 [============================>.] - ETA: 2s - loss: 1.5318 - regression_loss: 1.2796 - classification_loss: 0.2521 489/500 [============================>.] 
- ETA: 2s - loss: 1.5302 - regression_loss: 1.2782 - classification_loss: 0.2520 490/500 [============================>.] - ETA: 2s - loss: 1.5299 - regression_loss: 1.2780 - classification_loss: 0.2519 491/500 [============================>.] - ETA: 2s - loss: 1.5279 - regression_loss: 1.2761 - classification_loss: 0.2518 492/500 [============================>.] - ETA: 1s - loss: 1.5270 - regression_loss: 1.2755 - classification_loss: 0.2515 493/500 [============================>.] - ETA: 1s - loss: 1.5267 - regression_loss: 1.2752 - classification_loss: 0.2515 494/500 [============================>.] - ETA: 1s - loss: 1.5272 - regression_loss: 1.2756 - classification_loss: 0.2515 495/500 [============================>.] - ETA: 1s - loss: 1.5282 - regression_loss: 1.2765 - classification_loss: 0.2517 496/500 [============================>.] - ETA: 0s - loss: 1.5294 - regression_loss: 1.2774 - classification_loss: 0.2520 497/500 [============================>.] - ETA: 0s - loss: 1.5277 - regression_loss: 1.2759 - classification_loss: 0.2518 498/500 [============================>.] - ETA: 0s - loss: 1.5282 - regression_loss: 1.2763 - classification_loss: 0.2519 499/500 [============================>.] - ETA: 0s - loss: 1.5287 - regression_loss: 1.2766 - classification_loss: 0.2521 500/500 [==============================] - 125s 250ms/step - loss: 1.5272 - regression_loss: 1.2753 - classification_loss: 0.2519 1172 instances of class plum with average precision: 0.6633 mAP: 0.6633 Epoch 00092: saving model to ./training/snapshots/resnet50_pascal_92.h5 Epoch 93/150 1/500 [..............................] - ETA: 1:54 - loss: 1.8559 - regression_loss: 1.5852 - classification_loss: 0.2707 2/500 [..............................] - ETA: 1:58 - loss: 1.4653 - regression_loss: 1.2813 - classification_loss: 0.1840 3/500 [..............................] - ETA: 2:02 - loss: 1.4962 - regression_loss: 1.3381 - classification_loss: 0.1581 4/500 [..............................] 
- ETA: 2:02 - loss: 1.4577 - regression_loss: 1.2943 - classification_loss: 0.1634 5/500 [..............................] - ETA: 2:02 - loss: 1.3792 - regression_loss: 1.2182 - classification_loss: 0.1610 6/500 [..............................] - ETA: 2:02 - loss: 1.4439 - regression_loss: 1.2668 - classification_loss: 0.1771 7/500 [..............................] - ETA: 2:03 - loss: 1.7060 - regression_loss: 1.4781 - classification_loss: 0.2280 8/500 [..............................] - ETA: 2:03 - loss: 1.7660 - regression_loss: 1.5213 - classification_loss: 0.2447 9/500 [..............................] - ETA: 2:03 - loss: 1.7702 - regression_loss: 1.5134 - classification_loss: 0.2568 10/500 [..............................] - ETA: 2:03 - loss: 1.7446 - regression_loss: 1.4917 - classification_loss: 0.2528 11/500 [..............................] - ETA: 2:03 - loss: 1.7342 - regression_loss: 1.4762 - classification_loss: 0.2580 12/500 [..............................] - ETA: 2:03 - loss: 1.7469 - regression_loss: 1.4792 - classification_loss: 0.2677 13/500 [..............................] - ETA: 2:03 - loss: 1.7337 - regression_loss: 1.4664 - classification_loss: 0.2673 14/500 [..............................] - ETA: 2:02 - loss: 1.7132 - regression_loss: 1.4495 - classification_loss: 0.2637 15/500 [..............................] - ETA: 2:02 - loss: 1.7184 - regression_loss: 1.4508 - classification_loss: 0.2677 16/500 [..............................] - ETA: 2:02 - loss: 1.7106 - regression_loss: 1.4372 - classification_loss: 0.2734 17/500 [>.............................] - ETA: 2:02 - loss: 1.6944 - regression_loss: 1.4301 - classification_loss: 0.2643 18/500 [>.............................] - ETA: 2:02 - loss: 1.6975 - regression_loss: 1.4343 - classification_loss: 0.2631 19/500 [>.............................] - ETA: 2:01 - loss: 1.7059 - regression_loss: 1.4410 - classification_loss: 0.2649 20/500 [>.............................] 
 21/500 [>.............................] - ETA: 2:01 - loss: 1.6654 - regression_loss: 1.4061 - classification_loss: 0.2593
 50/500 [==>...........................] - ETA: 1:53 - loss: 1.5856 - regression_loss: 1.3271 - classification_loss: 0.2585
100/500 [=====>........................] - ETA: 1:40 - loss: 1.5592 - regression_loss: 1.3026 - classification_loss: 0.2566
150/500 [========>.....................] - ETA: 1:27 - loss: 1.5512 - regression_loss: 1.2966 - classification_loss: 0.2546
200/500 [===========>..................] - ETA: 1:15 - loss: 1.5456 - regression_loss: 1.2900 - classification_loss: 0.2557
250/500 [==============>...............] - ETA: 1:02 - loss: 1.5448 - regression_loss: 1.2892 - classification_loss: 0.2556
300/500 [=================>............] - ETA: 50s - loss: 1.5475 - regression_loss: 1.2924 - classification_loss: 0.2551
350/500 [====================>.........] - ETA: 37s - loss: 1.5487 - regression_loss: 1.2944 - classification_loss: 0.2544
355/500 [====================>.........] - ETA: 36s - loss: 1.5458 - regression_loss: 1.2918 - classification_loss: 0.2539
356/500 [====================>.........] 
- ETA: 36s - loss: 1.5449 - regression_loss: 1.2910 - classification_loss: 0.2539 357/500 [====================>.........] - ETA: 35s - loss: 1.5442 - regression_loss: 1.2906 - classification_loss: 0.2536 358/500 [====================>.........] - ETA: 35s - loss: 1.5438 - regression_loss: 1.2903 - classification_loss: 0.2535 359/500 [====================>.........] - ETA: 35s - loss: 1.5416 - regression_loss: 1.2885 - classification_loss: 0.2531 360/500 [====================>.........] - ETA: 35s - loss: 1.5409 - regression_loss: 1.2878 - classification_loss: 0.2531 361/500 [====================>.........] - ETA: 34s - loss: 1.5408 - regression_loss: 1.2877 - classification_loss: 0.2531 362/500 [====================>.........] - ETA: 34s - loss: 1.5421 - regression_loss: 1.2887 - classification_loss: 0.2534 363/500 [====================>.........] - ETA: 34s - loss: 1.5400 - regression_loss: 1.2871 - classification_loss: 0.2529 364/500 [====================>.........] - ETA: 34s - loss: 1.5397 - regression_loss: 1.2870 - classification_loss: 0.2528 365/500 [====================>.........] - ETA: 33s - loss: 1.5390 - regression_loss: 1.2863 - classification_loss: 0.2527 366/500 [====================>.........] - ETA: 33s - loss: 1.5394 - regression_loss: 1.2866 - classification_loss: 0.2528 367/500 [=====================>........] - ETA: 33s - loss: 1.5400 - regression_loss: 1.2874 - classification_loss: 0.2525 368/500 [=====================>........] - ETA: 33s - loss: 1.5433 - regression_loss: 1.2902 - classification_loss: 0.2531 369/500 [=====================>........] - ETA: 32s - loss: 1.5434 - regression_loss: 1.2902 - classification_loss: 0.2532 370/500 [=====================>........] - ETA: 32s - loss: 1.5442 - regression_loss: 1.2909 - classification_loss: 0.2533 371/500 [=====================>........] - ETA: 32s - loss: 1.5446 - regression_loss: 1.2914 - classification_loss: 0.2532 372/500 [=====================>........] 
- ETA: 32s - loss: 1.5450 - regression_loss: 1.2918 - classification_loss: 0.2532 373/500 [=====================>........] - ETA: 31s - loss: 1.5464 - regression_loss: 1.2932 - classification_loss: 0.2532 374/500 [=====================>........] - ETA: 31s - loss: 1.5473 - regression_loss: 1.2940 - classification_loss: 0.2532 375/500 [=====================>........] - ETA: 31s - loss: 1.5476 - regression_loss: 1.2944 - classification_loss: 0.2532 376/500 [=====================>........] - ETA: 31s - loss: 1.5467 - regression_loss: 1.2937 - classification_loss: 0.2530 377/500 [=====================>........] - ETA: 30s - loss: 1.5468 - regression_loss: 1.2931 - classification_loss: 0.2537 378/500 [=====================>........] - ETA: 30s - loss: 1.5461 - regression_loss: 1.2925 - classification_loss: 0.2536 379/500 [=====================>........] - ETA: 30s - loss: 1.5476 - regression_loss: 1.2936 - classification_loss: 0.2540 380/500 [=====================>........] - ETA: 30s - loss: 1.5461 - regression_loss: 1.2925 - classification_loss: 0.2536 381/500 [=====================>........] - ETA: 29s - loss: 1.5473 - regression_loss: 1.2932 - classification_loss: 0.2541 382/500 [=====================>........] - ETA: 29s - loss: 1.5447 - regression_loss: 1.2909 - classification_loss: 0.2539 383/500 [=====================>........] - ETA: 29s - loss: 1.5460 - regression_loss: 1.2917 - classification_loss: 0.2543 384/500 [======================>.......] - ETA: 29s - loss: 1.5455 - regression_loss: 1.2913 - classification_loss: 0.2542 385/500 [======================>.......] - ETA: 28s - loss: 1.5457 - regression_loss: 1.2914 - classification_loss: 0.2542 386/500 [======================>.......] - ETA: 28s - loss: 1.5460 - regression_loss: 1.2918 - classification_loss: 0.2541 387/500 [======================>.......] - ETA: 28s - loss: 1.5474 - regression_loss: 1.2928 - classification_loss: 0.2546 388/500 [======================>.......] 
- ETA: 28s - loss: 1.5474 - regression_loss: 1.2927 - classification_loss: 0.2547 389/500 [======================>.......] - ETA: 27s - loss: 1.5483 - regression_loss: 1.2933 - classification_loss: 0.2550 390/500 [======================>.......] - ETA: 27s - loss: 1.5491 - regression_loss: 1.2940 - classification_loss: 0.2551 391/500 [======================>.......] - ETA: 27s - loss: 1.5502 - regression_loss: 1.2947 - classification_loss: 0.2555 392/500 [======================>.......] - ETA: 27s - loss: 1.5507 - regression_loss: 1.2952 - classification_loss: 0.2555 393/500 [======================>.......] - ETA: 26s - loss: 1.5504 - regression_loss: 1.2950 - classification_loss: 0.2554 394/500 [======================>.......] - ETA: 26s - loss: 1.5497 - regression_loss: 1.2945 - classification_loss: 0.2552 395/500 [======================>.......] - ETA: 26s - loss: 1.5493 - regression_loss: 1.2943 - classification_loss: 0.2550 396/500 [======================>.......] - ETA: 26s - loss: 1.5492 - regression_loss: 1.2942 - classification_loss: 0.2550 397/500 [======================>.......] - ETA: 25s - loss: 1.5495 - regression_loss: 1.2945 - classification_loss: 0.2550 398/500 [======================>.......] - ETA: 25s - loss: 1.5483 - regression_loss: 1.2936 - classification_loss: 0.2547 399/500 [======================>.......] - ETA: 25s - loss: 1.5470 - regression_loss: 1.2924 - classification_loss: 0.2546 400/500 [=======================>......] - ETA: 25s - loss: 1.5473 - regression_loss: 1.2927 - classification_loss: 0.2546 401/500 [=======================>......] - ETA: 24s - loss: 1.5467 - regression_loss: 1.2924 - classification_loss: 0.2543 402/500 [=======================>......] - ETA: 24s - loss: 1.5473 - regression_loss: 1.2929 - classification_loss: 0.2544 403/500 [=======================>......] - ETA: 24s - loss: 1.5470 - regression_loss: 1.2927 - classification_loss: 0.2543 404/500 [=======================>......] 
- ETA: 24s - loss: 1.5482 - regression_loss: 1.2937 - classification_loss: 0.2546 405/500 [=======================>......] - ETA: 23s - loss: 1.5488 - regression_loss: 1.2943 - classification_loss: 0.2545 406/500 [=======================>......] - ETA: 23s - loss: 1.5493 - regression_loss: 1.2947 - classification_loss: 0.2546 407/500 [=======================>......] - ETA: 23s - loss: 1.5501 - regression_loss: 1.2954 - classification_loss: 0.2547 408/500 [=======================>......] - ETA: 23s - loss: 1.5492 - regression_loss: 1.2946 - classification_loss: 0.2546 409/500 [=======================>......] - ETA: 22s - loss: 1.5477 - regression_loss: 1.2935 - classification_loss: 0.2542 410/500 [=======================>......] - ETA: 22s - loss: 1.5499 - regression_loss: 1.2952 - classification_loss: 0.2547 411/500 [=======================>......] - ETA: 22s - loss: 1.5505 - regression_loss: 1.2956 - classification_loss: 0.2549 412/500 [=======================>......] - ETA: 22s - loss: 1.5517 - regression_loss: 1.2966 - classification_loss: 0.2551 413/500 [=======================>......] - ETA: 21s - loss: 1.5546 - regression_loss: 1.2990 - classification_loss: 0.2556 414/500 [=======================>......] - ETA: 21s - loss: 1.5543 - regression_loss: 1.2987 - classification_loss: 0.2556 415/500 [=======================>......] - ETA: 21s - loss: 1.5524 - regression_loss: 1.2971 - classification_loss: 0.2554 416/500 [=======================>......] - ETA: 21s - loss: 1.5509 - regression_loss: 1.2959 - classification_loss: 0.2551 417/500 [========================>.....] - ETA: 20s - loss: 1.5508 - regression_loss: 1.2958 - classification_loss: 0.2550 418/500 [========================>.....] - ETA: 20s - loss: 1.5518 - regression_loss: 1.2967 - classification_loss: 0.2551 419/500 [========================>.....] - ETA: 20s - loss: 1.5533 - regression_loss: 1.2979 - classification_loss: 0.2555 420/500 [========================>.....] 
- ETA: 20s - loss: 1.5537 - regression_loss: 1.2983 - classification_loss: 0.2554 421/500 [========================>.....] - ETA: 19s - loss: 1.5520 - regression_loss: 1.2970 - classification_loss: 0.2550 422/500 [========================>.....] - ETA: 19s - loss: 1.5529 - regression_loss: 1.2978 - classification_loss: 0.2552 423/500 [========================>.....] - ETA: 19s - loss: 1.5532 - regression_loss: 1.2980 - classification_loss: 0.2552 424/500 [========================>.....] - ETA: 19s - loss: 1.5545 - regression_loss: 1.2988 - classification_loss: 0.2558 425/500 [========================>.....] - ETA: 18s - loss: 1.5531 - regression_loss: 1.2977 - classification_loss: 0.2554 426/500 [========================>.....] - ETA: 18s - loss: 1.5525 - regression_loss: 1.2973 - classification_loss: 0.2552 427/500 [========================>.....] - ETA: 18s - loss: 1.5505 - regression_loss: 1.2956 - classification_loss: 0.2549 428/500 [========================>.....] - ETA: 18s - loss: 1.5501 - regression_loss: 1.2953 - classification_loss: 0.2548 429/500 [========================>.....] - ETA: 17s - loss: 1.5501 - regression_loss: 1.2953 - classification_loss: 0.2548 430/500 [========================>.....] - ETA: 17s - loss: 1.5486 - regression_loss: 1.2940 - classification_loss: 0.2545 431/500 [========================>.....] - ETA: 17s - loss: 1.5477 - regression_loss: 1.2935 - classification_loss: 0.2542 432/500 [========================>.....] - ETA: 17s - loss: 1.5482 - regression_loss: 1.2934 - classification_loss: 0.2548 433/500 [========================>.....] - ETA: 16s - loss: 1.5480 - regression_loss: 1.2933 - classification_loss: 0.2547 434/500 [=========================>....] - ETA: 16s - loss: 1.5476 - regression_loss: 1.2930 - classification_loss: 0.2546 435/500 [=========================>....] - ETA: 16s - loss: 1.5468 - regression_loss: 1.2924 - classification_loss: 0.2544 436/500 [=========================>....] 
- ETA: 16s - loss: 1.5460 - regression_loss: 1.2918 - classification_loss: 0.2543 437/500 [=========================>....] - ETA: 15s - loss: 1.5453 - regression_loss: 1.2914 - classification_loss: 0.2540 438/500 [=========================>....] - ETA: 15s - loss: 1.5452 - regression_loss: 1.2914 - classification_loss: 0.2539 439/500 [=========================>....] - ETA: 15s - loss: 1.5454 - regression_loss: 1.2915 - classification_loss: 0.2539 440/500 [=========================>....] - ETA: 15s - loss: 1.5470 - regression_loss: 1.2924 - classification_loss: 0.2546 441/500 [=========================>....] - ETA: 14s - loss: 1.5464 - regression_loss: 1.2920 - classification_loss: 0.2544 442/500 [=========================>....] - ETA: 14s - loss: 1.5450 - regression_loss: 1.2909 - classification_loss: 0.2541 443/500 [=========================>....] - ETA: 14s - loss: 1.5454 - regression_loss: 1.2912 - classification_loss: 0.2542 444/500 [=========================>....] - ETA: 14s - loss: 1.5452 - regression_loss: 1.2911 - classification_loss: 0.2542 445/500 [=========================>....] - ETA: 13s - loss: 1.5458 - regression_loss: 1.2916 - classification_loss: 0.2542 446/500 [=========================>....] - ETA: 13s - loss: 1.5469 - regression_loss: 1.2926 - classification_loss: 0.2543 447/500 [=========================>....] - ETA: 13s - loss: 1.5452 - regression_loss: 1.2913 - classification_loss: 0.2540 448/500 [=========================>....] - ETA: 13s - loss: 1.5463 - regression_loss: 1.2920 - classification_loss: 0.2542 449/500 [=========================>....] - ETA: 12s - loss: 1.5464 - regression_loss: 1.2922 - classification_loss: 0.2542 450/500 [==========================>...] - ETA: 12s - loss: 1.5444 - regression_loss: 1.2906 - classification_loss: 0.2538 451/500 [==========================>...] - ETA: 12s - loss: 1.5458 - regression_loss: 1.2917 - classification_loss: 0.2541 452/500 [==========================>...] 
- ETA: 12s - loss: 1.5456 - regression_loss: 1.2915 - classification_loss: 0.2541 453/500 [==========================>...] - ETA: 11s - loss: 1.5438 - regression_loss: 1.2900 - classification_loss: 0.2538 454/500 [==========================>...] - ETA: 11s - loss: 1.5466 - regression_loss: 1.2921 - classification_loss: 0.2545 455/500 [==========================>...] - ETA: 11s - loss: 1.5461 - regression_loss: 1.2915 - classification_loss: 0.2546 456/500 [==========================>...] - ETA: 11s - loss: 1.5473 - regression_loss: 1.2925 - classification_loss: 0.2547 457/500 [==========================>...] - ETA: 10s - loss: 1.5481 - regression_loss: 1.2932 - classification_loss: 0.2549 458/500 [==========================>...] - ETA: 10s - loss: 1.5495 - regression_loss: 1.2944 - classification_loss: 0.2551 459/500 [==========================>...] - ETA: 10s - loss: 1.5476 - regression_loss: 1.2928 - classification_loss: 0.2548 460/500 [==========================>...] - ETA: 10s - loss: 1.5481 - regression_loss: 1.2930 - classification_loss: 0.2551 461/500 [==========================>...] - ETA: 9s - loss: 1.5479 - regression_loss: 1.2929 - classification_loss: 0.2551  462/500 [==========================>...] - ETA: 9s - loss: 1.5489 - regression_loss: 1.2937 - classification_loss: 0.2552 463/500 [==========================>...] - ETA: 9s - loss: 1.5491 - regression_loss: 1.2936 - classification_loss: 0.2555 464/500 [==========================>...] - ETA: 9s - loss: 1.5488 - regression_loss: 1.2935 - classification_loss: 0.2553 465/500 [==========================>...] - ETA: 8s - loss: 1.5495 - regression_loss: 1.2942 - classification_loss: 0.2554 466/500 [==========================>...] - ETA: 8s - loss: 1.5493 - regression_loss: 1.2941 - classification_loss: 0.2552 467/500 [===========================>..] - ETA: 8s - loss: 1.5488 - regression_loss: 1.2937 - classification_loss: 0.2551 468/500 [===========================>..] 
- ETA: 8s - loss: 1.5475 - regression_loss: 1.2927 - classification_loss: 0.2548 469/500 [===========================>..] - ETA: 7s - loss: 1.5479 - regression_loss: 1.2929 - classification_loss: 0.2549 470/500 [===========================>..] - ETA: 7s - loss: 1.5467 - regression_loss: 1.2920 - classification_loss: 0.2547 471/500 [===========================>..] - ETA: 7s - loss: 1.5469 - regression_loss: 1.2922 - classification_loss: 0.2547 472/500 [===========================>..] - ETA: 7s - loss: 1.5451 - regression_loss: 1.2907 - classification_loss: 0.2543 473/500 [===========================>..] - ETA: 6s - loss: 1.5460 - regression_loss: 1.2915 - classification_loss: 0.2545 474/500 [===========================>..] - ETA: 6s - loss: 1.5472 - regression_loss: 1.2924 - classification_loss: 0.2548 475/500 [===========================>..] - ETA: 6s - loss: 1.5470 - regression_loss: 1.2921 - classification_loss: 0.2549 476/500 [===========================>..] - ETA: 6s - loss: 1.5473 - regression_loss: 1.2924 - classification_loss: 0.2549 477/500 [===========================>..] - ETA: 5s - loss: 1.5478 - regression_loss: 1.2928 - classification_loss: 0.2550 478/500 [===========================>..] - ETA: 5s - loss: 1.5479 - regression_loss: 1.2930 - classification_loss: 0.2549 479/500 [===========================>..] - ETA: 5s - loss: 1.5486 - regression_loss: 1.2936 - classification_loss: 0.2550 480/500 [===========================>..] - ETA: 5s - loss: 1.5508 - regression_loss: 1.2954 - classification_loss: 0.2555 481/500 [===========================>..] - ETA: 4s - loss: 1.5504 - regression_loss: 1.2952 - classification_loss: 0.2553 482/500 [===========================>..] - ETA: 4s - loss: 1.5508 - regression_loss: 1.2954 - classification_loss: 0.2554 483/500 [===========================>..] - ETA: 4s - loss: 1.5486 - regression_loss: 1.2935 - classification_loss: 0.2551 484/500 [============================>.] 
- ETA: 4s - loss: 1.5496 - regression_loss: 1.2945 - classification_loss: 0.2552 485/500 [============================>.] - ETA: 3s - loss: 1.5499 - regression_loss: 1.2947 - classification_loss: 0.2551 486/500 [============================>.] - ETA: 3s - loss: 1.5513 - regression_loss: 1.2958 - classification_loss: 0.2555 487/500 [============================>.] - ETA: 3s - loss: 1.5511 - regression_loss: 1.2957 - classification_loss: 0.2554 488/500 [============================>.] - ETA: 3s - loss: 1.5507 - regression_loss: 1.2954 - classification_loss: 0.2552 489/500 [============================>.] - ETA: 2s - loss: 1.5512 - regression_loss: 1.2959 - classification_loss: 0.2552 490/500 [============================>.] - ETA: 2s - loss: 1.5514 - regression_loss: 1.2960 - classification_loss: 0.2554 491/500 [============================>.] - ETA: 2s - loss: 1.5500 - regression_loss: 1.2949 - classification_loss: 0.2551 492/500 [============================>.] - ETA: 2s - loss: 1.5486 - regression_loss: 1.2936 - classification_loss: 0.2550 493/500 [============================>.] - ETA: 1s - loss: 1.5486 - regression_loss: 1.2938 - classification_loss: 0.2548 494/500 [============================>.] - ETA: 1s - loss: 1.5482 - regression_loss: 1.2934 - classification_loss: 0.2548 495/500 [============================>.] - ETA: 1s - loss: 1.5483 - regression_loss: 1.2935 - classification_loss: 0.2548 496/500 [============================>.] - ETA: 1s - loss: 1.5487 - regression_loss: 1.2937 - classification_loss: 0.2549 497/500 [============================>.] - ETA: 0s - loss: 1.5491 - regression_loss: 1.2939 - classification_loss: 0.2552 498/500 [============================>.] - ETA: 0s - loss: 1.5478 - regression_loss: 1.2928 - classification_loss: 0.2550 499/500 [============================>.] 
500/500 [==============================] - 125s 251ms/step - loss: 1.5470 - regression_loss: 1.2924 - classification_loss: 0.2546
1172 instances of class plum with average precision: 0.6489
mAP: 0.6489
Epoch 00093: saving model to ./training/snapshots/resnet50_pascal_93.h5
Epoch 94/150
[per-batch progress output for steps 1/500 through 14/500 of epoch 94 elided]
[per-batch progress output for steps 15/500 through 126/500 of epoch 94 elided; loss hovered around 1.50-1.60]
- ETA: 1:34 - loss: 1.5415 - regression_loss: 1.2876 - classification_loss: 0.2539 127/500 [======>.......................] - ETA: 1:34 - loss: 1.5426 - regression_loss: 1.2885 - classification_loss: 0.2541 128/500 [======>.......................] - ETA: 1:33 - loss: 1.5432 - regression_loss: 1.2889 - classification_loss: 0.2543 129/500 [======>.......................] - ETA: 1:33 - loss: 1.5387 - regression_loss: 1.2857 - classification_loss: 0.2531 130/500 [======>.......................] - ETA: 1:33 - loss: 1.5341 - regression_loss: 1.2824 - classification_loss: 0.2517 131/500 [======>.......................] - ETA: 1:32 - loss: 1.5313 - regression_loss: 1.2804 - classification_loss: 0.2509 132/500 [======>.......................] - ETA: 1:32 - loss: 1.5321 - regression_loss: 1.2811 - classification_loss: 0.2510 133/500 [======>.......................] - ETA: 1:32 - loss: 1.5350 - regression_loss: 1.2834 - classification_loss: 0.2515 134/500 [=======>......................] - ETA: 1:32 - loss: 1.5327 - regression_loss: 1.2818 - classification_loss: 0.2509 135/500 [=======>......................] - ETA: 1:31 - loss: 1.5295 - regression_loss: 1.2793 - classification_loss: 0.2502 136/500 [=======>......................] - ETA: 1:31 - loss: 1.5246 - regression_loss: 1.2758 - classification_loss: 0.2488 137/500 [=======>......................] - ETA: 1:31 - loss: 1.5256 - regression_loss: 1.2764 - classification_loss: 0.2492 138/500 [=======>......................] - ETA: 1:31 - loss: 1.5282 - regression_loss: 1.2783 - classification_loss: 0.2499 139/500 [=======>......................] - ETA: 1:30 - loss: 1.5257 - regression_loss: 1.2768 - classification_loss: 0.2488 140/500 [=======>......................] - ETA: 1:30 - loss: 1.5242 - regression_loss: 1.2756 - classification_loss: 0.2485 141/500 [=======>......................] - ETA: 1:30 - loss: 1.5198 - regression_loss: 1.2715 - classification_loss: 0.2483 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.5175 - regression_loss: 1.2696 - classification_loss: 0.2480 143/500 [=======>......................] - ETA: 1:29 - loss: 1.5195 - regression_loss: 1.2710 - classification_loss: 0.2485 144/500 [=======>......................] - ETA: 1:29 - loss: 1.5208 - regression_loss: 1.2719 - classification_loss: 0.2489 145/500 [=======>......................] - ETA: 1:28 - loss: 1.5239 - regression_loss: 1.2735 - classification_loss: 0.2505 146/500 [=======>......................] - ETA: 1:28 - loss: 1.5235 - regression_loss: 1.2734 - classification_loss: 0.2501 147/500 [=======>......................] - ETA: 1:28 - loss: 1.5232 - regression_loss: 1.2732 - classification_loss: 0.2500 148/500 [=======>......................] - ETA: 1:28 - loss: 1.5207 - regression_loss: 1.2707 - classification_loss: 0.2500 149/500 [=======>......................] - ETA: 1:27 - loss: 1.5215 - regression_loss: 1.2712 - classification_loss: 0.2503 150/500 [========>.....................] - ETA: 1:27 - loss: 1.5171 - regression_loss: 1.2676 - classification_loss: 0.2495 151/500 [========>.....................] - ETA: 1:27 - loss: 1.5135 - regression_loss: 1.2648 - classification_loss: 0.2487 152/500 [========>.....................] - ETA: 1:27 - loss: 1.5143 - regression_loss: 1.2656 - classification_loss: 0.2486 153/500 [========>.....................] - ETA: 1:26 - loss: 1.5136 - regression_loss: 1.2647 - classification_loss: 0.2489 154/500 [========>.....................] - ETA: 1:26 - loss: 1.5067 - regression_loss: 1.2590 - classification_loss: 0.2478 155/500 [========>.....................] - ETA: 1:26 - loss: 1.5084 - regression_loss: 1.2608 - classification_loss: 0.2477 156/500 [========>.....................] - ETA: 1:26 - loss: 1.5111 - regression_loss: 1.2631 - classification_loss: 0.2480 157/500 [========>.....................] - ETA: 1:25 - loss: 1.5093 - regression_loss: 1.2621 - classification_loss: 0.2473 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.5096 - regression_loss: 1.2626 - classification_loss: 0.2470 159/500 [========>.....................] - ETA: 1:25 - loss: 1.5108 - regression_loss: 1.2637 - classification_loss: 0.2471 160/500 [========>.....................] - ETA: 1:25 - loss: 1.5125 - regression_loss: 1.2653 - classification_loss: 0.2471 161/500 [========>.....................] - ETA: 1:24 - loss: 1.5128 - regression_loss: 1.2657 - classification_loss: 0.2470 162/500 [========>.....................] - ETA: 1:24 - loss: 1.5161 - regression_loss: 1.2690 - classification_loss: 0.2470 163/500 [========>.....................] - ETA: 1:24 - loss: 1.5169 - regression_loss: 1.2703 - classification_loss: 0.2466 164/500 [========>.....................] - ETA: 1:24 - loss: 1.5108 - regression_loss: 1.2650 - classification_loss: 0.2458 165/500 [========>.....................] - ETA: 1:23 - loss: 1.5092 - regression_loss: 1.2638 - classification_loss: 0.2454 166/500 [========>.....................] - ETA: 1:23 - loss: 1.5088 - regression_loss: 1.2634 - classification_loss: 0.2454 167/500 [=========>....................] - ETA: 1:23 - loss: 1.5125 - regression_loss: 1.2661 - classification_loss: 0.2465 168/500 [=========>....................] - ETA: 1:23 - loss: 1.5128 - regression_loss: 1.2664 - classification_loss: 0.2464 169/500 [=========>....................] - ETA: 1:22 - loss: 1.5140 - regression_loss: 1.2675 - classification_loss: 0.2465 170/500 [=========>....................] - ETA: 1:22 - loss: 1.5199 - regression_loss: 1.2729 - classification_loss: 0.2470 171/500 [=========>....................] - ETA: 1:22 - loss: 1.5228 - regression_loss: 1.2756 - classification_loss: 0.2472 172/500 [=========>....................] - ETA: 1:22 - loss: 1.5203 - regression_loss: 1.2735 - classification_loss: 0.2467 173/500 [=========>....................] - ETA: 1:21 - loss: 1.5224 - regression_loss: 1.2754 - classification_loss: 0.2469 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.5220 - regression_loss: 1.2753 - classification_loss: 0.2467 175/500 [=========>....................] - ETA: 1:21 - loss: 1.5200 - regression_loss: 1.2736 - classification_loss: 0.2464 176/500 [=========>....................] - ETA: 1:21 - loss: 1.5155 - regression_loss: 1.2700 - classification_loss: 0.2455 177/500 [=========>....................] - ETA: 1:20 - loss: 1.5165 - regression_loss: 1.2704 - classification_loss: 0.2461 178/500 [=========>....................] - ETA: 1:20 - loss: 1.5169 - regression_loss: 1.2708 - classification_loss: 0.2461 179/500 [=========>....................] - ETA: 1:20 - loss: 1.5195 - regression_loss: 1.2730 - classification_loss: 0.2465 180/500 [=========>....................] - ETA: 1:20 - loss: 1.5201 - regression_loss: 1.2737 - classification_loss: 0.2465 181/500 [=========>....................] - ETA: 1:19 - loss: 1.5231 - regression_loss: 1.2761 - classification_loss: 0.2470 182/500 [=========>....................] - ETA: 1:19 - loss: 1.5223 - regression_loss: 1.2742 - classification_loss: 0.2481 183/500 [=========>....................] - ETA: 1:19 - loss: 1.5220 - regression_loss: 1.2739 - classification_loss: 0.2481 184/500 [==========>...................] - ETA: 1:19 - loss: 1.5265 - regression_loss: 1.2779 - classification_loss: 0.2486 185/500 [==========>...................] - ETA: 1:18 - loss: 1.5267 - regression_loss: 1.2778 - classification_loss: 0.2489 186/500 [==========>...................] - ETA: 1:18 - loss: 1.5276 - regression_loss: 1.2785 - classification_loss: 0.2491 187/500 [==========>...................] - ETA: 1:18 - loss: 1.5279 - regression_loss: 1.2791 - classification_loss: 0.2488 188/500 [==========>...................] - ETA: 1:18 - loss: 1.5227 - regression_loss: 1.2747 - classification_loss: 0.2480 189/500 [==========>...................] - ETA: 1:17 - loss: 1.5261 - regression_loss: 1.2777 - classification_loss: 0.2484 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.5276 - regression_loss: 1.2788 - classification_loss: 0.2488 191/500 [==========>...................] - ETA: 1:17 - loss: 1.5232 - regression_loss: 1.2751 - classification_loss: 0.2481 192/500 [==========>...................] - ETA: 1:17 - loss: 1.5246 - regression_loss: 1.2760 - classification_loss: 0.2486 193/500 [==========>...................] - ETA: 1:16 - loss: 1.5231 - regression_loss: 1.2750 - classification_loss: 0.2481 194/500 [==========>...................] - ETA: 1:16 - loss: 1.5220 - regression_loss: 1.2740 - classification_loss: 0.2480 195/500 [==========>...................] - ETA: 1:16 - loss: 1.5206 - regression_loss: 1.2729 - classification_loss: 0.2477 196/500 [==========>...................] - ETA: 1:16 - loss: 1.5223 - regression_loss: 1.2738 - classification_loss: 0.2485 197/500 [==========>...................] - ETA: 1:15 - loss: 1.5232 - regression_loss: 1.2747 - classification_loss: 0.2484 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5256 - regression_loss: 1.2760 - classification_loss: 0.2495 199/500 [==========>...................] - ETA: 1:15 - loss: 1.5260 - regression_loss: 1.2760 - classification_loss: 0.2500 200/500 [===========>..................] - ETA: 1:15 - loss: 1.5253 - regression_loss: 1.2755 - classification_loss: 0.2498 201/500 [===========>..................] - ETA: 1:14 - loss: 1.5230 - regression_loss: 1.2735 - classification_loss: 0.2495 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5207 - regression_loss: 1.2719 - classification_loss: 0.2488 203/500 [===========>..................] - ETA: 1:14 - loss: 1.5197 - regression_loss: 1.2704 - classification_loss: 0.2493 204/500 [===========>..................] - ETA: 1:14 - loss: 1.5202 - regression_loss: 1.2708 - classification_loss: 0.2493 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5216 - regression_loss: 1.2724 - classification_loss: 0.2492 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.5242 - regression_loss: 1.2746 - classification_loss: 0.2495 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5233 - regression_loss: 1.2738 - classification_loss: 0.2495 208/500 [===========>..................] - ETA: 1:13 - loss: 1.5227 - regression_loss: 1.2731 - classification_loss: 0.2497 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5245 - regression_loss: 1.2746 - classification_loss: 0.2499 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5262 - regression_loss: 1.2758 - classification_loss: 0.2503 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5268 - regression_loss: 1.2764 - classification_loss: 0.2504 212/500 [===========>..................] - ETA: 1:12 - loss: 1.5274 - regression_loss: 1.2770 - classification_loss: 0.2504 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5276 - regression_loss: 1.2767 - classification_loss: 0.2509 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5267 - regression_loss: 1.2761 - classification_loss: 0.2507 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5275 - regression_loss: 1.2766 - classification_loss: 0.2509 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5281 - regression_loss: 1.2773 - classification_loss: 0.2507 217/500 [============>.................] - ETA: 1:10 - loss: 1.5294 - regression_loss: 1.2782 - classification_loss: 0.2511 218/500 [============>.................] - ETA: 1:10 - loss: 1.5284 - regression_loss: 1.2776 - classification_loss: 0.2508 219/500 [============>.................] - ETA: 1:10 - loss: 1.5265 - regression_loss: 1.2762 - classification_loss: 0.2503 220/500 [============>.................] - ETA: 1:10 - loss: 1.5271 - regression_loss: 1.2768 - classification_loss: 0.2503 221/500 [============>.................] - ETA: 1:09 - loss: 1.5253 - regression_loss: 1.2751 - classification_loss: 0.2502 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.5240 - regression_loss: 1.2739 - classification_loss: 0.2501 223/500 [============>.................] - ETA: 1:09 - loss: 1.5250 - regression_loss: 1.2749 - classification_loss: 0.2501 224/500 [============>.................] - ETA: 1:09 - loss: 1.5264 - regression_loss: 1.2759 - classification_loss: 0.2505 225/500 [============>.................] - ETA: 1:08 - loss: 1.5307 - regression_loss: 1.2702 - classification_loss: 0.2605 226/500 [============>.................] - ETA: 1:08 - loss: 1.5309 - regression_loss: 1.2705 - classification_loss: 0.2604 227/500 [============>.................] - ETA: 1:08 - loss: 1.5316 - regression_loss: 1.2713 - classification_loss: 0.2603 228/500 [============>.................] - ETA: 1:08 - loss: 1.5286 - regression_loss: 1.2690 - classification_loss: 0.2596 229/500 [============>.................] - ETA: 1:07 - loss: 1.5321 - regression_loss: 1.2716 - classification_loss: 0.2605 230/500 [============>.................] - ETA: 1:07 - loss: 1.5326 - regression_loss: 1.2717 - classification_loss: 0.2608 231/500 [============>.................] - ETA: 1:07 - loss: 1.5289 - regression_loss: 1.2688 - classification_loss: 0.2601 232/500 [============>.................] - ETA: 1:07 - loss: 1.5259 - regression_loss: 1.2667 - classification_loss: 0.2593 233/500 [============>.................] - ETA: 1:06 - loss: 1.5269 - regression_loss: 1.2676 - classification_loss: 0.2593 234/500 [=============>................] - ETA: 1:06 - loss: 1.5265 - regression_loss: 1.2675 - classification_loss: 0.2590 235/500 [=============>................] - ETA: 1:06 - loss: 1.5273 - regression_loss: 1.2679 - classification_loss: 0.2594 236/500 [=============>................] - ETA: 1:06 - loss: 1.5285 - regression_loss: 1.2691 - classification_loss: 0.2594 237/500 [=============>................] - ETA: 1:05 - loss: 1.5295 - regression_loss: 1.2698 - classification_loss: 0.2597 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.5277 - regression_loss: 1.2678 - classification_loss: 0.2599 239/500 [=============>................] - ETA: 1:05 - loss: 1.5300 - regression_loss: 1.2696 - classification_loss: 0.2604 240/500 [=============>................] - ETA: 1:05 - loss: 1.5265 - regression_loss: 1.2670 - classification_loss: 0.2595 241/500 [=============>................] - ETA: 1:04 - loss: 1.5275 - regression_loss: 1.2679 - classification_loss: 0.2596 242/500 [=============>................] - ETA: 1:04 - loss: 1.5275 - regression_loss: 1.2679 - classification_loss: 0.2596 243/500 [=============>................] - ETA: 1:04 - loss: 1.5275 - regression_loss: 1.2681 - classification_loss: 0.2595 244/500 [=============>................] - ETA: 1:04 - loss: 1.5284 - regression_loss: 1.2686 - classification_loss: 0.2599 245/500 [=============>................] - ETA: 1:03 - loss: 1.5252 - regression_loss: 1.2659 - classification_loss: 0.2593 246/500 [=============>................] - ETA: 1:03 - loss: 1.5251 - regression_loss: 1.2658 - classification_loss: 0.2593 247/500 [=============>................] - ETA: 1:03 - loss: 1.5260 - regression_loss: 1.2668 - classification_loss: 0.2592 248/500 [=============>................] - ETA: 1:03 - loss: 1.5277 - regression_loss: 1.2681 - classification_loss: 0.2595 249/500 [=============>................] - ETA: 1:02 - loss: 1.5261 - regression_loss: 1.2671 - classification_loss: 0.2590 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5249 - regression_loss: 1.2663 - classification_loss: 0.2586 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5228 - regression_loss: 1.2640 - classification_loss: 0.2588 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5237 - regression_loss: 1.2653 - classification_loss: 0.2584 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5308 - regression_loss: 1.2710 - classification_loss: 0.2598 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.5281 - regression_loss: 1.2688 - classification_loss: 0.2593 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5282 - regression_loss: 1.2688 - classification_loss: 0.2594 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5290 - regression_loss: 1.2696 - classification_loss: 0.2594 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5288 - regression_loss: 1.2694 - classification_loss: 0.2593 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5291 - regression_loss: 1.2698 - classification_loss: 0.2592 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5279 - regression_loss: 1.2690 - classification_loss: 0.2589 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5273 - regression_loss: 1.2683 - classification_loss: 0.2590 261/500 [==============>...............] - ETA: 59s - loss: 1.5297 - regression_loss: 1.2703 - classification_loss: 0.2594  262/500 [==============>...............] - ETA: 59s - loss: 1.5293 - regression_loss: 1.2702 - classification_loss: 0.2592 263/500 [==============>...............] - ETA: 59s - loss: 1.5264 - regression_loss: 1.2679 - classification_loss: 0.2585 264/500 [==============>...............] - ETA: 59s - loss: 1.5289 - regression_loss: 1.2698 - classification_loss: 0.2591 265/500 [==============>...............] - ETA: 59s - loss: 1.5278 - regression_loss: 1.2693 - classification_loss: 0.2585 266/500 [==============>...............] - ETA: 58s - loss: 1.5286 - regression_loss: 1.2699 - classification_loss: 0.2587 267/500 [===============>..............] - ETA: 58s - loss: 1.5266 - regression_loss: 1.2683 - classification_loss: 0.2583 268/500 [===============>..............] - ETA: 58s - loss: 1.5228 - regression_loss: 1.2653 - classification_loss: 0.2575 269/500 [===============>..............] - ETA: 58s - loss: 1.5230 - regression_loss: 1.2658 - classification_loss: 0.2572 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5235 - regression_loss: 1.2660 - classification_loss: 0.2575 271/500 [===============>..............] - ETA: 57s - loss: 1.5239 - regression_loss: 1.2665 - classification_loss: 0.2574 272/500 [===============>..............] - ETA: 57s - loss: 1.5243 - regression_loss: 1.2671 - classification_loss: 0.2572 273/500 [===============>..............] - ETA: 56s - loss: 1.5239 - regression_loss: 1.2667 - classification_loss: 0.2572 274/500 [===============>..............] - ETA: 56s - loss: 1.5251 - regression_loss: 1.2676 - classification_loss: 0.2575 275/500 [===============>..............] - ETA: 56s - loss: 1.5260 - regression_loss: 1.2685 - classification_loss: 0.2575 276/500 [===============>..............] - ETA: 56s - loss: 1.5277 - regression_loss: 1.2695 - classification_loss: 0.2582 277/500 [===============>..............] - ETA: 55s - loss: 1.5288 - regression_loss: 1.2708 - classification_loss: 0.2580 278/500 [===============>..............] - ETA: 55s - loss: 1.5293 - regression_loss: 1.2712 - classification_loss: 0.2581 279/500 [===============>..............] - ETA: 55s - loss: 1.5304 - regression_loss: 1.2724 - classification_loss: 0.2580 280/500 [===============>..............] - ETA: 55s - loss: 1.5301 - regression_loss: 1.2717 - classification_loss: 0.2583 281/500 [===============>..............] - ETA: 54s - loss: 1.5306 - regression_loss: 1.2719 - classification_loss: 0.2587 282/500 [===============>..............] - ETA: 54s - loss: 1.5302 - regression_loss: 1.2715 - classification_loss: 0.2587 283/500 [===============>..............] - ETA: 54s - loss: 1.5320 - regression_loss: 1.2730 - classification_loss: 0.2590 284/500 [================>.............] - ETA: 54s - loss: 1.5330 - regression_loss: 1.2740 - classification_loss: 0.2590 285/500 [================>.............] - ETA: 53s - loss: 1.5337 - regression_loss: 1.2748 - classification_loss: 0.2589 286/500 [================>.............] 
- ETA: 53s - loss: 1.5346 - regression_loss: 1.2755 - classification_loss: 0.2590 287/500 [================>.............] - ETA: 53s - loss: 1.5359 - regression_loss: 1.2764 - classification_loss: 0.2595 288/500 [================>.............] - ETA: 53s - loss: 1.5340 - regression_loss: 1.2750 - classification_loss: 0.2591 289/500 [================>.............] - ETA: 52s - loss: 1.5361 - regression_loss: 1.2767 - classification_loss: 0.2594 290/500 [================>.............] - ETA: 52s - loss: 1.5370 - regression_loss: 1.2776 - classification_loss: 0.2594 291/500 [================>.............] - ETA: 52s - loss: 1.5353 - regression_loss: 1.2762 - classification_loss: 0.2591 292/500 [================>.............] - ETA: 52s - loss: 1.5367 - regression_loss: 1.2772 - classification_loss: 0.2595 293/500 [================>.............] - ETA: 51s - loss: 1.5382 - regression_loss: 1.2782 - classification_loss: 0.2600 294/500 [================>.............] - ETA: 51s - loss: 1.5373 - regression_loss: 1.2777 - classification_loss: 0.2596 295/500 [================>.............] - ETA: 51s - loss: 1.5385 - regression_loss: 1.2788 - classification_loss: 0.2598 296/500 [================>.............] - ETA: 51s - loss: 1.5374 - regression_loss: 1.2780 - classification_loss: 0.2594 297/500 [================>.............] - ETA: 50s - loss: 1.5336 - regression_loss: 1.2749 - classification_loss: 0.2587 298/500 [================>.............] - ETA: 50s - loss: 1.5325 - regression_loss: 1.2739 - classification_loss: 0.2587 299/500 [================>.............] - ETA: 50s - loss: 1.5320 - regression_loss: 1.2732 - classification_loss: 0.2588 300/500 [=================>............] - ETA: 50s - loss: 1.5316 - regression_loss: 1.2729 - classification_loss: 0.2586 301/500 [=================>............] - ETA: 49s - loss: 1.5310 - regression_loss: 1.2726 - classification_loss: 0.2584 302/500 [=================>............] 
- ETA: 49s - loss: 1.5283 - regression_loss: 1.2703 - classification_loss: 0.2580 303/500 [=================>............] - ETA: 49s - loss: 1.5260 - regression_loss: 1.2686 - classification_loss: 0.2575 304/500 [=================>............] - ETA: 49s - loss: 1.5266 - regression_loss: 1.2691 - classification_loss: 0.2575 305/500 [=================>............] - ETA: 48s - loss: 1.5270 - regression_loss: 1.2695 - classification_loss: 0.2576 306/500 [=================>............] - ETA: 48s - loss: 1.5287 - regression_loss: 1.2709 - classification_loss: 0.2578 307/500 [=================>............] - ETA: 48s - loss: 1.5282 - regression_loss: 1.2706 - classification_loss: 0.2576 308/500 [=================>............] - ETA: 48s - loss: 1.5301 - regression_loss: 1.2725 - classification_loss: 0.2576 309/500 [=================>............] - ETA: 47s - loss: 1.5307 - regression_loss: 1.2732 - classification_loss: 0.2575 310/500 [=================>............] - ETA: 47s - loss: 1.5316 - regression_loss: 1.2739 - classification_loss: 0.2577 311/500 [=================>............] - ETA: 47s - loss: 1.5298 - regression_loss: 1.2726 - classification_loss: 0.2572 312/500 [=================>............] - ETA: 47s - loss: 1.5299 - regression_loss: 1.2727 - classification_loss: 0.2572 313/500 [=================>............] - ETA: 46s - loss: 1.5304 - regression_loss: 1.2733 - classification_loss: 0.2571 314/500 [=================>............] - ETA: 46s - loss: 1.5312 - regression_loss: 1.2738 - classification_loss: 0.2574 315/500 [=================>............] - ETA: 46s - loss: 1.5320 - regression_loss: 1.2746 - classification_loss: 0.2574 316/500 [=================>............] - ETA: 46s - loss: 1.5322 - regression_loss: 1.2749 - classification_loss: 0.2573 317/500 [==================>...........] - ETA: 45s - loss: 1.5332 - regression_loss: 1.2757 - classification_loss: 0.2575 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5342 - regression_loss: 1.2763 - classification_loss: 0.2579 319/500 [==================>...........] - ETA: 45s - loss: 1.5365 - regression_loss: 1.2780 - classification_loss: 0.2585 320/500 [==================>...........] - ETA: 45s - loss: 1.5355 - regression_loss: 1.2772 - classification_loss: 0.2583 321/500 [==================>...........] - ETA: 44s - loss: 1.5344 - regression_loss: 1.2765 - classification_loss: 0.2579 322/500 [==================>...........] - ETA: 44s - loss: 1.5324 - regression_loss: 1.2748 - classification_loss: 0.2576 323/500 [==================>...........] - ETA: 44s - loss: 1.5316 - regression_loss: 1.2743 - classification_loss: 0.2573 324/500 [==================>...........] - ETA: 44s - loss: 1.5312 - regression_loss: 1.2739 - classification_loss: 0.2573 325/500 [==================>...........] - ETA: 43s - loss: 1.5328 - regression_loss: 1.2750 - classification_loss: 0.2578 326/500 [==================>...........] - ETA: 43s - loss: 1.5312 - regression_loss: 1.2737 - classification_loss: 0.2575 327/500 [==================>...........] - ETA: 43s - loss: 1.5305 - regression_loss: 1.2733 - classification_loss: 0.2572 328/500 [==================>...........] - ETA: 43s - loss: 1.5294 - regression_loss: 1.2723 - classification_loss: 0.2571 329/500 [==================>...........] - ETA: 42s - loss: 1.5294 - regression_loss: 1.2725 - classification_loss: 0.2569 330/500 [==================>...........] - ETA: 42s - loss: 1.5269 - regression_loss: 1.2706 - classification_loss: 0.2564 331/500 [==================>...........] - ETA: 42s - loss: 1.5271 - regression_loss: 1.2709 - classification_loss: 0.2562 332/500 [==================>...........] - ETA: 42s - loss: 1.5274 - regression_loss: 1.2712 - classification_loss: 0.2562 333/500 [==================>...........] - ETA: 41s - loss: 1.5259 - regression_loss: 1.2701 - classification_loss: 0.2558 334/500 [===================>..........] 
500/500 [==============================] - 125s 251ms/step - loss: 1.5341 - regression_loss: 1.2769 - classification_loss: 0.2572
1172 instances of class plum with average precision: 0.6567
mAP: 0.6567
Epoch 00094: saving model to ./training/snapshots/resnet50_pascal_94.h5
Epoch 95/150
- ETA: 1:23 - loss: 1.4951 - regression_loss: 1.2430 - classification_loss: 0.2521 170/500 [=========>....................] - ETA: 1:22 - loss: 1.4975 - regression_loss: 1.2453 - classification_loss: 0.2522 171/500 [=========>....................] - ETA: 1:22 - loss: 1.4978 - regression_loss: 1.2459 - classification_loss: 0.2519 172/500 [=========>....................] - ETA: 1:22 - loss: 1.4973 - regression_loss: 1.2457 - classification_loss: 0.2515 173/500 [=========>....................] - ETA: 1:22 - loss: 1.4938 - regression_loss: 1.2434 - classification_loss: 0.2505 174/500 [=========>....................] - ETA: 1:21 - loss: 1.4941 - regression_loss: 1.2433 - classification_loss: 0.2508 175/500 [=========>....................] - ETA: 1:21 - loss: 1.4973 - regression_loss: 1.2459 - classification_loss: 0.2514 176/500 [=========>....................] - ETA: 1:21 - loss: 1.4984 - regression_loss: 1.2467 - classification_loss: 0.2517 177/500 [=========>....................] - ETA: 1:21 - loss: 1.4984 - regression_loss: 1.2467 - classification_loss: 0.2517 178/500 [=========>....................] - ETA: 1:20 - loss: 1.4972 - regression_loss: 1.2461 - classification_loss: 0.2511 179/500 [=========>....................] - ETA: 1:20 - loss: 1.5013 - regression_loss: 1.2486 - classification_loss: 0.2527 180/500 [=========>....................] - ETA: 1:20 - loss: 1.4981 - regression_loss: 1.2457 - classification_loss: 0.2524 181/500 [=========>....................] - ETA: 1:20 - loss: 1.4971 - regression_loss: 1.2450 - classification_loss: 0.2521 182/500 [=========>....................] - ETA: 1:19 - loss: 1.4969 - regression_loss: 1.2451 - classification_loss: 0.2518 183/500 [=========>....................] - ETA: 1:19 - loss: 1.4991 - regression_loss: 1.2471 - classification_loss: 0.2520 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4950 - regression_loss: 1.2438 - classification_loss: 0.2512 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.4971 - regression_loss: 1.2453 - classification_loss: 0.2518 186/500 [==========>...................] - ETA: 1:18 - loss: 1.4981 - regression_loss: 1.2463 - classification_loss: 0.2518 187/500 [==========>...................] - ETA: 1:18 - loss: 1.4991 - regression_loss: 1.2470 - classification_loss: 0.2521 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4955 - regression_loss: 1.2442 - classification_loss: 0.2513 189/500 [==========>...................] - ETA: 1:18 - loss: 1.4944 - regression_loss: 1.2432 - classification_loss: 0.2512 190/500 [==========>...................] - ETA: 1:17 - loss: 1.4984 - regression_loss: 1.2466 - classification_loss: 0.2518 191/500 [==========>...................] - ETA: 1:17 - loss: 1.5005 - regression_loss: 1.2480 - classification_loss: 0.2525 192/500 [==========>...................] - ETA: 1:17 - loss: 1.5001 - regression_loss: 1.2478 - classification_loss: 0.2523 193/500 [==========>...................] - ETA: 1:17 - loss: 1.5020 - regression_loss: 1.2495 - classification_loss: 0.2525 194/500 [==========>...................] - ETA: 1:16 - loss: 1.4983 - regression_loss: 1.2468 - classification_loss: 0.2515 195/500 [==========>...................] - ETA: 1:16 - loss: 1.4996 - regression_loss: 1.2480 - classification_loss: 0.2516 196/500 [==========>...................] - ETA: 1:16 - loss: 1.5025 - regression_loss: 1.2503 - classification_loss: 0.2522 197/500 [==========>...................] - ETA: 1:16 - loss: 1.5030 - regression_loss: 1.2511 - classification_loss: 0.2520 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5032 - regression_loss: 1.2513 - classification_loss: 0.2519 199/500 [==========>...................] - ETA: 1:15 - loss: 1.5052 - regression_loss: 1.2523 - classification_loss: 0.2528 200/500 [===========>..................] - ETA: 1:15 - loss: 1.5082 - regression_loss: 1.2546 - classification_loss: 0.2536 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.5041 - regression_loss: 1.2515 - classification_loss: 0.2526 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5115 - regression_loss: 1.2574 - classification_loss: 0.2541 203/500 [===========>..................] - ETA: 1:14 - loss: 1.5093 - regression_loss: 1.2558 - classification_loss: 0.2535 204/500 [===========>..................] - ETA: 1:14 - loss: 1.5114 - regression_loss: 1.2576 - classification_loss: 0.2538 205/500 [===========>..................] - ETA: 1:14 - loss: 1.5095 - regression_loss: 1.2563 - classification_loss: 0.2532 206/500 [===========>..................] - ETA: 1:13 - loss: 1.5093 - regression_loss: 1.2562 - classification_loss: 0.2531 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5066 - regression_loss: 1.2545 - classification_loss: 0.2522 208/500 [===========>..................] - ETA: 1:13 - loss: 1.5076 - regression_loss: 1.2551 - classification_loss: 0.2525 209/500 [===========>..................] - ETA: 1:13 - loss: 1.5045 - regression_loss: 1.2526 - classification_loss: 0.2519 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5050 - regression_loss: 1.2530 - classification_loss: 0.2520 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5020 - regression_loss: 1.2509 - classification_loss: 0.2512 212/500 [===========>..................] - ETA: 1:12 - loss: 1.5029 - regression_loss: 1.2516 - classification_loss: 0.2513 213/500 [===========>..................] - ETA: 1:12 - loss: 1.5029 - regression_loss: 1.2516 - classification_loss: 0.2513 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5041 - regression_loss: 1.2527 - classification_loss: 0.2514 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5053 - regression_loss: 1.2537 - classification_loss: 0.2516 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5067 - regression_loss: 1.2548 - classification_loss: 0.2518 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.5074 - regression_loss: 1.2558 - classification_loss: 0.2515 218/500 [============>.................] - ETA: 1:10 - loss: 1.5087 - regression_loss: 1.2570 - classification_loss: 0.2517 219/500 [============>.................] - ETA: 1:10 - loss: 1.5081 - regression_loss: 1.2559 - classification_loss: 0.2523 220/500 [============>.................] - ETA: 1:10 - loss: 1.5072 - regression_loss: 1.2549 - classification_loss: 0.2524 221/500 [============>.................] - ETA: 1:10 - loss: 1.5098 - regression_loss: 1.2573 - classification_loss: 0.2526 222/500 [============>.................] - ETA: 1:09 - loss: 1.5108 - regression_loss: 1.2584 - classification_loss: 0.2524 223/500 [============>.................] - ETA: 1:09 - loss: 1.5106 - regression_loss: 1.2582 - classification_loss: 0.2524 224/500 [============>.................] - ETA: 1:09 - loss: 1.5098 - regression_loss: 1.2575 - classification_loss: 0.2522 225/500 [============>.................] - ETA: 1:09 - loss: 1.5084 - regression_loss: 1.2565 - classification_loss: 0.2519 226/500 [============>.................] - ETA: 1:08 - loss: 1.5094 - regression_loss: 1.2573 - classification_loss: 0.2520 227/500 [============>.................] - ETA: 1:08 - loss: 1.5112 - regression_loss: 1.2585 - classification_loss: 0.2527 228/500 [============>.................] - ETA: 1:08 - loss: 1.5083 - regression_loss: 1.2560 - classification_loss: 0.2523 229/500 [============>.................] - ETA: 1:08 - loss: 1.5094 - regression_loss: 1.2569 - classification_loss: 0.2526 230/500 [============>.................] - ETA: 1:07 - loss: 1.5090 - regression_loss: 1.2567 - classification_loss: 0.2523 231/500 [============>.................] - ETA: 1:07 - loss: 1.5100 - regression_loss: 1.2578 - classification_loss: 0.2522 232/500 [============>.................] - ETA: 1:07 - loss: 1.5079 - regression_loss: 1.2560 - classification_loss: 0.2519 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.5110 - regression_loss: 1.2585 - classification_loss: 0.2525 234/500 [=============>................] - ETA: 1:06 - loss: 1.5111 - regression_loss: 1.2590 - classification_loss: 0.2520 235/500 [=============>................] - ETA: 1:06 - loss: 1.5105 - regression_loss: 1.2587 - classification_loss: 0.2519 236/500 [=============>................] - ETA: 1:06 - loss: 1.5134 - regression_loss: 1.2609 - classification_loss: 0.2525 237/500 [=============>................] - ETA: 1:06 - loss: 1.5113 - regression_loss: 1.2594 - classification_loss: 0.2519 238/500 [=============>................] - ETA: 1:05 - loss: 1.5126 - regression_loss: 1.2605 - classification_loss: 0.2521 239/500 [=============>................] - ETA: 1:05 - loss: 1.5115 - regression_loss: 1.2600 - classification_loss: 0.2515 240/500 [=============>................] - ETA: 1:05 - loss: 1.5119 - regression_loss: 1.2605 - classification_loss: 0.2514 241/500 [=============>................] - ETA: 1:05 - loss: 1.5112 - regression_loss: 1.2598 - classification_loss: 0.2514 242/500 [=============>................] - ETA: 1:04 - loss: 1.5113 - regression_loss: 1.2597 - classification_loss: 0.2516 243/500 [=============>................] - ETA: 1:04 - loss: 1.5130 - regression_loss: 1.2614 - classification_loss: 0.2516 244/500 [=============>................] - ETA: 1:04 - loss: 1.5106 - regression_loss: 1.2594 - classification_loss: 0.2512 245/500 [=============>................] - ETA: 1:04 - loss: 1.5113 - regression_loss: 1.2600 - classification_loss: 0.2513 246/500 [=============>................] - ETA: 1:03 - loss: 1.5105 - regression_loss: 1.2594 - classification_loss: 0.2511 247/500 [=============>................] - ETA: 1:03 - loss: 1.5088 - regression_loss: 1.2582 - classification_loss: 0.2506 248/500 [=============>................] - ETA: 1:03 - loss: 1.5084 - regression_loss: 1.2580 - classification_loss: 0.2503 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.5086 - regression_loss: 1.2583 - classification_loss: 0.2503 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5074 - regression_loss: 1.2575 - classification_loss: 0.2499 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5057 - regression_loss: 1.2555 - classification_loss: 0.2502 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5049 - regression_loss: 1.2550 - classification_loss: 0.2499 253/500 [==============>...............] - ETA: 1:02 - loss: 1.5074 - regression_loss: 1.2568 - classification_loss: 0.2506 254/500 [==============>...............] - ETA: 1:01 - loss: 1.5075 - regression_loss: 1.2567 - classification_loss: 0.2508 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5090 - regression_loss: 1.2580 - classification_loss: 0.2510 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5077 - regression_loss: 1.2570 - classification_loss: 0.2507 257/500 [==============>...............] - ETA: 1:01 - loss: 1.5076 - regression_loss: 1.2570 - classification_loss: 0.2506 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5079 - regression_loss: 1.2574 - classification_loss: 0.2504 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5101 - regression_loss: 1.2592 - classification_loss: 0.2508 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5105 - regression_loss: 1.2598 - classification_loss: 0.2507 261/500 [==============>...............] - ETA: 1:00 - loss: 1.5086 - regression_loss: 1.2581 - classification_loss: 0.2504 262/500 [==============>...............] - ETA: 59s - loss: 1.5101 - regression_loss: 1.2594 - classification_loss: 0.2507  263/500 [==============>...............] - ETA: 59s - loss: 1.5077 - regression_loss: 1.2576 - classification_loss: 0.2502 264/500 [==============>...............] - ETA: 59s - loss: 1.5052 - regression_loss: 1.2555 - classification_loss: 0.2498 265/500 [==============>...............] 
- ETA: 59s - loss: 1.5055 - regression_loss: 1.2557 - classification_loss: 0.2498 266/500 [==============>...............] - ETA: 58s - loss: 1.5055 - regression_loss: 1.2560 - classification_loss: 0.2495 267/500 [===============>..............] - ETA: 58s - loss: 1.5078 - regression_loss: 1.2579 - classification_loss: 0.2499 268/500 [===============>..............] - ETA: 58s - loss: 1.5052 - regression_loss: 1.2559 - classification_loss: 0.2493 269/500 [===============>..............] - ETA: 58s - loss: 1.5027 - regression_loss: 1.2539 - classification_loss: 0.2488 270/500 [===============>..............] - ETA: 57s - loss: 1.5018 - regression_loss: 1.2533 - classification_loss: 0.2484 271/500 [===============>..............] - ETA: 57s - loss: 1.4987 - regression_loss: 1.2509 - classification_loss: 0.2478 272/500 [===============>..............] - ETA: 57s - loss: 1.4985 - regression_loss: 1.2507 - classification_loss: 0.2478 273/500 [===============>..............] - ETA: 57s - loss: 1.4960 - regression_loss: 1.2484 - classification_loss: 0.2476 274/500 [===============>..............] - ETA: 56s - loss: 1.4953 - regression_loss: 1.2482 - classification_loss: 0.2472 275/500 [===============>..............] - ETA: 56s - loss: 1.4942 - regression_loss: 1.2474 - classification_loss: 0.2468 276/500 [===============>..............] - ETA: 56s - loss: 1.4950 - regression_loss: 1.2480 - classification_loss: 0.2471 277/500 [===============>..............] - ETA: 56s - loss: 1.4950 - regression_loss: 1.2479 - classification_loss: 0.2471 278/500 [===============>..............] - ETA: 55s - loss: 1.4957 - regression_loss: 1.2478 - classification_loss: 0.2479 279/500 [===============>..............] - ETA: 55s - loss: 1.4962 - regression_loss: 1.2483 - classification_loss: 0.2479 280/500 [===============>..............] - ETA: 55s - loss: 1.4954 - regression_loss: 1.2478 - classification_loss: 0.2476 281/500 [===============>..............] 
- ETA: 55s - loss: 1.4966 - regression_loss: 1.2490 - classification_loss: 0.2476 282/500 [===============>..............] - ETA: 54s - loss: 1.4942 - regression_loss: 1.2468 - classification_loss: 0.2474 283/500 [===============>..............] - ETA: 54s - loss: 1.4925 - regression_loss: 1.2458 - classification_loss: 0.2467 284/500 [================>.............] - ETA: 54s - loss: 1.4973 - regression_loss: 1.2497 - classification_loss: 0.2476 285/500 [================>.............] - ETA: 54s - loss: 1.4975 - regression_loss: 1.2500 - classification_loss: 0.2475 286/500 [================>.............] - ETA: 53s - loss: 1.4970 - regression_loss: 1.2497 - classification_loss: 0.2473 287/500 [================>.............] - ETA: 53s - loss: 1.4969 - regression_loss: 1.2497 - classification_loss: 0.2473 288/500 [================>.............] - ETA: 53s - loss: 1.4974 - regression_loss: 1.2503 - classification_loss: 0.2471 289/500 [================>.............] - ETA: 53s - loss: 1.4951 - regression_loss: 1.2485 - classification_loss: 0.2465 290/500 [================>.............] - ETA: 52s - loss: 1.4935 - regression_loss: 1.2475 - classification_loss: 0.2459 291/500 [================>.............] - ETA: 52s - loss: 1.4933 - regression_loss: 1.2473 - classification_loss: 0.2460 292/500 [================>.............] - ETA: 52s - loss: 1.4923 - regression_loss: 1.2468 - classification_loss: 0.2456 293/500 [================>.............] - ETA: 52s - loss: 1.4925 - regression_loss: 1.2468 - classification_loss: 0.2457 294/500 [================>.............] - ETA: 51s - loss: 1.4943 - regression_loss: 1.2485 - classification_loss: 0.2458 295/500 [================>.............] - ETA: 51s - loss: 1.4946 - regression_loss: 1.2489 - classification_loss: 0.2457 296/500 [================>.............] - ETA: 51s - loss: 1.4943 - regression_loss: 1.2488 - classification_loss: 0.2455 297/500 [================>.............] 
- ETA: 51s - loss: 1.4949 - regression_loss: 1.2492 - classification_loss: 0.2457 298/500 [================>.............] - ETA: 50s - loss: 1.4966 - regression_loss: 1.2506 - classification_loss: 0.2460 299/500 [================>.............] - ETA: 50s - loss: 1.4984 - regression_loss: 1.2521 - classification_loss: 0.2463 300/500 [=================>............] - ETA: 50s - loss: 1.5007 - regression_loss: 1.2540 - classification_loss: 0.2467 301/500 [=================>............] - ETA: 50s - loss: 1.5017 - regression_loss: 1.2547 - classification_loss: 0.2470 302/500 [=================>............] - ETA: 49s - loss: 1.5006 - regression_loss: 1.2538 - classification_loss: 0.2468 303/500 [=================>............] - ETA: 49s - loss: 1.5017 - regression_loss: 1.2546 - classification_loss: 0.2471 304/500 [=================>............] - ETA: 49s - loss: 1.5017 - regression_loss: 1.2547 - classification_loss: 0.2471 305/500 [=================>............] - ETA: 49s - loss: 1.5025 - regression_loss: 1.2554 - classification_loss: 0.2471 306/500 [=================>............] - ETA: 48s - loss: 1.5034 - regression_loss: 1.2562 - classification_loss: 0.2472 307/500 [=================>............] - ETA: 48s - loss: 1.5053 - regression_loss: 1.2579 - classification_loss: 0.2474 308/500 [=================>............] - ETA: 48s - loss: 1.5066 - regression_loss: 1.2592 - classification_loss: 0.2474 309/500 [=================>............] - ETA: 48s - loss: 1.5078 - regression_loss: 1.2603 - classification_loss: 0.2475 310/500 [=================>............] - ETA: 47s - loss: 1.5104 - regression_loss: 1.2625 - classification_loss: 0.2479 311/500 [=================>............] - ETA: 47s - loss: 1.5107 - regression_loss: 1.2623 - classification_loss: 0.2484 312/500 [=================>............] - ETA: 47s - loss: 1.5115 - regression_loss: 1.2630 - classification_loss: 0.2485 313/500 [=================>............] 
- ETA: 47s - loss: 1.5119 - regression_loss: 1.2635 - classification_loss: 0.2485 314/500 [=================>............] - ETA: 46s - loss: 1.5126 - regression_loss: 1.2642 - classification_loss: 0.2484 315/500 [=================>............] - ETA: 46s - loss: 1.5136 - regression_loss: 1.2652 - classification_loss: 0.2485 316/500 [=================>............] - ETA: 46s - loss: 1.5137 - regression_loss: 1.2652 - classification_loss: 0.2485 317/500 [==================>...........] - ETA: 46s - loss: 1.5124 - regression_loss: 1.2643 - classification_loss: 0.2482 318/500 [==================>...........] - ETA: 45s - loss: 1.5131 - regression_loss: 1.2649 - classification_loss: 0.2482 319/500 [==================>...........] - ETA: 45s - loss: 1.5132 - regression_loss: 1.2650 - classification_loss: 0.2482 320/500 [==================>...........] - ETA: 45s - loss: 1.5150 - regression_loss: 1.2667 - classification_loss: 0.2484 321/500 [==================>...........] - ETA: 44s - loss: 1.5157 - regression_loss: 1.2673 - classification_loss: 0.2484 322/500 [==================>...........] - ETA: 44s - loss: 1.5168 - regression_loss: 1.2679 - classification_loss: 0.2489 323/500 [==================>...........] - ETA: 44s - loss: 1.5182 - regression_loss: 1.2691 - classification_loss: 0.2490 324/500 [==================>...........] - ETA: 44s - loss: 1.5198 - regression_loss: 1.2705 - classification_loss: 0.2493 325/500 [==================>...........] - ETA: 43s - loss: 1.5202 - regression_loss: 1.2709 - classification_loss: 0.2494 326/500 [==================>...........] - ETA: 43s - loss: 1.5198 - regression_loss: 1.2705 - classification_loss: 0.2493 327/500 [==================>...........] - ETA: 43s - loss: 1.5201 - regression_loss: 1.2708 - classification_loss: 0.2493 328/500 [==================>...........] - ETA: 43s - loss: 1.5193 - regression_loss: 1.2702 - classification_loss: 0.2491 329/500 [==================>...........] 
- ETA: 42s - loss: 1.5199 - regression_loss: 1.2707 - classification_loss: 0.2492 330/500 [==================>...........] - ETA: 42s - loss: 1.5204 - regression_loss: 1.2711 - classification_loss: 0.2494 331/500 [==================>...........] - ETA: 42s - loss: 1.5207 - regression_loss: 1.2711 - classification_loss: 0.2496 332/500 [==================>...........] - ETA: 42s - loss: 1.5209 - regression_loss: 1.2714 - classification_loss: 0.2495 333/500 [==================>...........] - ETA: 41s - loss: 1.5220 - regression_loss: 1.2723 - classification_loss: 0.2497 334/500 [===================>..........] - ETA: 41s - loss: 1.5235 - regression_loss: 1.2736 - classification_loss: 0.2499 335/500 [===================>..........] - ETA: 41s - loss: 1.5244 - regression_loss: 1.2743 - classification_loss: 0.2501 336/500 [===================>..........] - ETA: 41s - loss: 1.5224 - regression_loss: 1.2728 - classification_loss: 0.2496 337/500 [===================>..........] - ETA: 40s - loss: 1.5222 - regression_loss: 1.2728 - classification_loss: 0.2494 338/500 [===================>..........] - ETA: 40s - loss: 1.5214 - regression_loss: 1.2721 - classification_loss: 0.2493 339/500 [===================>..........] - ETA: 40s - loss: 1.5231 - regression_loss: 1.2734 - classification_loss: 0.2496 340/500 [===================>..........] - ETA: 40s - loss: 1.5249 - regression_loss: 1.2750 - classification_loss: 0.2499 341/500 [===================>..........] - ETA: 39s - loss: 1.5254 - regression_loss: 1.2756 - classification_loss: 0.2498 342/500 [===================>..........] - ETA: 39s - loss: 1.5248 - regression_loss: 1.2751 - classification_loss: 0.2497 343/500 [===================>..........] - ETA: 39s - loss: 1.5260 - regression_loss: 1.2760 - classification_loss: 0.2500 344/500 [===================>..........] - ETA: 39s - loss: 1.5261 - regression_loss: 1.2762 - classification_loss: 0.2499 345/500 [===================>..........] 
- ETA: 38s - loss: 1.5234 - regression_loss: 1.2740 - classification_loss: 0.2493 346/500 [===================>..........] - ETA: 38s - loss: 1.5215 - regression_loss: 1.2726 - classification_loss: 0.2489 347/500 [===================>..........] - ETA: 38s - loss: 1.5193 - regression_loss: 1.2710 - classification_loss: 0.2483 348/500 [===================>..........] - ETA: 38s - loss: 1.5188 - regression_loss: 1.2707 - classification_loss: 0.2481 349/500 [===================>..........] - ETA: 37s - loss: 1.5190 - regression_loss: 1.2707 - classification_loss: 0.2483 350/500 [====================>.........] - ETA: 37s - loss: 1.5184 - regression_loss: 1.2701 - classification_loss: 0.2483 351/500 [====================>.........] - ETA: 37s - loss: 1.5178 - regression_loss: 1.2696 - classification_loss: 0.2483 352/500 [====================>.........] - ETA: 37s - loss: 1.5185 - regression_loss: 1.2701 - classification_loss: 0.2484 353/500 [====================>.........] - ETA: 36s - loss: 1.5184 - regression_loss: 1.2701 - classification_loss: 0.2483 354/500 [====================>.........] - ETA: 36s - loss: 1.5195 - regression_loss: 1.2715 - classification_loss: 0.2480 355/500 [====================>.........] - ETA: 36s - loss: 1.5193 - regression_loss: 1.2713 - classification_loss: 0.2481 356/500 [====================>.........] - ETA: 36s - loss: 1.5212 - regression_loss: 1.2729 - classification_loss: 0.2484 357/500 [====================>.........] - ETA: 35s - loss: 1.5212 - regression_loss: 1.2729 - classification_loss: 0.2484 358/500 [====================>.........] - ETA: 35s - loss: 1.5225 - regression_loss: 1.2740 - classification_loss: 0.2486 359/500 [====================>.........] - ETA: 35s - loss: 1.5216 - regression_loss: 1.2734 - classification_loss: 0.2482 360/500 [====================>.........] - ETA: 35s - loss: 1.5213 - regression_loss: 1.2732 - classification_loss: 0.2481 361/500 [====================>.........] 
- ETA: 34s - loss: 1.5206 - regression_loss: 1.2724 - classification_loss: 0.2482 362/500 [====================>.........] - ETA: 34s - loss: 1.5217 - regression_loss: 1.2733 - classification_loss: 0.2483 363/500 [====================>.........] - ETA: 34s - loss: 1.5214 - regression_loss: 1.2731 - classification_loss: 0.2483 364/500 [====================>.........] - ETA: 34s - loss: 1.5220 - regression_loss: 1.2737 - classification_loss: 0.2482 365/500 [====================>.........] - ETA: 33s - loss: 1.5202 - regression_loss: 1.2721 - classification_loss: 0.2481 366/500 [====================>.........] - ETA: 33s - loss: 1.5210 - regression_loss: 1.2725 - classification_loss: 0.2486 367/500 [=====================>........] - ETA: 33s - loss: 1.5196 - regression_loss: 1.2715 - classification_loss: 0.2482 368/500 [=====================>........] - ETA: 33s - loss: 1.5204 - regression_loss: 1.2720 - classification_loss: 0.2484 369/500 [=====================>........] - ETA: 32s - loss: 1.5208 - regression_loss: 1.2722 - classification_loss: 0.2486 370/500 [=====================>........] - ETA: 32s - loss: 1.5184 - regression_loss: 1.2701 - classification_loss: 0.2483 371/500 [=====================>........] - ETA: 32s - loss: 1.5177 - regression_loss: 1.2697 - classification_loss: 0.2480 372/500 [=====================>........] - ETA: 32s - loss: 1.5174 - regression_loss: 1.2695 - classification_loss: 0.2479 373/500 [=====================>........] - ETA: 31s - loss: 1.5185 - regression_loss: 1.2705 - classification_loss: 0.2480 374/500 [=====================>........] - ETA: 31s - loss: 1.5188 - regression_loss: 1.2709 - classification_loss: 0.2480 375/500 [=====================>........] - ETA: 31s - loss: 1.5183 - regression_loss: 1.2703 - classification_loss: 0.2480 376/500 [=====================>........] - ETA: 31s - loss: 1.5181 - regression_loss: 1.2701 - classification_loss: 0.2480 377/500 [=====================>........] 
[per-batch progress-bar updates for epoch 95 collapsed to the final step]
500/500 [==============================] - 125s 251ms/step - loss: 1.5183 - regression_loss: 1.2725 - classification_loss: 0.2459
1172 instances of class plum with average precision: 0.6633
mAP: 0.6633
Epoch 00095: saving model to ./training/snapshots/resnet50_pascal_95.h5
Epoch 96/150
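The epoch summary above prints a combined loss next to its two components; in keras-retinanet the reported total is the sum of the box-regression term (smooth L1) and the classification term (focal loss). A minimal sanity check of the printed epoch-95 figures (values copied from the log; the last-digit discrepancy is display rounding):

```python
# Final epoch-95 values as printed by the Keras progress bar (from the log above).
regression_loss = 1.2725      # box-regression (smooth L1) term
classification_loss = 0.2459  # classification (focal loss) term
reported_total = 1.5183       # printed total loss

# The total is the sum of the two components; each figure is rounded to
# 4 decimals independently, so allow a tolerance of one display unit.
computed_total = regression_loss + classification_loss
assert abs(computed_total - reported_total) < 2e-4
print(f"computed total loss: {computed_total:.4f}")
```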
[per-batch progress-bar updates for epoch 96 collapsed; latest step in this excerpt]
211/500 [===========>..................] - ETA: 1:12 - loss: 1.4997 - regression_loss: 1.2578 - classification_loss: 0.2419
- ETA: 1:12 - loss: 1.5000 - regression_loss: 1.2581 - classification_loss: 0.2418 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5006 - regression_loss: 1.2586 - classification_loss: 0.2420 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5027 - regression_loss: 1.2604 - classification_loss: 0.2423 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5017 - regression_loss: 1.2596 - classification_loss: 0.2422 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5050 - regression_loss: 1.2624 - classification_loss: 0.2427 217/500 [============>.................] - ETA: 1:10 - loss: 1.5054 - regression_loss: 1.2629 - classification_loss: 0.2425 218/500 [============>.................] - ETA: 1:10 - loss: 1.5058 - regression_loss: 1.2634 - classification_loss: 0.2424 219/500 [============>.................] - ETA: 1:10 - loss: 1.5049 - regression_loss: 1.2629 - classification_loss: 0.2421 220/500 [============>.................] - ETA: 1:10 - loss: 1.5051 - regression_loss: 1.2629 - classification_loss: 0.2422 221/500 [============>.................] - ETA: 1:09 - loss: 1.5050 - regression_loss: 1.2627 - classification_loss: 0.2423 222/500 [============>.................] - ETA: 1:09 - loss: 1.5055 - regression_loss: 1.2630 - classification_loss: 0.2426 223/500 [============>.................] - ETA: 1:09 - loss: 1.5063 - regression_loss: 1.2632 - classification_loss: 0.2431 224/500 [============>.................] - ETA: 1:09 - loss: 1.5094 - regression_loss: 1.2649 - classification_loss: 0.2446 225/500 [============>.................] - ETA: 1:08 - loss: 1.5078 - regression_loss: 1.2636 - classification_loss: 0.2442 226/500 [============>.................] - ETA: 1:08 - loss: 1.5085 - regression_loss: 1.2642 - classification_loss: 0.2443 227/500 [============>.................] - ETA: 1:08 - loss: 1.5045 - regression_loss: 1.2610 - classification_loss: 0.2436 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.5040 - regression_loss: 1.2607 - classification_loss: 0.2432 229/500 [============>.................] - ETA: 1:07 - loss: 1.5047 - regression_loss: 1.2612 - classification_loss: 0.2435 230/500 [============>.................] - ETA: 1:07 - loss: 1.5054 - regression_loss: 1.2619 - classification_loss: 0.2435 231/500 [============>.................] - ETA: 1:07 - loss: 1.5044 - regression_loss: 1.2610 - classification_loss: 0.2434 232/500 [============>.................] - ETA: 1:07 - loss: 1.5059 - regression_loss: 1.2623 - classification_loss: 0.2436 233/500 [============>.................] - ETA: 1:06 - loss: 1.5078 - regression_loss: 1.2639 - classification_loss: 0.2439 234/500 [=============>................] - ETA: 1:06 - loss: 1.5039 - regression_loss: 1.2607 - classification_loss: 0.2432 235/500 [=============>................] - ETA: 1:06 - loss: 1.5043 - regression_loss: 1.2613 - classification_loss: 0.2430 236/500 [=============>................] - ETA: 1:06 - loss: 1.5059 - regression_loss: 1.2628 - classification_loss: 0.2431 237/500 [=============>................] - ETA: 1:05 - loss: 1.5073 - regression_loss: 1.2640 - classification_loss: 0.2432 238/500 [=============>................] - ETA: 1:05 - loss: 1.5089 - regression_loss: 1.2656 - classification_loss: 0.2433 239/500 [=============>................] - ETA: 1:05 - loss: 1.5104 - regression_loss: 1.2670 - classification_loss: 0.2434 240/500 [=============>................] - ETA: 1:05 - loss: 1.5071 - regression_loss: 1.2643 - classification_loss: 0.2429 241/500 [=============>................] - ETA: 1:04 - loss: 1.5081 - regression_loss: 1.2650 - classification_loss: 0.2431 242/500 [=============>................] - ETA: 1:04 - loss: 1.5079 - regression_loss: 1.2648 - classification_loss: 0.2431 243/500 [=============>................] - ETA: 1:04 - loss: 1.5066 - regression_loss: 1.2635 - classification_loss: 0.2432 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.5095 - regression_loss: 1.2665 - classification_loss: 0.2431 245/500 [=============>................] - ETA: 1:03 - loss: 1.5084 - regression_loss: 1.2656 - classification_loss: 0.2428 246/500 [=============>................] - ETA: 1:03 - loss: 1.5071 - regression_loss: 1.2643 - classification_loss: 0.2428 247/500 [=============>................] - ETA: 1:03 - loss: 1.5062 - regression_loss: 1.2632 - classification_loss: 0.2430 248/500 [=============>................] - ETA: 1:03 - loss: 1.5061 - regression_loss: 1.2632 - classification_loss: 0.2429 249/500 [=============>................] - ETA: 1:02 - loss: 1.5073 - regression_loss: 1.2641 - classification_loss: 0.2432 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5099 - regression_loss: 1.2659 - classification_loss: 0.2441 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5111 - regression_loss: 1.2666 - classification_loss: 0.2445 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5122 - regression_loss: 1.2675 - classification_loss: 0.2447 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5133 - regression_loss: 1.2684 - classification_loss: 0.2448 254/500 [==============>...............] - ETA: 1:01 - loss: 1.5118 - regression_loss: 1.2673 - classification_loss: 0.2444 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5129 - regression_loss: 1.2680 - classification_loss: 0.2449 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5096 - regression_loss: 1.2653 - classification_loss: 0.2442 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5111 - regression_loss: 1.2666 - classification_loss: 0.2445 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5124 - regression_loss: 1.2676 - classification_loss: 0.2449 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5143 - regression_loss: 1.2690 - classification_loss: 0.2454 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.5133 - regression_loss: 1.2683 - classification_loss: 0.2450 261/500 [==============>...............] - ETA: 59s - loss: 1.5124 - regression_loss: 1.2676 - classification_loss: 0.2448  262/500 [==============>...............] - ETA: 59s - loss: 1.5123 - regression_loss: 1.2677 - classification_loss: 0.2446 263/500 [==============>...............] - ETA: 59s - loss: 1.5137 - regression_loss: 1.2688 - classification_loss: 0.2449 264/500 [==============>...............] - ETA: 59s - loss: 1.5141 - regression_loss: 1.2688 - classification_loss: 0.2453 265/500 [==============>...............] - ETA: 58s - loss: 1.5157 - regression_loss: 1.2696 - classification_loss: 0.2461 266/500 [==============>...............] - ETA: 58s - loss: 1.5155 - regression_loss: 1.2694 - classification_loss: 0.2461 267/500 [===============>..............] - ETA: 58s - loss: 1.5148 - regression_loss: 1.2687 - classification_loss: 0.2462 268/500 [===============>..............] - ETA: 58s - loss: 1.5144 - regression_loss: 1.2682 - classification_loss: 0.2462 269/500 [===============>..............] - ETA: 57s - loss: 1.5122 - regression_loss: 1.2660 - classification_loss: 0.2462 270/500 [===============>..............] - ETA: 57s - loss: 1.5134 - regression_loss: 1.2668 - classification_loss: 0.2465 271/500 [===============>..............] - ETA: 57s - loss: 1.5143 - regression_loss: 1.2676 - classification_loss: 0.2467 272/500 [===============>..............] - ETA: 57s - loss: 1.5154 - regression_loss: 1.2685 - classification_loss: 0.2469 273/500 [===============>..............] - ETA: 56s - loss: 1.5137 - regression_loss: 1.2670 - classification_loss: 0.2467 274/500 [===============>..............] - ETA: 56s - loss: 1.5148 - regression_loss: 1.2675 - classification_loss: 0.2472 275/500 [===============>..............] - ETA: 56s - loss: 1.5132 - regression_loss: 1.2662 - classification_loss: 0.2471 276/500 [===============>..............] 
- ETA: 56s - loss: 1.5125 - regression_loss: 1.2655 - classification_loss: 0.2470 277/500 [===============>..............] - ETA: 55s - loss: 1.5130 - regression_loss: 1.2658 - classification_loss: 0.2472 278/500 [===============>..............] - ETA: 55s - loss: 1.5121 - regression_loss: 1.2650 - classification_loss: 0.2471 279/500 [===============>..............] - ETA: 55s - loss: 1.5110 - regression_loss: 1.2644 - classification_loss: 0.2465 280/500 [===============>..............] - ETA: 55s - loss: 1.5128 - regression_loss: 1.2659 - classification_loss: 0.2469 281/500 [===============>..............] - ETA: 54s - loss: 1.5127 - regression_loss: 1.2658 - classification_loss: 0.2469 282/500 [===============>..............] - ETA: 54s - loss: 1.5130 - regression_loss: 1.2661 - classification_loss: 0.2470 283/500 [===============>..............] - ETA: 54s - loss: 1.5146 - regression_loss: 1.2672 - classification_loss: 0.2474 284/500 [================>.............] - ETA: 54s - loss: 1.5145 - regression_loss: 1.2673 - classification_loss: 0.2472 285/500 [================>.............] - ETA: 53s - loss: 1.5144 - regression_loss: 1.2674 - classification_loss: 0.2470 286/500 [================>.............] - ETA: 53s - loss: 1.5143 - regression_loss: 1.2673 - classification_loss: 0.2470 287/500 [================>.............] - ETA: 53s - loss: 1.5118 - regression_loss: 1.2652 - classification_loss: 0.2466 288/500 [================>.............] - ETA: 53s - loss: 1.5121 - regression_loss: 1.2655 - classification_loss: 0.2465 289/500 [================>.............] - ETA: 52s - loss: 1.5103 - regression_loss: 1.2642 - classification_loss: 0.2461 290/500 [================>.............] - ETA: 52s - loss: 1.5086 - regression_loss: 1.2627 - classification_loss: 0.2459 291/500 [================>.............] - ETA: 52s - loss: 1.5112 - regression_loss: 1.2650 - classification_loss: 0.2462 292/500 [================>.............] 
- ETA: 52s - loss: 1.5086 - regression_loss: 1.2629 - classification_loss: 0.2457 293/500 [================>.............] - ETA: 51s - loss: 1.5062 - regression_loss: 1.2608 - classification_loss: 0.2453 294/500 [================>.............] - ETA: 51s - loss: 1.5058 - regression_loss: 1.2607 - classification_loss: 0.2451 295/500 [================>.............] - ETA: 51s - loss: 1.5032 - regression_loss: 1.2582 - classification_loss: 0.2450 296/500 [================>.............] - ETA: 51s - loss: 1.5048 - regression_loss: 1.2592 - classification_loss: 0.2456 297/500 [================>.............] - ETA: 50s - loss: 1.5042 - regression_loss: 1.2588 - classification_loss: 0.2455 298/500 [================>.............] - ETA: 50s - loss: 1.5059 - regression_loss: 1.2599 - classification_loss: 0.2460 299/500 [================>.............] - ETA: 50s - loss: 1.5075 - regression_loss: 1.2612 - classification_loss: 0.2463 300/500 [=================>............] - ETA: 50s - loss: 1.5080 - regression_loss: 1.2615 - classification_loss: 0.2464 301/500 [=================>............] - ETA: 49s - loss: 1.5081 - regression_loss: 1.2618 - classification_loss: 0.2463 302/500 [=================>............] - ETA: 49s - loss: 1.5047 - regression_loss: 1.2591 - classification_loss: 0.2456 303/500 [=================>............] - ETA: 49s - loss: 1.5046 - regression_loss: 1.2591 - classification_loss: 0.2455 304/500 [=================>............] - ETA: 49s - loss: 1.5042 - regression_loss: 1.2590 - classification_loss: 0.2452 305/500 [=================>............] - ETA: 48s - loss: 1.5037 - regression_loss: 1.2586 - classification_loss: 0.2451 306/500 [=================>............] - ETA: 48s - loss: 1.5048 - regression_loss: 1.2595 - classification_loss: 0.2453 307/500 [=================>............] - ETA: 48s - loss: 1.5056 - regression_loss: 1.2602 - classification_loss: 0.2454 308/500 [=================>............] 
- ETA: 48s - loss: 1.5052 - regression_loss: 1.2598 - classification_loss: 0.2453 309/500 [=================>............] - ETA: 47s - loss: 1.5043 - regression_loss: 1.2592 - classification_loss: 0.2450 310/500 [=================>............] - ETA: 47s - loss: 1.5052 - regression_loss: 1.2598 - classification_loss: 0.2453 311/500 [=================>............] - ETA: 47s - loss: 1.5035 - regression_loss: 1.2585 - classification_loss: 0.2450 312/500 [=================>............] - ETA: 47s - loss: 1.5055 - regression_loss: 1.2596 - classification_loss: 0.2459 313/500 [=================>............] - ETA: 46s - loss: 1.5056 - regression_loss: 1.2597 - classification_loss: 0.2459 314/500 [=================>............] - ETA: 46s - loss: 1.5049 - regression_loss: 1.2593 - classification_loss: 0.2457 315/500 [=================>............] - ETA: 46s - loss: 1.5021 - regression_loss: 1.2571 - classification_loss: 0.2451 316/500 [=================>............] - ETA: 46s - loss: 1.5019 - regression_loss: 1.2569 - classification_loss: 0.2450 317/500 [==================>...........] - ETA: 45s - loss: 1.5021 - regression_loss: 1.2571 - classification_loss: 0.2450 318/500 [==================>...........] - ETA: 45s - loss: 1.4999 - regression_loss: 1.2554 - classification_loss: 0.2445 319/500 [==================>...........] - ETA: 45s - loss: 1.5013 - regression_loss: 1.2564 - classification_loss: 0.2449 320/500 [==================>...........] - ETA: 45s - loss: 1.5024 - regression_loss: 1.2573 - classification_loss: 0.2451 321/500 [==================>...........] - ETA: 44s - loss: 1.5016 - regression_loss: 1.2567 - classification_loss: 0.2449 322/500 [==================>...........] - ETA: 44s - loss: 1.5019 - regression_loss: 1.2570 - classification_loss: 0.2449 323/500 [==================>...........] - ETA: 44s - loss: 1.5027 - regression_loss: 1.2577 - classification_loss: 0.2450 324/500 [==================>...........] 
- ETA: 44s - loss: 1.5037 - regression_loss: 1.2584 - classification_loss: 0.2453 325/500 [==================>...........] - ETA: 43s - loss: 1.5025 - regression_loss: 1.2575 - classification_loss: 0.2450 326/500 [==================>...........] - ETA: 43s - loss: 1.5047 - regression_loss: 1.2591 - classification_loss: 0.2456 327/500 [==================>...........] - ETA: 43s - loss: 1.5063 - regression_loss: 1.2604 - classification_loss: 0.2459 328/500 [==================>...........] - ETA: 43s - loss: 1.5077 - regression_loss: 1.2616 - classification_loss: 0.2462 329/500 [==================>...........] - ETA: 42s - loss: 1.5091 - regression_loss: 1.2626 - classification_loss: 0.2465 330/500 [==================>...........] - ETA: 42s - loss: 1.5107 - regression_loss: 1.2637 - classification_loss: 0.2470 331/500 [==================>...........] - ETA: 42s - loss: 1.5111 - regression_loss: 1.2641 - classification_loss: 0.2470 332/500 [==================>...........] - ETA: 42s - loss: 1.5115 - regression_loss: 1.2645 - classification_loss: 0.2470 333/500 [==================>...........] - ETA: 41s - loss: 1.5123 - regression_loss: 1.2648 - classification_loss: 0.2475 334/500 [===================>..........] - ETA: 41s - loss: 1.5103 - regression_loss: 1.2631 - classification_loss: 0.2472 335/500 [===================>..........] - ETA: 41s - loss: 1.5105 - regression_loss: 1.2633 - classification_loss: 0.2472 336/500 [===================>..........] - ETA: 41s - loss: 1.5104 - regression_loss: 1.2633 - classification_loss: 0.2471 337/500 [===================>..........] - ETA: 40s - loss: 1.5092 - regression_loss: 1.2625 - classification_loss: 0.2467 338/500 [===================>..........] - ETA: 40s - loss: 1.5076 - regression_loss: 1.2612 - classification_loss: 0.2464 339/500 [===================>..........] - ETA: 40s - loss: 1.5078 - regression_loss: 1.2616 - classification_loss: 0.2462 340/500 [===================>..........] 
- ETA: 40s - loss: 1.5061 - regression_loss: 1.2603 - classification_loss: 0.2458 341/500 [===================>..........] - ETA: 39s - loss: 1.5046 - regression_loss: 1.2592 - classification_loss: 0.2454 342/500 [===================>..........] - ETA: 39s - loss: 1.5039 - regression_loss: 1.2586 - classification_loss: 0.2454 343/500 [===================>..........] - ETA: 39s - loss: 1.5053 - regression_loss: 1.2597 - classification_loss: 0.2456 344/500 [===================>..........] - ETA: 39s - loss: 1.5055 - regression_loss: 1.2598 - classification_loss: 0.2457 345/500 [===================>..........] - ETA: 38s - loss: 1.5037 - regression_loss: 1.2583 - classification_loss: 0.2454 346/500 [===================>..........] - ETA: 38s - loss: 1.5042 - regression_loss: 1.2588 - classification_loss: 0.2454 347/500 [===================>..........] - ETA: 38s - loss: 1.5043 - regression_loss: 1.2589 - classification_loss: 0.2454 348/500 [===================>..........] - ETA: 38s - loss: 1.5018 - regression_loss: 1.2569 - classification_loss: 0.2449 349/500 [===================>..........] - ETA: 37s - loss: 1.5032 - regression_loss: 1.2581 - classification_loss: 0.2451 350/500 [====================>.........] - ETA: 37s - loss: 1.5036 - regression_loss: 1.2585 - classification_loss: 0.2452 351/500 [====================>.........] - ETA: 37s - loss: 1.5044 - regression_loss: 1.2590 - classification_loss: 0.2454 352/500 [====================>.........] - ETA: 37s - loss: 1.5047 - regression_loss: 1.2594 - classification_loss: 0.2454 353/500 [====================>.........] - ETA: 36s - loss: 1.5029 - regression_loss: 1.2580 - classification_loss: 0.2449 354/500 [====================>.........] - ETA: 36s - loss: 1.5060 - regression_loss: 1.2606 - classification_loss: 0.2454 355/500 [====================>.........] - ETA: 36s - loss: 1.5055 - regression_loss: 1.2602 - classification_loss: 0.2453 356/500 [====================>.........] 
- ETA: 36s - loss: 1.5092 - regression_loss: 1.2629 - classification_loss: 0.2463 357/500 [====================>.........] - ETA: 35s - loss: 1.5084 - regression_loss: 1.2623 - classification_loss: 0.2461 358/500 [====================>.........] - ETA: 35s - loss: 1.5090 - regression_loss: 1.2629 - classification_loss: 0.2461 359/500 [====================>.........] - ETA: 35s - loss: 1.5096 - regression_loss: 1.2636 - classification_loss: 0.2460 360/500 [====================>.........] - ETA: 35s - loss: 1.5103 - regression_loss: 1.2642 - classification_loss: 0.2461 361/500 [====================>.........] - ETA: 34s - loss: 1.5106 - regression_loss: 1.2643 - classification_loss: 0.2463 362/500 [====================>.........] - ETA: 34s - loss: 1.5112 - regression_loss: 1.2650 - classification_loss: 0.2462 363/500 [====================>.........] - ETA: 34s - loss: 1.5097 - regression_loss: 1.2638 - classification_loss: 0.2458 364/500 [====================>.........] - ETA: 34s - loss: 1.5102 - regression_loss: 1.2643 - classification_loss: 0.2459 365/500 [====================>.........] - ETA: 33s - loss: 1.5079 - regression_loss: 1.2625 - classification_loss: 0.2454 366/500 [====================>.........] - ETA: 33s - loss: 1.5084 - regression_loss: 1.2628 - classification_loss: 0.2456 367/500 [=====================>........] - ETA: 33s - loss: 1.5097 - regression_loss: 1.2638 - classification_loss: 0.2458 368/500 [=====================>........] - ETA: 33s - loss: 1.5115 - regression_loss: 1.2652 - classification_loss: 0.2464 369/500 [=====================>........] - ETA: 32s - loss: 1.5145 - regression_loss: 1.2678 - classification_loss: 0.2467 370/500 [=====================>........] - ETA: 32s - loss: 1.5148 - regression_loss: 1.2682 - classification_loss: 0.2466 371/500 [=====================>........] - ETA: 32s - loss: 1.5150 - regression_loss: 1.2682 - classification_loss: 0.2469 372/500 [=====================>........] 
- ETA: 32s - loss: 1.5157 - regression_loss: 1.2690 - classification_loss: 0.2468 373/500 [=====================>........] - ETA: 31s - loss: 1.5164 - regression_loss: 1.2696 - classification_loss: 0.2468 374/500 [=====================>........] - ETA: 31s - loss: 1.5159 - regression_loss: 1.2691 - classification_loss: 0.2467 375/500 [=====================>........] - ETA: 31s - loss: 1.5170 - regression_loss: 1.2699 - classification_loss: 0.2471 376/500 [=====================>........] - ETA: 31s - loss: 1.5180 - regression_loss: 1.2707 - classification_loss: 0.2473 377/500 [=====================>........] - ETA: 30s - loss: 1.5189 - regression_loss: 1.2717 - classification_loss: 0.2472 378/500 [=====================>........] - ETA: 30s - loss: 1.5211 - regression_loss: 1.2731 - classification_loss: 0.2480 379/500 [=====================>........] - ETA: 30s - loss: 1.5219 - regression_loss: 1.2738 - classification_loss: 0.2481 380/500 [=====================>........] - ETA: 30s - loss: 1.5212 - regression_loss: 1.2734 - classification_loss: 0.2478 381/500 [=====================>........] - ETA: 29s - loss: 1.5215 - regression_loss: 1.2735 - classification_loss: 0.2480 382/500 [=====================>........] - ETA: 29s - loss: 1.5195 - regression_loss: 1.2719 - classification_loss: 0.2476 383/500 [=====================>........] - ETA: 29s - loss: 1.5191 - regression_loss: 1.2715 - classification_loss: 0.2476 384/500 [======================>.......] - ETA: 29s - loss: 1.5178 - regression_loss: 1.2704 - classification_loss: 0.2474 385/500 [======================>.......] - ETA: 28s - loss: 1.5179 - regression_loss: 1.2706 - classification_loss: 0.2474 386/500 [======================>.......] - ETA: 28s - loss: 1.5179 - regression_loss: 1.2706 - classification_loss: 0.2472 387/500 [======================>.......] - ETA: 28s - loss: 1.5178 - regression_loss: 1.2706 - classification_loss: 0.2471 388/500 [======================>.......] 
- ETA: 28s - loss: 1.5163 - regression_loss: 1.2694 - classification_loss: 0.2468 389/500 [======================>.......] - ETA: 27s - loss: 1.5159 - regression_loss: 1.2691 - classification_loss: 0.2467 390/500 [======================>.......] - ETA: 27s - loss: 1.5162 - regression_loss: 1.2693 - classification_loss: 0.2468 391/500 [======================>.......] - ETA: 27s - loss: 1.5169 - regression_loss: 1.2700 - classification_loss: 0.2469 392/500 [======================>.......] - ETA: 27s - loss: 1.5150 - regression_loss: 1.2685 - classification_loss: 0.2465 393/500 [======================>.......] - ETA: 26s - loss: 1.5157 - regression_loss: 1.2693 - classification_loss: 0.2465 394/500 [======================>.......] - ETA: 26s - loss: 1.5143 - regression_loss: 1.2681 - classification_loss: 0.2462 395/500 [======================>.......] - ETA: 26s - loss: 1.5155 - regression_loss: 1.2691 - classification_loss: 0.2464 396/500 [======================>.......] - ETA: 26s - loss: 1.5161 - regression_loss: 1.2696 - classification_loss: 0.2465 397/500 [======================>.......] - ETA: 25s - loss: 1.5197 - regression_loss: 1.2720 - classification_loss: 0.2477 398/500 [======================>.......] - ETA: 25s - loss: 1.5197 - regression_loss: 1.2720 - classification_loss: 0.2477 399/500 [======================>.......] - ETA: 25s - loss: 1.5200 - regression_loss: 1.2717 - classification_loss: 0.2482 400/500 [=======================>......] - ETA: 25s - loss: 1.5209 - regression_loss: 1.2725 - classification_loss: 0.2483 401/500 [=======================>......] - ETA: 24s - loss: 1.5213 - regression_loss: 1.2730 - classification_loss: 0.2483 402/500 [=======================>......] - ETA: 24s - loss: 1.5211 - regression_loss: 1.2729 - classification_loss: 0.2482 403/500 [=======================>......] - ETA: 24s - loss: 1.5214 - regression_loss: 1.2731 - classification_loss: 0.2483 404/500 [=======================>......] 
- ETA: 24s - loss: 1.5206 - regression_loss: 1.2725 - classification_loss: 0.2481 405/500 [=======================>......] - ETA: 23s - loss: 1.5208 - regression_loss: 1.2727 - classification_loss: 0.2481 406/500 [=======================>......] - ETA: 23s - loss: 1.5190 - regression_loss: 1.2712 - classification_loss: 0.2478 407/500 [=======================>......] - ETA: 23s - loss: 1.5192 - regression_loss: 1.2713 - classification_loss: 0.2479 408/500 [=======================>......] - ETA: 23s - loss: 1.5173 - regression_loss: 1.2696 - classification_loss: 0.2477 409/500 [=======================>......] - ETA: 22s - loss: 1.5165 - regression_loss: 1.2690 - classification_loss: 0.2475 410/500 [=======================>......] - ETA: 22s - loss: 1.5165 - regression_loss: 1.2690 - classification_loss: 0.2475 411/500 [=======================>......] - ETA: 22s - loss: 1.5164 - regression_loss: 1.2688 - classification_loss: 0.2476 412/500 [=======================>......] - ETA: 22s - loss: 1.5173 - regression_loss: 1.2699 - classification_loss: 0.2473 413/500 [=======================>......] - ETA: 21s - loss: 1.5177 - regression_loss: 1.2703 - classification_loss: 0.2473 414/500 [=======================>......] - ETA: 21s - loss: 1.5160 - regression_loss: 1.2689 - classification_loss: 0.2471 415/500 [=======================>......] - ETA: 21s - loss: 1.5142 - regression_loss: 1.2674 - classification_loss: 0.2468 416/500 [=======================>......] - ETA: 21s - loss: 1.5143 - regression_loss: 1.2676 - classification_loss: 0.2467 417/500 [========================>.....] - ETA: 20s - loss: 1.5145 - regression_loss: 1.2677 - classification_loss: 0.2468 418/500 [========================>.....] - ETA: 20s - loss: 1.5147 - regression_loss: 1.2679 - classification_loss: 0.2468 419/500 [========================>.....] - ETA: 20s - loss: 1.5154 - regression_loss: 1.2686 - classification_loss: 0.2468 420/500 [========================>.....] 
500/500 [==============================] - 125s 250ms/step - loss: 1.5203 - regression_loss: 1.2718 - classification_loss: 0.2485
1172 instances of class plum with average precision: 0.6570
mAP: 0.6570
Epoch 00096: saving model to ./training/snapshots/resnet50_pascal_96.h5
Epoch 97/150
- ETA: 1:01 - loss: 1.5649 - regression_loss: 1.3108 - classification_loss: 0.2541 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5663 - regression_loss: 1.3122 - classification_loss: 0.2541 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5672 - regression_loss: 1.3130 - classification_loss: 0.2542 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5684 - regression_loss: 1.3142 - classification_loss: 0.2542 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5679 - regression_loss: 1.3139 - classification_loss: 0.2540 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5660 - regression_loss: 1.3124 - classification_loss: 0.2537 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5655 - regression_loss: 1.3123 - classification_loss: 0.2532 261/500 [==============>...............] - ETA: 59s - loss: 1.5634 - regression_loss: 1.3103 - classification_loss: 0.2530  262/500 [==============>...............] - ETA: 59s - loss: 1.5638 - regression_loss: 1.3100 - classification_loss: 0.2539 263/500 [==============>...............] - ETA: 59s - loss: 1.5639 - regression_loss: 1.3101 - classification_loss: 0.2539 264/500 [==============>...............] - ETA: 59s - loss: 1.5628 - regression_loss: 1.3093 - classification_loss: 0.2535 265/500 [==============>...............] - ETA: 58s - loss: 1.5639 - regression_loss: 1.3104 - classification_loss: 0.2535 266/500 [==============>...............] - ETA: 58s - loss: 1.5638 - regression_loss: 1.3100 - classification_loss: 0.2538 267/500 [===============>..............] - ETA: 58s - loss: 1.5640 - regression_loss: 1.3101 - classification_loss: 0.2539 268/500 [===============>..............] - ETA: 58s - loss: 1.5646 - regression_loss: 1.3108 - classification_loss: 0.2538 269/500 [===============>..............] - ETA: 57s - loss: 1.5629 - regression_loss: 1.3096 - classification_loss: 0.2533 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5604 - regression_loss: 1.3072 - classification_loss: 0.2532 271/500 [===============>..............] - ETA: 57s - loss: 1.5614 - regression_loss: 1.3080 - classification_loss: 0.2533 272/500 [===============>..............] - ETA: 57s - loss: 1.5609 - regression_loss: 1.3078 - classification_loss: 0.2532 273/500 [===============>..............] - ETA: 56s - loss: 1.5613 - regression_loss: 1.3082 - classification_loss: 0.2531 274/500 [===============>..............] - ETA: 56s - loss: 1.5623 - regression_loss: 1.3089 - classification_loss: 0.2534 275/500 [===============>..............] - ETA: 56s - loss: 1.5614 - regression_loss: 1.3081 - classification_loss: 0.2533 276/500 [===============>..............] - ETA: 56s - loss: 1.5623 - regression_loss: 1.3089 - classification_loss: 0.2534 277/500 [===============>..............] - ETA: 55s - loss: 1.5611 - regression_loss: 1.3079 - classification_loss: 0.2533 278/500 [===============>..............] - ETA: 55s - loss: 1.5606 - regression_loss: 1.3076 - classification_loss: 0.2529 279/500 [===============>..............] - ETA: 55s - loss: 1.5599 - regression_loss: 1.3072 - classification_loss: 0.2527 280/500 [===============>..............] - ETA: 55s - loss: 1.5607 - regression_loss: 1.3078 - classification_loss: 0.2529 281/500 [===============>..............] - ETA: 54s - loss: 1.5619 - regression_loss: 1.3090 - classification_loss: 0.2529 282/500 [===============>..............] - ETA: 54s - loss: 1.5632 - regression_loss: 1.3101 - classification_loss: 0.2531 283/500 [===============>..............] - ETA: 54s - loss: 1.5601 - regression_loss: 1.3074 - classification_loss: 0.2527 284/500 [================>.............] - ETA: 54s - loss: 1.5564 - regression_loss: 1.3042 - classification_loss: 0.2521 285/500 [================>.............] - ETA: 53s - loss: 1.5548 - regression_loss: 1.3031 - classification_loss: 0.2518 286/500 [================>.............] 
- ETA: 53s - loss: 1.5538 - regression_loss: 1.3021 - classification_loss: 0.2516 287/500 [================>.............] - ETA: 53s - loss: 1.5528 - regression_loss: 1.3010 - classification_loss: 0.2518 288/500 [================>.............] - ETA: 53s - loss: 1.5529 - regression_loss: 1.3009 - classification_loss: 0.2520 289/500 [================>.............] - ETA: 52s - loss: 1.5519 - regression_loss: 1.3002 - classification_loss: 0.2517 290/500 [================>.............] - ETA: 52s - loss: 1.5522 - regression_loss: 1.3004 - classification_loss: 0.2518 291/500 [================>.............] - ETA: 52s - loss: 1.5523 - regression_loss: 1.3004 - classification_loss: 0.2520 292/500 [================>.............] - ETA: 52s - loss: 1.5491 - regression_loss: 1.2979 - classification_loss: 0.2512 293/500 [================>.............] - ETA: 51s - loss: 1.5489 - regression_loss: 1.2977 - classification_loss: 0.2512 294/500 [================>.............] - ETA: 51s - loss: 1.5466 - regression_loss: 1.2953 - classification_loss: 0.2512 295/500 [================>.............] - ETA: 51s - loss: 1.5446 - regression_loss: 1.2937 - classification_loss: 0.2509 296/500 [================>.............] - ETA: 51s - loss: 1.5430 - regression_loss: 1.2922 - classification_loss: 0.2508 297/500 [================>.............] - ETA: 50s - loss: 1.5437 - regression_loss: 1.2928 - classification_loss: 0.2509 298/500 [================>.............] - ETA: 50s - loss: 1.5422 - regression_loss: 1.2913 - classification_loss: 0.2509 299/500 [================>.............] - ETA: 50s - loss: 1.5420 - regression_loss: 1.2911 - classification_loss: 0.2509 300/500 [=================>............] - ETA: 50s - loss: 1.5429 - regression_loss: 1.2918 - classification_loss: 0.2511 301/500 [=================>............] - ETA: 49s - loss: 1.5402 - regression_loss: 1.2896 - classification_loss: 0.2506 302/500 [=================>............] 
- ETA: 49s - loss: 1.5410 - regression_loss: 1.2902 - classification_loss: 0.2508 303/500 [=================>............] - ETA: 49s - loss: 1.5387 - regression_loss: 1.2884 - classification_loss: 0.2503 304/500 [=================>............] - ETA: 49s - loss: 1.5397 - regression_loss: 1.2893 - classification_loss: 0.2504 305/500 [=================>............] - ETA: 48s - loss: 1.5385 - regression_loss: 1.2884 - classification_loss: 0.2501 306/500 [=================>............] - ETA: 48s - loss: 1.5390 - regression_loss: 1.2889 - classification_loss: 0.2501 307/500 [=================>............] - ETA: 48s - loss: 1.5397 - regression_loss: 1.2894 - classification_loss: 0.2502 308/500 [=================>............] - ETA: 48s - loss: 1.5389 - regression_loss: 1.2890 - classification_loss: 0.2499 309/500 [=================>............] - ETA: 47s - loss: 1.5365 - regression_loss: 1.2872 - classification_loss: 0.2493 310/500 [=================>............] - ETA: 47s - loss: 1.5368 - regression_loss: 1.2874 - classification_loss: 0.2493 311/500 [=================>............] - ETA: 47s - loss: 1.5367 - regression_loss: 1.2874 - classification_loss: 0.2493 312/500 [=================>............] - ETA: 47s - loss: 1.5376 - regression_loss: 1.2882 - classification_loss: 0.2493 313/500 [=================>............] - ETA: 46s - loss: 1.5368 - regression_loss: 1.2877 - classification_loss: 0.2491 314/500 [=================>............] - ETA: 46s - loss: 1.5380 - regression_loss: 1.2884 - classification_loss: 0.2496 315/500 [=================>............] - ETA: 46s - loss: 1.5381 - regression_loss: 1.2885 - classification_loss: 0.2496 316/500 [=================>............] - ETA: 46s - loss: 1.5389 - regression_loss: 1.2891 - classification_loss: 0.2498 317/500 [==================>...........] - ETA: 45s - loss: 1.5400 - regression_loss: 1.2901 - classification_loss: 0.2499 318/500 [==================>...........] 
- ETA: 45s - loss: 1.5409 - regression_loss: 1.2909 - classification_loss: 0.2500 319/500 [==================>...........] - ETA: 45s - loss: 1.5421 - regression_loss: 1.2918 - classification_loss: 0.2503 320/500 [==================>...........] - ETA: 45s - loss: 1.5433 - regression_loss: 1.2928 - classification_loss: 0.2504 321/500 [==================>...........] - ETA: 44s - loss: 1.5431 - regression_loss: 1.2927 - classification_loss: 0.2504 322/500 [==================>...........] - ETA: 44s - loss: 1.5414 - regression_loss: 1.2914 - classification_loss: 0.2500 323/500 [==================>...........] - ETA: 44s - loss: 1.5385 - regression_loss: 1.2890 - classification_loss: 0.2495 324/500 [==================>...........] - ETA: 44s - loss: 1.5389 - regression_loss: 1.2895 - classification_loss: 0.2494 325/500 [==================>...........] - ETA: 43s - loss: 1.5410 - regression_loss: 1.2910 - classification_loss: 0.2500 326/500 [==================>...........] - ETA: 43s - loss: 1.5417 - regression_loss: 1.2916 - classification_loss: 0.2501 327/500 [==================>...........] - ETA: 43s - loss: 1.5405 - regression_loss: 1.2900 - classification_loss: 0.2505 328/500 [==================>...........] - ETA: 43s - loss: 1.5413 - regression_loss: 1.2905 - classification_loss: 0.2509 329/500 [==================>...........] - ETA: 42s - loss: 1.5419 - regression_loss: 1.2909 - classification_loss: 0.2509 330/500 [==================>...........] - ETA: 42s - loss: 1.5421 - regression_loss: 1.2912 - classification_loss: 0.2509 331/500 [==================>...........] - ETA: 42s - loss: 1.5397 - regression_loss: 1.2893 - classification_loss: 0.2504 332/500 [==================>...........] - ETA: 42s - loss: 1.5370 - regression_loss: 1.2872 - classification_loss: 0.2498 333/500 [==================>...........] - ETA: 41s - loss: 1.5345 - regression_loss: 1.2853 - classification_loss: 0.2492 334/500 [===================>..........] 
- ETA: 41s - loss: 1.5326 - regression_loss: 1.2838 - classification_loss: 0.2488 335/500 [===================>..........] - ETA: 41s - loss: 1.5339 - regression_loss: 1.2848 - classification_loss: 0.2490 336/500 [===================>..........] - ETA: 41s - loss: 1.5337 - regression_loss: 1.2847 - classification_loss: 0.2490 337/500 [===================>..........] - ETA: 40s - loss: 1.5349 - regression_loss: 1.2857 - classification_loss: 0.2492 338/500 [===================>..........] - ETA: 40s - loss: 1.5369 - regression_loss: 1.2872 - classification_loss: 0.2497 339/500 [===================>..........] - ETA: 40s - loss: 1.5367 - regression_loss: 1.2871 - classification_loss: 0.2496 340/500 [===================>..........] - ETA: 40s - loss: 1.5363 - regression_loss: 1.2869 - classification_loss: 0.2495 341/500 [===================>..........] - ETA: 39s - loss: 1.5360 - regression_loss: 1.2866 - classification_loss: 0.2494 342/500 [===================>..........] - ETA: 39s - loss: 1.5337 - regression_loss: 1.2848 - classification_loss: 0.2489 343/500 [===================>..........] - ETA: 39s - loss: 1.5332 - regression_loss: 1.2845 - classification_loss: 0.2487 344/500 [===================>..........] - ETA: 39s - loss: 1.5305 - regression_loss: 1.2823 - classification_loss: 0.2482 345/500 [===================>..........] - ETA: 38s - loss: 1.5306 - regression_loss: 1.2822 - classification_loss: 0.2484 346/500 [===================>..........] - ETA: 38s - loss: 1.5328 - regression_loss: 1.2839 - classification_loss: 0.2489 347/500 [===================>..........] - ETA: 38s - loss: 1.5334 - regression_loss: 1.2845 - classification_loss: 0.2489 348/500 [===================>..........] - ETA: 38s - loss: 1.5335 - regression_loss: 1.2846 - classification_loss: 0.2489 349/500 [===================>..........] - ETA: 37s - loss: 1.5342 - regression_loss: 1.2851 - classification_loss: 0.2491 350/500 [====================>.........] 
- ETA: 37s - loss: 1.5326 - regression_loss: 1.2838 - classification_loss: 0.2488 351/500 [====================>.........] - ETA: 37s - loss: 1.5322 - regression_loss: 1.2833 - classification_loss: 0.2489 352/500 [====================>.........] - ETA: 37s - loss: 1.5315 - regression_loss: 1.2827 - classification_loss: 0.2488 353/500 [====================>.........] - ETA: 36s - loss: 1.5329 - regression_loss: 1.2839 - classification_loss: 0.2490 354/500 [====================>.........] - ETA: 36s - loss: 1.5317 - regression_loss: 1.2829 - classification_loss: 0.2488 355/500 [====================>.........] - ETA: 36s - loss: 1.5319 - regression_loss: 1.2832 - classification_loss: 0.2487 356/500 [====================>.........] - ETA: 36s - loss: 1.5328 - regression_loss: 1.2838 - classification_loss: 0.2491 357/500 [====================>.........] - ETA: 35s - loss: 1.5329 - regression_loss: 1.2837 - classification_loss: 0.2492 358/500 [====================>.........] - ETA: 35s - loss: 1.5339 - regression_loss: 1.2842 - classification_loss: 0.2496 359/500 [====================>.........] - ETA: 35s - loss: 1.5317 - regression_loss: 1.2826 - classification_loss: 0.2491 360/500 [====================>.........] - ETA: 35s - loss: 1.5322 - regression_loss: 1.2830 - classification_loss: 0.2491 361/500 [====================>.........] - ETA: 34s - loss: 1.5322 - regression_loss: 1.2831 - classification_loss: 0.2491 362/500 [====================>.........] - ETA: 34s - loss: 1.5312 - regression_loss: 1.2822 - classification_loss: 0.2490 363/500 [====================>.........] - ETA: 34s - loss: 1.5289 - regression_loss: 1.2803 - classification_loss: 0.2486 364/500 [====================>.........] - ETA: 34s - loss: 1.5268 - regression_loss: 1.2787 - classification_loss: 0.2482 365/500 [====================>.........] - ETA: 33s - loss: 1.5270 - regression_loss: 1.2788 - classification_loss: 0.2482 366/500 [====================>.........] 
- ETA: 33s - loss: 1.5283 - regression_loss: 1.2798 - classification_loss: 0.2485 367/500 [=====================>........] - ETA: 33s - loss: 1.5271 - regression_loss: 1.2791 - classification_loss: 0.2480 368/500 [=====================>........] - ETA: 33s - loss: 1.5271 - regression_loss: 1.2792 - classification_loss: 0.2480 369/500 [=====================>........] - ETA: 32s - loss: 1.5255 - regression_loss: 1.2779 - classification_loss: 0.2476 370/500 [=====================>........] - ETA: 32s - loss: 1.5257 - regression_loss: 1.2778 - classification_loss: 0.2479 371/500 [=====================>........] - ETA: 32s - loss: 1.5270 - regression_loss: 1.2787 - classification_loss: 0.2483 372/500 [=====================>........] - ETA: 32s - loss: 1.5273 - regression_loss: 1.2791 - classification_loss: 0.2482 373/500 [=====================>........] - ETA: 31s - loss: 1.5275 - regression_loss: 1.2793 - classification_loss: 0.2482 374/500 [=====================>........] - ETA: 31s - loss: 1.5275 - regression_loss: 1.2793 - classification_loss: 0.2481 375/500 [=====================>........] - ETA: 31s - loss: 1.5279 - regression_loss: 1.2796 - classification_loss: 0.2483 376/500 [=====================>........] - ETA: 31s - loss: 1.5280 - regression_loss: 1.2798 - classification_loss: 0.2482 377/500 [=====================>........] - ETA: 30s - loss: 1.5290 - regression_loss: 1.2806 - classification_loss: 0.2484 378/500 [=====================>........] - ETA: 30s - loss: 1.5287 - regression_loss: 1.2803 - classification_loss: 0.2484 379/500 [=====================>........] - ETA: 30s - loss: 1.5286 - regression_loss: 1.2803 - classification_loss: 0.2483 380/500 [=====================>........] - ETA: 30s - loss: 1.5284 - regression_loss: 1.2803 - classification_loss: 0.2481 381/500 [=====================>........] - ETA: 29s - loss: 1.5271 - regression_loss: 1.2792 - classification_loss: 0.2478 382/500 [=====================>........] 
- ETA: 29s - loss: 1.5259 - regression_loss: 1.2782 - classification_loss: 0.2478 383/500 [=====================>........] - ETA: 29s - loss: 1.5236 - regression_loss: 1.2763 - classification_loss: 0.2473 384/500 [======================>.......] - ETA: 29s - loss: 1.5246 - regression_loss: 1.2772 - classification_loss: 0.2474 385/500 [======================>.......] - ETA: 28s - loss: 1.5226 - regression_loss: 1.2756 - classification_loss: 0.2470 386/500 [======================>.......] - ETA: 28s - loss: 1.5230 - regression_loss: 1.2760 - classification_loss: 0.2470 387/500 [======================>.......] - ETA: 28s - loss: 1.5239 - regression_loss: 1.2765 - classification_loss: 0.2474 388/500 [======================>.......] - ETA: 28s - loss: 1.5225 - regression_loss: 1.2753 - classification_loss: 0.2472 389/500 [======================>.......] - ETA: 27s - loss: 1.5227 - regression_loss: 1.2755 - classification_loss: 0.2472 390/500 [======================>.......] - ETA: 27s - loss: 1.5247 - regression_loss: 1.2770 - classification_loss: 0.2477 391/500 [======================>.......] - ETA: 27s - loss: 1.5256 - regression_loss: 1.2779 - classification_loss: 0.2478 392/500 [======================>.......] - ETA: 27s - loss: 1.5242 - regression_loss: 1.2767 - classification_loss: 0.2475 393/500 [======================>.......] - ETA: 26s - loss: 1.5240 - regression_loss: 1.2766 - classification_loss: 0.2474 394/500 [======================>.......] - ETA: 26s - loss: 1.5231 - regression_loss: 1.2758 - classification_loss: 0.2473 395/500 [======================>.......] - ETA: 26s - loss: 1.5237 - regression_loss: 1.2760 - classification_loss: 0.2477 396/500 [======================>.......] - ETA: 26s - loss: 1.5239 - regression_loss: 1.2763 - classification_loss: 0.2476 397/500 [======================>.......] - ETA: 25s - loss: 1.5251 - regression_loss: 1.2773 - classification_loss: 0.2478 398/500 [======================>.......] 
- ETA: 25s - loss: 1.5239 - regression_loss: 1.2761 - classification_loss: 0.2477 399/500 [======================>.......] - ETA: 25s - loss: 1.5239 - regression_loss: 1.2759 - classification_loss: 0.2480 400/500 [=======================>......] - ETA: 25s - loss: 1.5248 - regression_loss: 1.2766 - classification_loss: 0.2482 401/500 [=======================>......] - ETA: 24s - loss: 1.5254 - regression_loss: 1.2774 - classification_loss: 0.2481 402/500 [=======================>......] - ETA: 24s - loss: 1.5258 - regression_loss: 1.2778 - classification_loss: 0.2481 403/500 [=======================>......] - ETA: 24s - loss: 1.5267 - regression_loss: 1.2783 - classification_loss: 0.2484 404/500 [=======================>......] - ETA: 24s - loss: 1.5273 - regression_loss: 1.2791 - classification_loss: 0.2483 405/500 [=======================>......] - ETA: 23s - loss: 1.5261 - regression_loss: 1.2780 - classification_loss: 0.2481 406/500 [=======================>......] - ETA: 23s - loss: 1.5251 - regression_loss: 1.2774 - classification_loss: 0.2478 407/500 [=======================>......] - ETA: 23s - loss: 1.5250 - regression_loss: 1.2773 - classification_loss: 0.2477 408/500 [=======================>......] - ETA: 23s - loss: 1.5256 - regression_loss: 1.2776 - classification_loss: 0.2479 409/500 [=======================>......] - ETA: 22s - loss: 1.5264 - regression_loss: 1.2785 - classification_loss: 0.2479 410/500 [=======================>......] - ETA: 22s - loss: 1.5265 - regression_loss: 1.2787 - classification_loss: 0.2478 411/500 [=======================>......] - ETA: 22s - loss: 1.5278 - regression_loss: 1.2797 - classification_loss: 0.2481 412/500 [=======================>......] - ETA: 22s - loss: 1.5274 - regression_loss: 1.2794 - classification_loss: 0.2480 413/500 [=======================>......] - ETA: 21s - loss: 1.5285 - regression_loss: 1.2803 - classification_loss: 0.2482 414/500 [=======================>......] 
- ETA: 21s - loss: 1.5281 - regression_loss: 1.2797 - classification_loss: 0.2484 415/500 [=======================>......] - ETA: 21s - loss: 1.5290 - regression_loss: 1.2804 - classification_loss: 0.2486 416/500 [=======================>......] - ETA: 21s - loss: 1.5299 - regression_loss: 1.2811 - classification_loss: 0.2488 417/500 [========================>.....] - ETA: 20s - loss: 1.5306 - regression_loss: 1.2818 - classification_loss: 0.2487 418/500 [========================>.....] - ETA: 20s - loss: 1.5302 - regression_loss: 1.2815 - classification_loss: 0.2487 419/500 [========================>.....] - ETA: 20s - loss: 1.5303 - regression_loss: 1.2817 - classification_loss: 0.2486 420/500 [========================>.....] - ETA: 20s - loss: 1.5299 - regression_loss: 1.2814 - classification_loss: 0.2485 421/500 [========================>.....] - ETA: 19s - loss: 1.5293 - regression_loss: 1.2811 - classification_loss: 0.2482 422/500 [========================>.....] - ETA: 19s - loss: 1.5292 - regression_loss: 1.2810 - classification_loss: 0.2481 423/500 [========================>.....] - ETA: 19s - loss: 1.5302 - regression_loss: 1.2818 - classification_loss: 0.2484 424/500 [========================>.....] - ETA: 19s - loss: 1.5284 - regression_loss: 1.2802 - classification_loss: 0.2482 425/500 [========================>.....] - ETA: 18s - loss: 1.5287 - regression_loss: 1.2804 - classification_loss: 0.2483 426/500 [========================>.....] - ETA: 18s - loss: 1.5269 - regression_loss: 1.2789 - classification_loss: 0.2480 427/500 [========================>.....] - ETA: 18s - loss: 1.5265 - regression_loss: 1.2786 - classification_loss: 0.2479 428/500 [========================>.....] - ETA: 18s - loss: 1.5273 - regression_loss: 1.2794 - classification_loss: 0.2478 429/500 [========================>.....] - ETA: 17s - loss: 1.5283 - regression_loss: 1.2801 - classification_loss: 0.2481 430/500 [========================>.....] 
- ETA: 17s - loss: 1.5285 - regression_loss: 1.2803 - classification_loss: 0.2483 431/500 [========================>.....] - ETA: 17s - loss: 1.5305 - regression_loss: 1.2814 - classification_loss: 0.2491 432/500 [========================>.....] - ETA: 17s - loss: 1.5312 - regression_loss: 1.2821 - classification_loss: 0.2492 433/500 [========================>.....] - ETA: 16s - loss: 1.5307 - regression_loss: 1.2816 - classification_loss: 0.2491 434/500 [=========================>....] - ETA: 16s - loss: 1.5310 - regression_loss: 1.2817 - classification_loss: 0.2493 435/500 [=========================>....] - ETA: 16s - loss: 1.5298 - regression_loss: 1.2807 - classification_loss: 0.2490 436/500 [=========================>....] - ETA: 16s - loss: 1.5296 - regression_loss: 1.2807 - classification_loss: 0.2489 437/500 [=========================>....] - ETA: 15s - loss: 1.5273 - regression_loss: 1.2788 - classification_loss: 0.2485 438/500 [=========================>....] - ETA: 15s - loss: 1.5264 - regression_loss: 1.2781 - classification_loss: 0.2483 439/500 [=========================>....] - ETA: 15s - loss: 1.5275 - regression_loss: 1.2789 - classification_loss: 0.2486 440/500 [=========================>....] - ETA: 15s - loss: 1.5266 - regression_loss: 1.2784 - classification_loss: 0.2482 441/500 [=========================>....] - ETA: 14s - loss: 1.5265 - regression_loss: 1.2782 - classification_loss: 0.2482 442/500 [=========================>....] - ETA: 14s - loss: 1.5268 - regression_loss: 1.2785 - classification_loss: 0.2483 443/500 [=========================>....] - ETA: 14s - loss: 1.5255 - regression_loss: 1.2776 - classification_loss: 0.2479 444/500 [=========================>....] - ETA: 14s - loss: 1.5238 - regression_loss: 1.2761 - classification_loss: 0.2478 445/500 [=========================>....] - ETA: 13s - loss: 1.5225 - regression_loss: 1.2750 - classification_loss: 0.2475 446/500 [=========================>....] 
- ETA: 13s - loss: 1.5233 - regression_loss: 1.2757 - classification_loss: 0.2476 447/500 [=========================>....] - ETA: 13s - loss: 1.5234 - regression_loss: 1.2759 - classification_loss: 0.2475 448/500 [=========================>....] - ETA: 13s - loss: 1.5239 - regression_loss: 1.2763 - classification_loss: 0.2476 449/500 [=========================>....] - ETA: 12s - loss: 1.5239 - regression_loss: 1.2764 - classification_loss: 0.2476 450/500 [==========================>...] - ETA: 12s - loss: 1.5249 - regression_loss: 1.2771 - classification_loss: 0.2478 451/500 [==========================>...] - ETA: 12s - loss: 1.5252 - regression_loss: 1.2772 - classification_loss: 0.2479 452/500 [==========================>...] - ETA: 12s - loss: 1.5260 - regression_loss: 1.2780 - classification_loss: 0.2480 453/500 [==========================>...] - ETA: 11s - loss: 1.5247 - regression_loss: 1.2768 - classification_loss: 0.2480 454/500 [==========================>...] - ETA: 11s - loss: 1.5247 - regression_loss: 1.2768 - classification_loss: 0.2479 455/500 [==========================>...] - ETA: 11s - loss: 1.5245 - regression_loss: 1.2767 - classification_loss: 0.2478 456/500 [==========================>...] - ETA: 11s - loss: 1.5236 - regression_loss: 1.2762 - classification_loss: 0.2475 457/500 [==========================>...] - ETA: 10s - loss: 1.5241 - regression_loss: 1.2766 - classification_loss: 0.2474 458/500 [==========================>...] - ETA: 10s - loss: 1.5246 - regression_loss: 1.2771 - classification_loss: 0.2475 459/500 [==========================>...] - ETA: 10s - loss: 1.5253 - regression_loss: 1.2776 - classification_loss: 0.2477 460/500 [==========================>...] - ETA: 10s - loss: 1.5269 - regression_loss: 1.2792 - classification_loss: 0.2477 461/500 [==========================>...] - ETA: 9s - loss: 1.5260 - regression_loss: 1.2784 - classification_loss: 0.2476  462/500 [==========================>...] 
- ETA: 9s - loss: 1.5263 - regression_loss: 1.2784 - classification_loss: 0.2479 500/500 [==============================] - 125s 251ms/step - loss: 1.5247 - regression_loss: 1.2782 - classification_loss: 0.2465
1172 instances of class plum with average precision: 0.6680
mAP: 0.6680
Epoch 00097: saving model to ./training/snapshots/resnet50_pascal_97.h5
Epoch 98/150
1/500 [..............................] - ETA: 1:50 - loss: 1.7781 - regression_loss: 1.4780 - classification_loss: 0.3002 9/500 [..............................] 
- ETA: 1:59 - loss: 1.4207 - regression_loss: 1.2187 - classification_loss: 0.2020 296/500 [================>.............] - ETA: 51s - loss: 1.5110 - regression_loss: 1.2660 - classification_loss: 0.2449 297/500 [================>.............] 
- ETA: 50s - loss: 1.5102 - regression_loss: 1.2655 - classification_loss: 0.2447 298/500 [================>.............] - ETA: 50s - loss: 1.5100 - regression_loss: 1.2652 - classification_loss: 0.2448 299/500 [================>.............] - ETA: 50s - loss: 1.5102 - regression_loss: 1.2654 - classification_loss: 0.2448 300/500 [=================>............] - ETA: 50s - loss: 1.5115 - regression_loss: 1.2666 - classification_loss: 0.2449 301/500 [=================>............] - ETA: 49s - loss: 1.5115 - regression_loss: 1.2663 - classification_loss: 0.2452 302/500 [=================>............] - ETA: 49s - loss: 1.5116 - regression_loss: 1.2664 - classification_loss: 0.2452 303/500 [=================>............] - ETA: 49s - loss: 1.5115 - regression_loss: 1.2663 - classification_loss: 0.2452 304/500 [=================>............] - ETA: 49s - loss: 1.5113 - regression_loss: 1.2662 - classification_loss: 0.2451 305/500 [=================>............] - ETA: 48s - loss: 1.5122 - regression_loss: 1.2669 - classification_loss: 0.2453 306/500 [=================>............] - ETA: 48s - loss: 1.5106 - regression_loss: 1.2654 - classification_loss: 0.2452 307/500 [=================>............] - ETA: 48s - loss: 1.5071 - regression_loss: 1.2625 - classification_loss: 0.2445 308/500 [=================>............] - ETA: 48s - loss: 1.5074 - regression_loss: 1.2628 - classification_loss: 0.2446 309/500 [=================>............] - ETA: 47s - loss: 1.5074 - regression_loss: 1.2630 - classification_loss: 0.2444 310/500 [=================>............] - ETA: 47s - loss: 1.5087 - regression_loss: 1.2641 - classification_loss: 0.2446 311/500 [=================>............] - ETA: 47s - loss: 1.5099 - regression_loss: 1.2646 - classification_loss: 0.2453 312/500 [=================>............] - ETA: 47s - loss: 1.5105 - regression_loss: 1.2649 - classification_loss: 0.2456 313/500 [=================>............] 
- ETA: 46s - loss: 1.5086 - regression_loss: 1.2633 - classification_loss: 0.2453 314/500 [=================>............] - ETA: 46s - loss: 1.5051 - regression_loss: 1.2605 - classification_loss: 0.2446 315/500 [=================>............] - ETA: 46s - loss: 1.5048 - regression_loss: 1.2601 - classification_loss: 0.2447 316/500 [=================>............] - ETA: 46s - loss: 1.5066 - regression_loss: 1.2616 - classification_loss: 0.2450 317/500 [==================>...........] - ETA: 45s - loss: 1.5070 - regression_loss: 1.2620 - classification_loss: 0.2450 318/500 [==================>...........] - ETA: 45s - loss: 1.5089 - regression_loss: 1.2635 - classification_loss: 0.2454 319/500 [==================>...........] - ETA: 45s - loss: 1.5074 - regression_loss: 1.2624 - classification_loss: 0.2450 320/500 [==================>...........] - ETA: 45s - loss: 1.5066 - regression_loss: 1.2619 - classification_loss: 0.2448 321/500 [==================>...........] - ETA: 44s - loss: 1.5074 - regression_loss: 1.2624 - classification_loss: 0.2450 322/500 [==================>...........] - ETA: 44s - loss: 1.5068 - regression_loss: 1.2619 - classification_loss: 0.2449 323/500 [==================>...........] - ETA: 44s - loss: 1.5084 - regression_loss: 1.2631 - classification_loss: 0.2453 324/500 [==================>...........] - ETA: 44s - loss: 1.5081 - regression_loss: 1.2629 - classification_loss: 0.2451 325/500 [==================>...........] - ETA: 43s - loss: 1.5081 - regression_loss: 1.2629 - classification_loss: 0.2452 326/500 [==================>...........] - ETA: 43s - loss: 1.5080 - regression_loss: 1.2632 - classification_loss: 0.2449 327/500 [==================>...........] - ETA: 43s - loss: 1.5092 - regression_loss: 1.2641 - classification_loss: 0.2451 328/500 [==================>...........] - ETA: 43s - loss: 1.5096 - regression_loss: 1.2644 - classification_loss: 0.2452 329/500 [==================>...........] 
- ETA: 42s - loss: 1.5105 - regression_loss: 1.2654 - classification_loss: 0.2452 330/500 [==================>...........] - ETA: 42s - loss: 1.5076 - regression_loss: 1.2624 - classification_loss: 0.2452 331/500 [==================>...........] - ETA: 42s - loss: 1.5085 - regression_loss: 1.2632 - classification_loss: 0.2453 332/500 [==================>...........] - ETA: 42s - loss: 1.5083 - regression_loss: 1.2630 - classification_loss: 0.2453 333/500 [==================>...........] - ETA: 41s - loss: 1.5094 - regression_loss: 1.2640 - classification_loss: 0.2454 334/500 [===================>..........] - ETA: 41s - loss: 1.5098 - regression_loss: 1.2642 - classification_loss: 0.2456 335/500 [===================>..........] - ETA: 41s - loss: 1.5115 - regression_loss: 1.2654 - classification_loss: 0.2461 336/500 [===================>..........] - ETA: 41s - loss: 1.5126 - regression_loss: 1.2663 - classification_loss: 0.2463 337/500 [===================>..........] - ETA: 40s - loss: 1.5134 - regression_loss: 1.2670 - classification_loss: 0.2464 338/500 [===================>..........] - ETA: 40s - loss: 1.5115 - regression_loss: 1.2656 - classification_loss: 0.2459 339/500 [===================>..........] - ETA: 40s - loss: 1.5107 - regression_loss: 1.2645 - classification_loss: 0.2462 340/500 [===================>..........] - ETA: 40s - loss: 1.5108 - regression_loss: 1.2648 - classification_loss: 0.2461 341/500 [===================>..........] - ETA: 39s - loss: 1.5091 - regression_loss: 1.2634 - classification_loss: 0.2457 342/500 [===================>..........] - ETA: 39s - loss: 1.5090 - regression_loss: 1.2632 - classification_loss: 0.2458 343/500 [===================>..........] - ETA: 39s - loss: 1.5092 - regression_loss: 1.2636 - classification_loss: 0.2456 344/500 [===================>..........] - ETA: 39s - loss: 1.5083 - regression_loss: 1.2628 - classification_loss: 0.2455 345/500 [===================>..........] 
- ETA: 38s - loss: 1.5069 - regression_loss: 1.2617 - classification_loss: 0.2452 346/500 [===================>..........] - ETA: 38s - loss: 1.5080 - regression_loss: 1.2625 - classification_loss: 0.2455 347/500 [===================>..........] - ETA: 38s - loss: 1.5082 - regression_loss: 1.2624 - classification_loss: 0.2458 348/500 [===================>..........] - ETA: 38s - loss: 1.5111 - regression_loss: 1.2644 - classification_loss: 0.2466 349/500 [===================>..........] - ETA: 37s - loss: 1.5126 - regression_loss: 1.2655 - classification_loss: 0.2471 350/500 [====================>.........] - ETA: 37s - loss: 1.5146 - regression_loss: 1.2667 - classification_loss: 0.2480 351/500 [====================>.........] - ETA: 37s - loss: 1.5136 - regression_loss: 1.2660 - classification_loss: 0.2476 352/500 [====================>.........] - ETA: 37s - loss: 1.5132 - regression_loss: 1.2658 - classification_loss: 0.2474 353/500 [====================>.........] - ETA: 36s - loss: 1.5131 - regression_loss: 1.2658 - classification_loss: 0.2473 354/500 [====================>.........] - ETA: 36s - loss: 1.5145 - regression_loss: 1.2669 - classification_loss: 0.2476 355/500 [====================>.........] - ETA: 36s - loss: 1.5151 - regression_loss: 1.2673 - classification_loss: 0.2478 356/500 [====================>.........] - ETA: 36s - loss: 1.5134 - regression_loss: 1.2659 - classification_loss: 0.2474 357/500 [====================>.........] - ETA: 35s - loss: 1.5112 - regression_loss: 1.2641 - classification_loss: 0.2471 358/500 [====================>.........] - ETA: 35s - loss: 1.5125 - regression_loss: 1.2649 - classification_loss: 0.2476 359/500 [====================>.........] - ETA: 35s - loss: 1.5128 - regression_loss: 1.2653 - classification_loss: 0.2475 360/500 [====================>.........] - ETA: 35s - loss: 1.5135 - regression_loss: 1.2659 - classification_loss: 0.2476 361/500 [====================>.........] 
- ETA: 34s - loss: 1.5118 - regression_loss: 1.2647 - classification_loss: 0.2471 362/500 [====================>.........] - ETA: 34s - loss: 1.5126 - regression_loss: 1.2653 - classification_loss: 0.2473 363/500 [====================>.........] - ETA: 34s - loss: 1.5139 - regression_loss: 1.2665 - classification_loss: 0.2474 364/500 [====================>.........] - ETA: 34s - loss: 1.5135 - regression_loss: 1.2663 - classification_loss: 0.2472 365/500 [====================>.........] - ETA: 33s - loss: 1.5132 - regression_loss: 1.2661 - classification_loss: 0.2471 366/500 [====================>.........] - ETA: 33s - loss: 1.5136 - regression_loss: 1.2664 - classification_loss: 0.2472 367/500 [=====================>........] - ETA: 33s - loss: 1.5138 - regression_loss: 1.2666 - classification_loss: 0.2472 368/500 [=====================>........] - ETA: 33s - loss: 1.5119 - regression_loss: 1.2651 - classification_loss: 0.2468 369/500 [=====================>........] - ETA: 32s - loss: 1.5117 - regression_loss: 1.2650 - classification_loss: 0.2466 370/500 [=====================>........] - ETA: 32s - loss: 1.5127 - regression_loss: 1.2658 - classification_loss: 0.2468 371/500 [=====================>........] - ETA: 32s - loss: 1.5106 - regression_loss: 1.2642 - classification_loss: 0.2464 372/500 [=====================>........] - ETA: 32s - loss: 1.5103 - regression_loss: 1.2640 - classification_loss: 0.2464 373/500 [=====================>........] - ETA: 31s - loss: 1.5104 - regression_loss: 1.2641 - classification_loss: 0.2463 374/500 [=====================>........] - ETA: 31s - loss: 1.5105 - regression_loss: 1.2642 - classification_loss: 0.2463 375/500 [=====================>........] - ETA: 31s - loss: 1.5109 - regression_loss: 1.2646 - classification_loss: 0.2463 376/500 [=====================>........] - ETA: 31s - loss: 1.5105 - regression_loss: 1.2643 - classification_loss: 0.2462 377/500 [=====================>........] 
- ETA: 30s - loss: 1.5123 - regression_loss: 1.2659 - classification_loss: 0.2464 378/500 [=====================>........] - ETA: 30s - loss: 1.5115 - regression_loss: 1.2650 - classification_loss: 0.2465 379/500 [=====================>........] - ETA: 30s - loss: 1.5099 - regression_loss: 1.2635 - classification_loss: 0.2464 380/500 [=====================>........] - ETA: 30s - loss: 1.5101 - regression_loss: 1.2637 - classification_loss: 0.2464 381/500 [=====================>........] - ETA: 29s - loss: 1.5096 - regression_loss: 1.2634 - classification_loss: 0.2462 382/500 [=====================>........] - ETA: 29s - loss: 1.5089 - regression_loss: 1.2627 - classification_loss: 0.2462 383/500 [=====================>........] - ETA: 29s - loss: 1.5077 - regression_loss: 1.2616 - classification_loss: 0.2461 384/500 [======================>.......] - ETA: 29s - loss: 1.5089 - regression_loss: 1.2626 - classification_loss: 0.2463 385/500 [======================>.......] - ETA: 28s - loss: 1.5090 - regression_loss: 1.2625 - classification_loss: 0.2464 386/500 [======================>.......] - ETA: 28s - loss: 1.5068 - regression_loss: 1.2608 - classification_loss: 0.2460 387/500 [======================>.......] - ETA: 28s - loss: 1.5082 - regression_loss: 1.2619 - classification_loss: 0.2463 388/500 [======================>.......] - ETA: 28s - loss: 1.5097 - regression_loss: 1.2632 - classification_loss: 0.2465 389/500 [======================>.......] - ETA: 27s - loss: 1.5109 - regression_loss: 1.2645 - classification_loss: 0.2464 390/500 [======================>.......] - ETA: 27s - loss: 1.5112 - regression_loss: 1.2649 - classification_loss: 0.2464 391/500 [======================>.......] - ETA: 27s - loss: 1.5109 - regression_loss: 1.2647 - classification_loss: 0.2462 392/500 [======================>.......] - ETA: 27s - loss: 1.5113 - regression_loss: 1.2651 - classification_loss: 0.2462 393/500 [======================>.......] 
- ETA: 26s - loss: 1.5092 - regression_loss: 1.2635 - classification_loss: 0.2458 394/500 [======================>.......] - ETA: 26s - loss: 1.5110 - regression_loss: 1.2644 - classification_loss: 0.2466 395/500 [======================>.......] - ETA: 26s - loss: 1.5104 - regression_loss: 1.2639 - classification_loss: 0.2465 396/500 [======================>.......] - ETA: 26s - loss: 1.5109 - regression_loss: 1.2642 - classification_loss: 0.2467 397/500 [======================>.......] - ETA: 25s - loss: 1.5120 - regression_loss: 1.2652 - classification_loss: 0.2468 398/500 [======================>.......] - ETA: 25s - loss: 1.5114 - regression_loss: 1.2646 - classification_loss: 0.2468 399/500 [======================>.......] - ETA: 25s - loss: 1.5130 - regression_loss: 1.2659 - classification_loss: 0.2471 400/500 [=======================>......] - ETA: 25s - loss: 1.5141 - regression_loss: 1.2668 - classification_loss: 0.2472 401/500 [=======================>......] - ETA: 24s - loss: 1.5138 - regression_loss: 1.2668 - classification_loss: 0.2471 402/500 [=======================>......] - ETA: 24s - loss: 1.5114 - regression_loss: 1.2647 - classification_loss: 0.2467 403/500 [=======================>......] - ETA: 24s - loss: 1.5127 - regression_loss: 1.2657 - classification_loss: 0.2470 404/500 [=======================>......] - ETA: 24s - loss: 1.5134 - regression_loss: 1.2663 - classification_loss: 0.2470 405/500 [=======================>......] - ETA: 23s - loss: 1.5138 - regression_loss: 1.2668 - classification_loss: 0.2470 406/500 [=======================>......] - ETA: 23s - loss: 1.5139 - regression_loss: 1.2668 - classification_loss: 0.2471 407/500 [=======================>......] - ETA: 23s - loss: 1.5138 - regression_loss: 1.2665 - classification_loss: 0.2473 408/500 [=======================>......] - ETA: 23s - loss: 1.5129 - regression_loss: 1.2656 - classification_loss: 0.2472 409/500 [=======================>......] 
- ETA: 22s - loss: 1.5128 - regression_loss: 1.2655 - classification_loss: 0.2473 410/500 [=======================>......] - ETA: 22s - loss: 1.5113 - regression_loss: 1.2643 - classification_loss: 0.2470 411/500 [=======================>......] - ETA: 22s - loss: 1.5094 - regression_loss: 1.2626 - classification_loss: 0.2468 412/500 [=======================>......] - ETA: 22s - loss: 1.5101 - regression_loss: 1.2632 - classification_loss: 0.2469 413/500 [=======================>......] - ETA: 21s - loss: 1.5110 - regression_loss: 1.2636 - classification_loss: 0.2474 414/500 [=======================>......] - ETA: 21s - loss: 1.5110 - regression_loss: 1.2636 - classification_loss: 0.2474 415/500 [=======================>......] - ETA: 21s - loss: 1.5117 - regression_loss: 1.2641 - classification_loss: 0.2476 416/500 [=======================>......] - ETA: 21s - loss: 1.5107 - regression_loss: 1.2633 - classification_loss: 0.2474 417/500 [========================>.....] - ETA: 20s - loss: 1.5090 - regression_loss: 1.2620 - classification_loss: 0.2470 418/500 [========================>.....] - ETA: 20s - loss: 1.5094 - regression_loss: 1.2624 - classification_loss: 0.2470 419/500 [========================>.....] - ETA: 20s - loss: 1.5081 - regression_loss: 1.2613 - classification_loss: 0.2468 420/500 [========================>.....] - ETA: 20s - loss: 1.5081 - regression_loss: 1.2612 - classification_loss: 0.2470 421/500 [========================>.....] - ETA: 19s - loss: 1.5076 - regression_loss: 1.2608 - classification_loss: 0.2468 422/500 [========================>.....] - ETA: 19s - loss: 1.5075 - regression_loss: 1.2608 - classification_loss: 0.2467 423/500 [========================>.....] - ETA: 19s - loss: 1.5077 - regression_loss: 1.2609 - classification_loss: 0.2468 424/500 [========================>.....] - ETA: 19s - loss: 1.5074 - regression_loss: 1.2607 - classification_loss: 0.2467 425/500 [========================>.....] 
- ETA: 18s - loss: 1.5069 - regression_loss: 1.2603 - classification_loss: 0.2467 426/500 [========================>.....] - ETA: 18s - loss: 1.5070 - regression_loss: 1.2603 - classification_loss: 0.2467 427/500 [========================>.....] - ETA: 18s - loss: 1.5049 - regression_loss: 1.2587 - classification_loss: 0.2462 428/500 [========================>.....] - ETA: 18s - loss: 1.5032 - regression_loss: 1.2574 - classification_loss: 0.2459 429/500 [========================>.....] - ETA: 17s - loss: 1.5016 - regression_loss: 1.2561 - classification_loss: 0.2454 430/500 [========================>.....] - ETA: 17s - loss: 1.5019 - regression_loss: 1.2564 - classification_loss: 0.2455 431/500 [========================>.....] - ETA: 17s - loss: 1.5008 - regression_loss: 1.2554 - classification_loss: 0.2454 432/500 [========================>.....] - ETA: 17s - loss: 1.5008 - regression_loss: 1.2554 - classification_loss: 0.2454 433/500 [========================>.....] - ETA: 16s - loss: 1.5011 - regression_loss: 1.2557 - classification_loss: 0.2454 434/500 [=========================>....] - ETA: 16s - loss: 1.5030 - regression_loss: 1.2572 - classification_loss: 0.2458 435/500 [=========================>....] - ETA: 16s - loss: 1.5019 - regression_loss: 1.2565 - classification_loss: 0.2454 436/500 [=========================>....] - ETA: 16s - loss: 1.5032 - regression_loss: 1.2574 - classification_loss: 0.2458 437/500 [=========================>....] - ETA: 15s - loss: 1.5011 - regression_loss: 1.2557 - classification_loss: 0.2454 438/500 [=========================>....] - ETA: 15s - loss: 1.5003 - regression_loss: 1.2551 - classification_loss: 0.2452 439/500 [=========================>....] - ETA: 15s - loss: 1.5006 - regression_loss: 1.2554 - classification_loss: 0.2452 440/500 [=========================>....] - ETA: 15s - loss: 1.5007 - regression_loss: 1.2555 - classification_loss: 0.2452 441/500 [=========================>....] 
- ETA: 14s - loss: 1.4998 - regression_loss: 1.2548 - classification_loss: 0.2450 442/500 [=========================>....] - ETA: 14s - loss: 1.4982 - regression_loss: 1.2536 - classification_loss: 0.2447 443/500 [=========================>....] - ETA: 14s - loss: 1.4976 - regression_loss: 1.2529 - classification_loss: 0.2447 444/500 [=========================>....] - ETA: 14s - loss: 1.4984 - regression_loss: 1.2536 - classification_loss: 0.2448 445/500 [=========================>....] - ETA: 13s - loss: 1.4979 - regression_loss: 1.2532 - classification_loss: 0.2447 446/500 [=========================>....] - ETA: 13s - loss: 1.4960 - regression_loss: 1.2514 - classification_loss: 0.2447 447/500 [=========================>....] - ETA: 13s - loss: 1.4963 - regression_loss: 1.2517 - classification_loss: 0.2446 448/500 [=========================>....] - ETA: 13s - loss: 1.4950 - regression_loss: 1.2506 - classification_loss: 0.2444 449/500 [=========================>....] - ETA: 12s - loss: 1.4958 - regression_loss: 1.2513 - classification_loss: 0.2444 450/500 [==========================>...] - ETA: 12s - loss: 1.4944 - regression_loss: 1.2502 - classification_loss: 0.2442 451/500 [==========================>...] - ETA: 12s - loss: 1.4945 - regression_loss: 1.2503 - classification_loss: 0.2442 452/500 [==========================>...] - ETA: 12s - loss: 1.4952 - regression_loss: 1.2510 - classification_loss: 0.2443 453/500 [==========================>...] - ETA: 11s - loss: 1.4956 - regression_loss: 1.2513 - classification_loss: 0.2443 454/500 [==========================>...] - ETA: 11s - loss: 1.4959 - regression_loss: 1.2516 - classification_loss: 0.2444 455/500 [==========================>...] - ETA: 11s - loss: 1.4952 - regression_loss: 1.2510 - classification_loss: 0.2442 456/500 [==========================>...] - ETA: 11s - loss: 1.4945 - regression_loss: 1.2504 - classification_loss: 0.2442 457/500 [==========================>...] 
- ETA: 10s - loss: 1.4946 - regression_loss: 1.2506 - classification_loss: 0.2441 458/500 [==========================>...] - ETA: 10s - loss: 1.4954 - regression_loss: 1.2512 - classification_loss: 0.2442 459/500 [==========================>...] - ETA: 10s - loss: 1.4948 - regression_loss: 1.2509 - classification_loss: 0.2439 460/500 [==========================>...] - ETA: 10s - loss: 1.4949 - regression_loss: 1.2507 - classification_loss: 0.2442 461/500 [==========================>...] - ETA: 9s - loss: 1.4956 - regression_loss: 1.2512 - classification_loss: 0.2444  462/500 [==========================>...] - ETA: 9s - loss: 1.4955 - regression_loss: 1.2512 - classification_loss: 0.2444 463/500 [==========================>...] - ETA: 9s - loss: 1.4959 - regression_loss: 1.2514 - classification_loss: 0.2445 464/500 [==========================>...] - ETA: 9s - loss: 1.4977 - regression_loss: 1.2531 - classification_loss: 0.2446 465/500 [==========================>...] - ETA: 8s - loss: 1.4977 - regression_loss: 1.2532 - classification_loss: 0.2445 466/500 [==========================>...] - ETA: 8s - loss: 1.4995 - regression_loss: 1.2547 - classification_loss: 0.2448 467/500 [===========================>..] - ETA: 8s - loss: 1.5001 - regression_loss: 1.2555 - classification_loss: 0.2445 468/500 [===========================>..] - ETA: 8s - loss: 1.4990 - regression_loss: 1.2546 - classification_loss: 0.2443 469/500 [===========================>..] - ETA: 7s - loss: 1.4995 - regression_loss: 1.2552 - classification_loss: 0.2443 470/500 [===========================>..] - ETA: 7s - loss: 1.4987 - regression_loss: 1.2546 - classification_loss: 0.2441 471/500 [===========================>..] - ETA: 7s - loss: 1.4994 - regression_loss: 1.2549 - classification_loss: 0.2445 472/500 [===========================>..] - ETA: 7s - loss: 1.5001 - regression_loss: 1.2555 - classification_loss: 0.2445 473/500 [===========================>..] 
- ETA: 6s - loss: 1.4985 - regression_loss: 1.2542 - classification_loss: 0.2442 474/500 [===========================>..] - ETA: 6s - loss: 1.5002 - regression_loss: 1.2555 - classification_loss: 0.2447 475/500 [===========================>..] - ETA: 6s - loss: 1.5001 - regression_loss: 1.2553 - classification_loss: 0.2448 476/500 [===========================>..] - ETA: 6s - loss: 1.5001 - regression_loss: 1.2554 - classification_loss: 0.2447 477/500 [===========================>..] - ETA: 5s - loss: 1.5014 - regression_loss: 1.2563 - classification_loss: 0.2451 478/500 [===========================>..] - ETA: 5s - loss: 1.5016 - regression_loss: 1.2564 - classification_loss: 0.2452 479/500 [===========================>..] - ETA: 5s - loss: 1.5014 - regression_loss: 1.2563 - classification_loss: 0.2452 480/500 [===========================>..] - ETA: 5s - loss: 1.5006 - regression_loss: 1.2557 - classification_loss: 0.2450 481/500 [===========================>..] - ETA: 4s - loss: 1.5014 - regression_loss: 1.2562 - classification_loss: 0.2451 482/500 [===========================>..] - ETA: 4s - loss: 1.5016 - regression_loss: 1.2564 - classification_loss: 0.2452 483/500 [===========================>..] - ETA: 4s - loss: 1.5014 - regression_loss: 1.2562 - classification_loss: 0.2452 484/500 [============================>.] - ETA: 4s - loss: 1.5016 - regression_loss: 1.2563 - classification_loss: 0.2453 485/500 [============================>.] - ETA: 3s - loss: 1.5005 - regression_loss: 1.2554 - classification_loss: 0.2451 486/500 [============================>.] - ETA: 3s - loss: 1.5003 - regression_loss: 1.2553 - classification_loss: 0.2450 487/500 [============================>.] - ETA: 3s - loss: 1.4997 - regression_loss: 1.2549 - classification_loss: 0.2448 488/500 [============================>.] - ETA: 3s - loss: 1.4982 - regression_loss: 1.2538 - classification_loss: 0.2445 489/500 [============================>.] 
- ETA: 2s - loss: 1.4980 - regression_loss: 1.2535 - classification_loss: 0.2446 490/500 [============================>.] - ETA: 2s - loss: 1.4986 - regression_loss: 1.2541 - classification_loss: 0.2445 491/500 [============================>.] - ETA: 2s - loss: 1.4997 - regression_loss: 1.2550 - classification_loss: 0.2448 492/500 [============================>.] - ETA: 2s - loss: 1.4995 - regression_loss: 1.2548 - classification_loss: 0.2447 493/500 [============================>.] - ETA: 1s - loss: 1.5008 - regression_loss: 1.2558 - classification_loss: 0.2450 494/500 [============================>.] - ETA: 1s - loss: 1.5014 - regression_loss: 1.2561 - classification_loss: 0.2453 495/500 [============================>.] - ETA: 1s - loss: 1.5011 - regression_loss: 1.2560 - classification_loss: 0.2451 496/500 [============================>.] - ETA: 1s - loss: 1.5018 - regression_loss: 1.2567 - classification_loss: 0.2451 497/500 [============================>.] - ETA: 0s - loss: 1.5009 - regression_loss: 1.2560 - classification_loss: 0.2449 498/500 [============================>.] - ETA: 0s - loss: 1.5014 - regression_loss: 1.2565 - classification_loss: 0.2449 499/500 [============================>.] - ETA: 0s - loss: 1.5004 - regression_loss: 1.2557 - classification_loss: 0.2447 500/500 [==============================] - 125s 251ms/step - loss: 1.5003 - regression_loss: 1.2557 - classification_loss: 0.2446 1172 instances of class plum with average precision: 0.6503 mAP: 0.6503 Epoch 00098: saving model to ./training/snapshots/resnet50_pascal_98.h5 Epoch 99/150 1/500 [..............................] - ETA: 1:56 - loss: 2.2470 - regression_loss: 1.8628 - classification_loss: 0.3842 2/500 [..............................] - ETA: 2:01 - loss: 1.8632 - regression_loss: 1.5613 - classification_loss: 0.3019 3/500 [..............................] - ETA: 2:01 - loss: 1.4129 - regression_loss: 1.1838 - classification_loss: 0.2291 4/500 [..............................] 
- ETA: 2:02 - loss: 1.5339 - regression_loss: 1.2870 - classification_loss: 0.2469
  5/500 [..............................] - ETA: 2:01 - loss: 1.5666 - regression_loss: 1.2901 - classification_loss: 0.2765
 50/500 [==>...........................] - ETA: 1:51 - loss: 1.5602 - regression_loss: 1.2976 - classification_loss: 0.2626
100/500 [=====>........................] - ETA: 1:39 - loss: 1.4958 - regression_loss: 1.2495 - classification_loss: 0.2463
150/500 [========>.....................] - ETA: 1:27 - loss: 1.4985 - regression_loss: 1.2525 - classification_loss: 0.2460
200/500 [===========>..................] - ETA: 1:14 - loss: 1.5000 - regression_loss: 1.2549 - classification_loss: 0.2450
250/500 [==============>...............] - ETA: 1:02 - loss: 1.5033 - regression_loss: 1.2570 - classification_loss: 0.2462
300/500 [=================>............] - ETA: 49s - loss: 1.5146 - regression_loss: 1.2670 - classification_loss: 0.2476
339/500 [===================>..........] - ETA: 40s - loss: 1.4985 - regression_loss: 1.2561 - classification_loss: 0.2424
340/500 [===================>..........]
- ETA: 39s - loss: 1.4986 - regression_loss: 1.2563 - classification_loss: 0.2424 341/500 [===================>..........] - ETA: 39s - loss: 1.5018 - regression_loss: 1.2575 - classification_loss: 0.2443 342/500 [===================>..........] - ETA: 39s - loss: 1.5038 - regression_loss: 1.2580 - classification_loss: 0.2457 343/500 [===================>..........] - ETA: 39s - loss: 1.5037 - regression_loss: 1.2581 - classification_loss: 0.2456 344/500 [===================>..........] - ETA: 38s - loss: 1.5036 - regression_loss: 1.2576 - classification_loss: 0.2460 345/500 [===================>..........] - ETA: 38s - loss: 1.5059 - regression_loss: 1.2592 - classification_loss: 0.2467 346/500 [===================>..........] - ETA: 38s - loss: 1.5066 - regression_loss: 1.2598 - classification_loss: 0.2468 347/500 [===================>..........] - ETA: 38s - loss: 1.5073 - regression_loss: 1.2603 - classification_loss: 0.2470 348/500 [===================>..........] - ETA: 37s - loss: 1.5073 - regression_loss: 1.2603 - classification_loss: 0.2470 349/500 [===================>..........] - ETA: 37s - loss: 1.5082 - regression_loss: 1.2608 - classification_loss: 0.2474 350/500 [====================>.........] - ETA: 37s - loss: 1.5057 - regression_loss: 1.2587 - classification_loss: 0.2470 351/500 [====================>.........] - ETA: 37s - loss: 1.5062 - regression_loss: 1.2589 - classification_loss: 0.2473 352/500 [====================>.........] - ETA: 37s - loss: 1.5075 - regression_loss: 1.2601 - classification_loss: 0.2475 353/500 [====================>.........] - ETA: 36s - loss: 1.5072 - regression_loss: 1.2599 - classification_loss: 0.2473 354/500 [====================>.........] - ETA: 36s - loss: 1.5086 - regression_loss: 1.2611 - classification_loss: 0.2476 355/500 [====================>.........] - ETA: 36s - loss: 1.5104 - regression_loss: 1.2624 - classification_loss: 0.2480 356/500 [====================>.........] 
- ETA: 36s - loss: 1.5083 - regression_loss: 1.2605 - classification_loss: 0.2477 357/500 [====================>.........] - ETA: 35s - loss: 1.5088 - regression_loss: 1.2608 - classification_loss: 0.2480 358/500 [====================>.........] - ETA: 35s - loss: 1.5087 - regression_loss: 1.2606 - classification_loss: 0.2481 359/500 [====================>.........] - ETA: 35s - loss: 1.5106 - regression_loss: 1.2621 - classification_loss: 0.2485 360/500 [====================>.........] - ETA: 35s - loss: 1.5114 - regression_loss: 1.2624 - classification_loss: 0.2490 361/500 [====================>.........] - ETA: 34s - loss: 1.5134 - regression_loss: 1.2637 - classification_loss: 0.2496 362/500 [====================>.........] - ETA: 34s - loss: 1.5139 - regression_loss: 1.2641 - classification_loss: 0.2498 363/500 [====================>.........] - ETA: 34s - loss: 1.5147 - regression_loss: 1.2644 - classification_loss: 0.2503 364/500 [====================>.........] - ETA: 34s - loss: 1.5133 - regression_loss: 1.2634 - classification_loss: 0.2499 365/500 [====================>.........] - ETA: 33s - loss: 1.5116 - regression_loss: 1.2621 - classification_loss: 0.2496 366/500 [====================>.........] - ETA: 33s - loss: 1.5089 - regression_loss: 1.2598 - classification_loss: 0.2491 367/500 [=====================>........] - ETA: 33s - loss: 1.5076 - regression_loss: 1.2588 - classification_loss: 0.2488 368/500 [=====================>........] - ETA: 33s - loss: 1.5059 - regression_loss: 1.2575 - classification_loss: 0.2483 369/500 [=====================>........] - ETA: 32s - loss: 1.5061 - regression_loss: 1.2578 - classification_loss: 0.2483 370/500 [=====================>........] - ETA: 32s - loss: 1.5059 - regression_loss: 1.2577 - classification_loss: 0.2482 371/500 [=====================>........] - ETA: 32s - loss: 1.5065 - regression_loss: 1.2580 - classification_loss: 0.2484 372/500 [=====================>........] 
- ETA: 32s - loss: 1.5065 - regression_loss: 1.2580 - classification_loss: 0.2485 373/500 [=====================>........] - ETA: 31s - loss: 1.5062 - regression_loss: 1.2580 - classification_loss: 0.2482 374/500 [=====================>........] - ETA: 31s - loss: 1.5057 - regression_loss: 1.2575 - classification_loss: 0.2482 375/500 [=====================>........] - ETA: 31s - loss: 1.5064 - regression_loss: 1.2579 - classification_loss: 0.2485 376/500 [=====================>........] - ETA: 31s - loss: 1.5045 - regression_loss: 1.2565 - classification_loss: 0.2480 377/500 [=====================>........] - ETA: 30s - loss: 1.5031 - regression_loss: 1.2555 - classification_loss: 0.2476 378/500 [=====================>........] - ETA: 30s - loss: 1.5026 - regression_loss: 1.2551 - classification_loss: 0.2475 379/500 [=====================>........] - ETA: 30s - loss: 1.5019 - regression_loss: 1.2545 - classification_loss: 0.2474 380/500 [=====================>........] - ETA: 30s - loss: 1.5026 - regression_loss: 1.2550 - classification_loss: 0.2476 381/500 [=====================>........] - ETA: 29s - loss: 1.5026 - regression_loss: 1.2550 - classification_loss: 0.2476 382/500 [=====================>........] - ETA: 29s - loss: 1.5032 - regression_loss: 1.2555 - classification_loss: 0.2478 383/500 [=====================>........] - ETA: 29s - loss: 1.5034 - regression_loss: 1.2557 - classification_loss: 0.2477 384/500 [======================>.......] - ETA: 29s - loss: 1.5041 - regression_loss: 1.2557 - classification_loss: 0.2484 385/500 [======================>.......] - ETA: 28s - loss: 1.5042 - regression_loss: 1.2559 - classification_loss: 0.2483 386/500 [======================>.......] - ETA: 28s - loss: 1.5046 - regression_loss: 1.2563 - classification_loss: 0.2483 387/500 [======================>.......] - ETA: 28s - loss: 1.5049 - regression_loss: 1.2564 - classification_loss: 0.2485 388/500 [======================>.......] 
- ETA: 28s - loss: 1.5064 - regression_loss: 1.2577 - classification_loss: 0.2487 389/500 [======================>.......] - ETA: 27s - loss: 1.5048 - regression_loss: 1.2565 - classification_loss: 0.2483 390/500 [======================>.......] - ETA: 27s - loss: 1.5035 - regression_loss: 1.2552 - classification_loss: 0.2483 391/500 [======================>.......] - ETA: 27s - loss: 1.5031 - regression_loss: 1.2549 - classification_loss: 0.2482 392/500 [======================>.......] - ETA: 27s - loss: 1.5030 - regression_loss: 1.2550 - classification_loss: 0.2480 393/500 [======================>.......] - ETA: 26s - loss: 1.5026 - regression_loss: 1.2547 - classification_loss: 0.2479 394/500 [======================>.......] - ETA: 26s - loss: 1.5050 - regression_loss: 1.2568 - classification_loss: 0.2481 395/500 [======================>.......] - ETA: 26s - loss: 1.5053 - regression_loss: 1.2571 - classification_loss: 0.2482 396/500 [======================>.......] - ETA: 26s - loss: 1.5052 - regression_loss: 1.2571 - classification_loss: 0.2481 397/500 [======================>.......] - ETA: 25s - loss: 1.5059 - regression_loss: 1.2576 - classification_loss: 0.2483 398/500 [======================>.......] - ETA: 25s - loss: 1.5059 - regression_loss: 1.2576 - classification_loss: 0.2483 399/500 [======================>.......] - ETA: 25s - loss: 1.5053 - regression_loss: 1.2572 - classification_loss: 0.2482 400/500 [=======================>......] - ETA: 25s - loss: 1.5057 - regression_loss: 1.2576 - classification_loss: 0.2482 401/500 [=======================>......] - ETA: 24s - loss: 1.5072 - regression_loss: 1.2587 - classification_loss: 0.2484 402/500 [=======================>......] - ETA: 24s - loss: 1.5081 - regression_loss: 1.2595 - classification_loss: 0.2486 403/500 [=======================>......] - ETA: 24s - loss: 1.5086 - regression_loss: 1.2600 - classification_loss: 0.2486 404/500 [=======================>......] 
- ETA: 24s - loss: 1.5123 - regression_loss: 1.2627 - classification_loss: 0.2496 405/500 [=======================>......] - ETA: 23s - loss: 1.5124 - regression_loss: 1.2629 - classification_loss: 0.2495 406/500 [=======================>......] - ETA: 23s - loss: 1.5137 - regression_loss: 1.2640 - classification_loss: 0.2497 407/500 [=======================>......] - ETA: 23s - loss: 1.5145 - regression_loss: 1.2647 - classification_loss: 0.2498 408/500 [=======================>......] - ETA: 23s - loss: 1.5131 - regression_loss: 1.2634 - classification_loss: 0.2497 409/500 [=======================>......] - ETA: 22s - loss: 1.5140 - regression_loss: 1.2641 - classification_loss: 0.2499 410/500 [=======================>......] - ETA: 22s - loss: 1.5160 - regression_loss: 1.2654 - classification_loss: 0.2507 411/500 [=======================>......] - ETA: 22s - loss: 1.5179 - regression_loss: 1.2669 - classification_loss: 0.2510 412/500 [=======================>......] - ETA: 22s - loss: 1.5179 - regression_loss: 1.2670 - classification_loss: 0.2509 413/500 [=======================>......] - ETA: 21s - loss: 1.5194 - regression_loss: 1.2681 - classification_loss: 0.2513 414/500 [=======================>......] - ETA: 21s - loss: 1.5192 - regression_loss: 1.2680 - classification_loss: 0.2511 415/500 [=======================>......] - ETA: 21s - loss: 1.5190 - regression_loss: 1.2679 - classification_loss: 0.2510 416/500 [=======================>......] - ETA: 21s - loss: 1.5200 - regression_loss: 1.2688 - classification_loss: 0.2511 417/500 [========================>.....] - ETA: 20s - loss: 1.5257 - regression_loss: 1.2680 - classification_loss: 0.2577 418/500 [========================>.....] - ETA: 20s - loss: 1.5259 - regression_loss: 1.2685 - classification_loss: 0.2574 419/500 [========================>.....] - ETA: 20s - loss: 1.5250 - regression_loss: 1.2679 - classification_loss: 0.2571 420/500 [========================>.....] 
- ETA: 20s - loss: 1.5246 - regression_loss: 1.2677 - classification_loss: 0.2569 421/500 [========================>.....] - ETA: 19s - loss: 1.5243 - regression_loss: 1.2674 - classification_loss: 0.2569 422/500 [========================>.....] - ETA: 19s - loss: 1.5247 - regression_loss: 1.2678 - classification_loss: 0.2569 423/500 [========================>.....] - ETA: 19s - loss: 1.5258 - regression_loss: 1.2685 - classification_loss: 0.2573 424/500 [========================>.....] - ETA: 19s - loss: 1.5238 - regression_loss: 1.2670 - classification_loss: 0.2569 425/500 [========================>.....] - ETA: 18s - loss: 1.5228 - regression_loss: 1.2659 - classification_loss: 0.2569 426/500 [========================>.....] - ETA: 18s - loss: 1.5225 - regression_loss: 1.2657 - classification_loss: 0.2568 427/500 [========================>.....] - ETA: 18s - loss: 1.5231 - regression_loss: 1.2661 - classification_loss: 0.2570 428/500 [========================>.....] - ETA: 18s - loss: 1.5239 - regression_loss: 1.2669 - classification_loss: 0.2570 429/500 [========================>.....] - ETA: 17s - loss: 1.5249 - regression_loss: 1.2677 - classification_loss: 0.2571 430/500 [========================>.....] - ETA: 17s - loss: 1.5245 - regression_loss: 1.2673 - classification_loss: 0.2572 431/500 [========================>.....] - ETA: 17s - loss: 1.5260 - regression_loss: 1.2684 - classification_loss: 0.2576 432/500 [========================>.....] - ETA: 17s - loss: 1.5264 - regression_loss: 1.2689 - classification_loss: 0.2575 433/500 [========================>.....] - ETA: 16s - loss: 1.5248 - regression_loss: 1.2678 - classification_loss: 0.2570 434/500 [=========================>....] - ETA: 16s - loss: 1.5231 - regression_loss: 1.2665 - classification_loss: 0.2566 435/500 [=========================>....] - ETA: 16s - loss: 1.5227 - regression_loss: 1.2661 - classification_loss: 0.2565 436/500 [=========================>....] 
- ETA: 16s - loss: 1.5220 - regression_loss: 1.2656 - classification_loss: 0.2565 437/500 [=========================>....] - ETA: 15s - loss: 1.5223 - regression_loss: 1.2658 - classification_loss: 0.2564 438/500 [=========================>....] - ETA: 15s - loss: 1.5225 - regression_loss: 1.2661 - classification_loss: 0.2564 439/500 [=========================>....] - ETA: 15s - loss: 1.5227 - regression_loss: 1.2664 - classification_loss: 0.2563 440/500 [=========================>....] - ETA: 15s - loss: 1.5239 - regression_loss: 1.2673 - classification_loss: 0.2565 441/500 [=========================>....] - ETA: 14s - loss: 1.5227 - regression_loss: 1.2664 - classification_loss: 0.2563 442/500 [=========================>....] - ETA: 14s - loss: 1.5229 - regression_loss: 1.2664 - classification_loss: 0.2565 443/500 [=========================>....] - ETA: 14s - loss: 1.5236 - regression_loss: 1.2670 - classification_loss: 0.2566 444/500 [=========================>....] - ETA: 14s - loss: 1.5228 - regression_loss: 1.2665 - classification_loss: 0.2563 445/500 [=========================>....] - ETA: 13s - loss: 1.5233 - regression_loss: 1.2671 - classification_loss: 0.2562 446/500 [=========================>....] - ETA: 13s - loss: 1.5238 - regression_loss: 1.2672 - classification_loss: 0.2566 447/500 [=========================>....] - ETA: 13s - loss: 1.5237 - regression_loss: 1.2672 - classification_loss: 0.2565 448/500 [=========================>....] - ETA: 13s - loss: 1.5235 - regression_loss: 1.2671 - classification_loss: 0.2564 449/500 [=========================>....] - ETA: 12s - loss: 1.5252 - regression_loss: 1.2685 - classification_loss: 0.2567 450/500 [==========================>...] - ETA: 12s - loss: 1.5237 - regression_loss: 1.2673 - classification_loss: 0.2563 451/500 [==========================>...] - ETA: 12s - loss: 1.5251 - regression_loss: 1.2683 - classification_loss: 0.2568 452/500 [==========================>...] 
- ETA: 12s - loss: 1.5258 - regression_loss: 1.2690 - classification_loss: 0.2567 453/500 [==========================>...] - ETA: 11s - loss: 1.5271 - regression_loss: 1.2700 - classification_loss: 0.2571 454/500 [==========================>...] - ETA: 11s - loss: 1.5270 - regression_loss: 1.2699 - classification_loss: 0.2570 455/500 [==========================>...] - ETA: 11s - loss: 1.5271 - regression_loss: 1.2702 - classification_loss: 0.2569 456/500 [==========================>...] - ETA: 11s - loss: 1.5268 - regression_loss: 1.2699 - classification_loss: 0.2569 457/500 [==========================>...] - ETA: 10s - loss: 1.5250 - regression_loss: 1.2684 - classification_loss: 0.2565 458/500 [==========================>...] - ETA: 10s - loss: 1.5252 - regression_loss: 1.2688 - classification_loss: 0.2564 459/500 [==========================>...] - ETA: 10s - loss: 1.5248 - regression_loss: 1.2684 - classification_loss: 0.2564 460/500 [==========================>...] - ETA: 9s - loss: 1.5241 - regression_loss: 1.2678 - classification_loss: 0.2563  461/500 [==========================>...] - ETA: 9s - loss: 1.5252 - regression_loss: 1.2687 - classification_loss: 0.2565 462/500 [==========================>...] - ETA: 9s - loss: 1.5239 - regression_loss: 1.2677 - classification_loss: 0.2562 463/500 [==========================>...] - ETA: 9s - loss: 1.5238 - regression_loss: 1.2676 - classification_loss: 0.2562 464/500 [==========================>...] - ETA: 8s - loss: 1.5223 - regression_loss: 1.2665 - classification_loss: 0.2559 465/500 [==========================>...] - ETA: 8s - loss: 1.5232 - regression_loss: 1.2673 - classification_loss: 0.2559 466/500 [==========================>...] - ETA: 8s - loss: 1.5226 - regression_loss: 1.2668 - classification_loss: 0.2558 467/500 [===========================>..] - ETA: 8s - loss: 1.5208 - regression_loss: 1.2651 - classification_loss: 0.2557 468/500 [===========================>..] 
- ETA: 7s - loss: 1.5194 - regression_loss: 1.2639 - classification_loss: 0.2554 469/500 [===========================>..] - ETA: 7s - loss: 1.5190 - regression_loss: 1.2637 - classification_loss: 0.2553 470/500 [===========================>..] - ETA: 7s - loss: 1.5195 - regression_loss: 1.2640 - classification_loss: 0.2555 471/500 [===========================>..] - ETA: 7s - loss: 1.5181 - regression_loss: 1.2630 - classification_loss: 0.2552 472/500 [===========================>..] - ETA: 6s - loss: 1.5183 - regression_loss: 1.2632 - classification_loss: 0.2552 473/500 [===========================>..] - ETA: 6s - loss: 1.5173 - regression_loss: 1.2623 - classification_loss: 0.2550 474/500 [===========================>..] - ETA: 6s - loss: 1.5168 - regression_loss: 1.2620 - classification_loss: 0.2548 475/500 [===========================>..] - ETA: 6s - loss: 1.5159 - regression_loss: 1.2614 - classification_loss: 0.2545 476/500 [===========================>..] - ETA: 5s - loss: 1.5159 - regression_loss: 1.2615 - classification_loss: 0.2545 477/500 [===========================>..] - ETA: 5s - loss: 1.5157 - regression_loss: 1.2613 - classification_loss: 0.2544 478/500 [===========================>..] - ETA: 5s - loss: 1.5147 - regression_loss: 1.2607 - classification_loss: 0.2540 479/500 [===========================>..] - ETA: 5s - loss: 1.5141 - regression_loss: 1.2602 - classification_loss: 0.2539 480/500 [===========================>..] - ETA: 5s - loss: 1.5140 - regression_loss: 1.2601 - classification_loss: 0.2538 481/500 [===========================>..] - ETA: 4s - loss: 1.5135 - regression_loss: 1.2597 - classification_loss: 0.2538 482/500 [===========================>..] - ETA: 4s - loss: 1.5131 - regression_loss: 1.2595 - classification_loss: 0.2537 483/500 [===========================>..] - ETA: 4s - loss: 1.5138 - regression_loss: 1.2599 - classification_loss: 0.2539 484/500 [============================>.] 
- ETA: 4s - loss: 1.5153 - regression_loss: 1.2613 - classification_loss: 0.2540 485/500 [============================>.] - ETA: 3s - loss: 1.5139 - regression_loss: 1.2602 - classification_loss: 0.2537 486/500 [============================>.] - ETA: 3s - loss: 1.5149 - regression_loss: 1.2609 - classification_loss: 0.2540 487/500 [============================>.] - ETA: 3s - loss: 1.5159 - regression_loss: 1.2618 - classification_loss: 0.2541 488/500 [============================>.] - ETA: 3s - loss: 1.5162 - regression_loss: 1.2619 - classification_loss: 0.2542 489/500 [============================>.] - ETA: 2s - loss: 1.5167 - regression_loss: 1.2624 - classification_loss: 0.2543 490/500 [============================>.] - ETA: 2s - loss: 1.5162 - regression_loss: 1.2621 - classification_loss: 0.2541 491/500 [============================>.] - ETA: 2s - loss: 1.5168 - regression_loss: 1.2626 - classification_loss: 0.2543 492/500 [============================>.] - ETA: 2s - loss: 1.5154 - regression_loss: 1.2613 - classification_loss: 0.2541 493/500 [============================>.] - ETA: 1s - loss: 1.5156 - regression_loss: 1.2615 - classification_loss: 0.2541 494/500 [============================>.] - ETA: 1s - loss: 1.5158 - regression_loss: 1.2617 - classification_loss: 0.2541 495/500 [============================>.] - ETA: 1s - loss: 1.5155 - regression_loss: 1.2616 - classification_loss: 0.2539 496/500 [============================>.] - ETA: 1s - loss: 1.5162 - regression_loss: 1.2622 - classification_loss: 0.2540 497/500 [============================>.] - ETA: 0s - loss: 1.5162 - regression_loss: 1.2622 - classification_loss: 0.2540 498/500 [============================>.] - ETA: 0s - loss: 1.5171 - regression_loss: 1.2628 - classification_loss: 0.2543 499/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 1.5183 - regression_loss: 1.2636 - classification_loss: 0.2546
1172 instances of class plum with average precision: 0.6607
mAP: 0.6607
Epoch 00099: saving model to ./training/snapshots/resnet50_pascal_99.h5
Epoch 100/150
- ETA: 1:37 - loss: 1.5299 - regression_loss: 1.2714 - classification_loss: 0.2585 111/500 [=====>........................] - ETA: 1:37 - loss: 1.5325 - regression_loss: 1.2739 - classification_loss: 0.2586 112/500 [=====>........................] - ETA: 1:36 - loss: 1.5338 - regression_loss: 1.2754 - classification_loss: 0.2585 113/500 [=====>........................] - ETA: 1:36 - loss: 1.5316 - regression_loss: 1.2736 - classification_loss: 0.2580 114/500 [=====>........................] - ETA: 1:36 - loss: 1.5313 - regression_loss: 1.2736 - classification_loss: 0.2577 115/500 [=====>........................] - ETA: 1:36 - loss: 1.5296 - regression_loss: 1.2725 - classification_loss: 0.2570 116/500 [=====>........................] - ETA: 1:35 - loss: 1.5304 - regression_loss: 1.2730 - classification_loss: 0.2574 117/500 [======>.......................] - ETA: 1:35 - loss: 1.5304 - regression_loss: 1.2735 - classification_loss: 0.2569 118/500 [======>.......................] - ETA: 1:35 - loss: 1.5343 - regression_loss: 1.2767 - classification_loss: 0.2576 119/500 [======>.......................] - ETA: 1:34 - loss: 1.5332 - regression_loss: 1.2760 - classification_loss: 0.2572 120/500 [======>.......................] - ETA: 1:34 - loss: 1.5339 - regression_loss: 1.2757 - classification_loss: 0.2582 121/500 [======>.......................] - ETA: 1:34 - loss: 1.5352 - regression_loss: 1.2762 - classification_loss: 0.2589 122/500 [======>.......................] - ETA: 1:34 - loss: 1.5352 - regression_loss: 1.2768 - classification_loss: 0.2584 123/500 [======>.......................] - ETA: 1:34 - loss: 1.5322 - regression_loss: 1.2747 - classification_loss: 0.2575 124/500 [======>.......................] - ETA: 1:33 - loss: 1.5274 - regression_loss: 1.2705 - classification_loss: 0.2569 125/500 [======>.......................] - ETA: 1:33 - loss: 1.5259 - regression_loss: 1.2692 - classification_loss: 0.2567 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.5302 - regression_loss: 1.2733 - classification_loss: 0.2569 127/500 [======>.......................] - ETA: 1:33 - loss: 1.5238 - regression_loss: 1.2678 - classification_loss: 0.2560 128/500 [======>.......................] - ETA: 1:32 - loss: 1.5238 - regression_loss: 1.2673 - classification_loss: 0.2565 129/500 [======>.......................] - ETA: 1:32 - loss: 1.5273 - regression_loss: 1.2699 - classification_loss: 0.2573 130/500 [======>.......................] - ETA: 1:32 - loss: 1.5355 - regression_loss: 1.2767 - classification_loss: 0.2588 131/500 [======>.......................] - ETA: 1:32 - loss: 1.5396 - regression_loss: 1.2795 - classification_loss: 0.2602 132/500 [======>.......................] - ETA: 1:31 - loss: 1.5377 - regression_loss: 1.2781 - classification_loss: 0.2596 133/500 [======>.......................] - ETA: 1:31 - loss: 1.5413 - regression_loss: 1.2805 - classification_loss: 0.2607 134/500 [=======>......................] - ETA: 1:30 - loss: 1.5440 - regression_loss: 1.2827 - classification_loss: 0.2612 135/500 [=======>......................] - ETA: 1:30 - loss: 1.5431 - regression_loss: 1.2825 - classification_loss: 0.2606 136/500 [=======>......................] - ETA: 1:30 - loss: 1.5384 - regression_loss: 1.2780 - classification_loss: 0.2604 137/500 [=======>......................] - ETA: 1:30 - loss: 1.5404 - regression_loss: 1.2798 - classification_loss: 0.2605 138/500 [=======>......................] - ETA: 1:29 - loss: 1.5408 - regression_loss: 1.2797 - classification_loss: 0.2611 139/500 [=======>......................] - ETA: 1:29 - loss: 1.5370 - regression_loss: 1.2765 - classification_loss: 0.2605 140/500 [=======>......................] - ETA: 1:29 - loss: 1.5422 - regression_loss: 1.2811 - classification_loss: 0.2612 141/500 [=======>......................] - ETA: 1:29 - loss: 1.5397 - regression_loss: 1.2792 - classification_loss: 0.2605 142/500 [=======>......................] 
- ETA: 1:28 - loss: 1.5440 - regression_loss: 1.2825 - classification_loss: 0.2614 143/500 [=======>......................] - ETA: 1:28 - loss: 1.5460 - regression_loss: 1.2842 - classification_loss: 0.2618 144/500 [=======>......................] - ETA: 1:28 - loss: 1.5461 - regression_loss: 1.2837 - classification_loss: 0.2624 145/500 [=======>......................] - ETA: 1:28 - loss: 1.5485 - regression_loss: 1.2859 - classification_loss: 0.2626 146/500 [=======>......................] - ETA: 1:27 - loss: 1.5463 - regression_loss: 1.2851 - classification_loss: 0.2612 147/500 [=======>......................] - ETA: 1:27 - loss: 1.5420 - regression_loss: 1.2817 - classification_loss: 0.2602 148/500 [=======>......................] - ETA: 1:27 - loss: 1.5375 - regression_loss: 1.2783 - classification_loss: 0.2592 149/500 [=======>......................] - ETA: 1:27 - loss: 1.5392 - regression_loss: 1.2798 - classification_loss: 0.2594 150/500 [========>.....................] - ETA: 1:26 - loss: 1.5319 - regression_loss: 1.2738 - classification_loss: 0.2581 151/500 [========>.....................] - ETA: 1:26 - loss: 1.5261 - regression_loss: 1.2689 - classification_loss: 0.2572 152/500 [========>.....................] - ETA: 1:26 - loss: 1.5256 - regression_loss: 1.2684 - classification_loss: 0.2571 153/500 [========>.....................] - ETA: 1:26 - loss: 1.5272 - regression_loss: 1.2689 - classification_loss: 0.2582 154/500 [========>.....................] - ETA: 1:26 - loss: 1.5286 - regression_loss: 1.2702 - classification_loss: 0.2584 155/500 [========>.....................] - ETA: 1:25 - loss: 1.5290 - regression_loss: 1.2704 - classification_loss: 0.2586 156/500 [========>.....................] - ETA: 1:25 - loss: 1.5267 - regression_loss: 1.2688 - classification_loss: 0.2578 157/500 [========>.....................] - ETA: 1:25 - loss: 1.5250 - regression_loss: 1.2677 - classification_loss: 0.2573 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.5283 - regression_loss: 1.2703 - classification_loss: 0.2580 159/500 [========>.....................] - ETA: 1:24 - loss: 1.5214 - regression_loss: 1.2647 - classification_loss: 0.2567 160/500 [========>.....................] - ETA: 1:24 - loss: 1.5169 - regression_loss: 1.2609 - classification_loss: 0.2560 161/500 [========>.....................] - ETA: 1:24 - loss: 1.5202 - regression_loss: 1.2640 - classification_loss: 0.2562 162/500 [========>.....................] - ETA: 1:24 - loss: 1.5194 - regression_loss: 1.2634 - classification_loss: 0.2560 163/500 [========>.....................] - ETA: 1:23 - loss: 1.5196 - regression_loss: 1.2636 - classification_loss: 0.2560 164/500 [========>.....................] - ETA: 1:23 - loss: 1.5166 - regression_loss: 1.2613 - classification_loss: 0.2553 165/500 [========>.....................] - ETA: 1:23 - loss: 1.5158 - regression_loss: 1.2604 - classification_loss: 0.2553 166/500 [========>.....................] - ETA: 1:23 - loss: 1.5192 - regression_loss: 1.2632 - classification_loss: 0.2561 167/500 [=========>....................] - ETA: 1:22 - loss: 1.5209 - regression_loss: 1.2646 - classification_loss: 0.2563 168/500 [=========>....................] - ETA: 1:22 - loss: 1.5194 - regression_loss: 1.2632 - classification_loss: 0.2562 169/500 [=========>....................] - ETA: 1:22 - loss: 1.5184 - regression_loss: 1.2625 - classification_loss: 0.2560 170/500 [=========>....................] - ETA: 1:22 - loss: 1.5170 - regression_loss: 1.2612 - classification_loss: 0.2558 171/500 [=========>....................] - ETA: 1:21 - loss: 1.5155 - regression_loss: 1.2603 - classification_loss: 0.2551 172/500 [=========>....................] - ETA: 1:21 - loss: 1.5152 - regression_loss: 1.2602 - classification_loss: 0.2550 173/500 [=========>....................] - ETA: 1:21 - loss: 1.5152 - regression_loss: 1.2604 - classification_loss: 0.2548 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.5166 - regression_loss: 1.2615 - classification_loss: 0.2551 175/500 [=========>....................] - ETA: 1:20 - loss: 1.5148 - regression_loss: 1.2601 - classification_loss: 0.2546 176/500 [=========>....................] - ETA: 1:20 - loss: 1.5163 - regression_loss: 1.2608 - classification_loss: 0.2555 177/500 [=========>....................] - ETA: 1:20 - loss: 1.5160 - regression_loss: 1.2607 - classification_loss: 0.2553 178/500 [=========>....................] - ETA: 1:20 - loss: 1.5180 - regression_loss: 1.2626 - classification_loss: 0.2554 179/500 [=========>....................] - ETA: 1:19 - loss: 1.5172 - regression_loss: 1.2617 - classification_loss: 0.2554 180/500 [=========>....................] - ETA: 1:19 - loss: 1.5183 - regression_loss: 1.2628 - classification_loss: 0.2555 181/500 [=========>....................] - ETA: 1:19 - loss: 1.5154 - regression_loss: 1.2603 - classification_loss: 0.2550 182/500 [=========>....................] - ETA: 1:19 - loss: 1.5172 - regression_loss: 1.2618 - classification_loss: 0.2553 183/500 [=========>....................] - ETA: 1:18 - loss: 1.5180 - regression_loss: 1.2621 - classification_loss: 0.2559 184/500 [==========>...................] - ETA: 1:18 - loss: 1.5161 - regression_loss: 1.2605 - classification_loss: 0.2557 185/500 [==========>...................] - ETA: 1:18 - loss: 1.5187 - regression_loss: 1.2618 - classification_loss: 0.2569 186/500 [==========>...................] - ETA: 1:18 - loss: 1.5179 - regression_loss: 1.2612 - classification_loss: 0.2567 187/500 [==========>...................] - ETA: 1:17 - loss: 1.5175 - regression_loss: 1.2611 - classification_loss: 0.2564 188/500 [==========>...................] - ETA: 1:17 - loss: 1.5177 - regression_loss: 1.2612 - classification_loss: 0.2565 189/500 [==========>...................] - ETA: 1:17 - loss: 1.5135 - regression_loss: 1.2580 - classification_loss: 0.2555 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.5147 - regression_loss: 1.2592 - classification_loss: 0.2555 191/500 [==========>...................] - ETA: 1:17 - loss: 1.5141 - regression_loss: 1.2589 - classification_loss: 0.2552 192/500 [==========>...................] - ETA: 1:16 - loss: 1.5128 - regression_loss: 1.2581 - classification_loss: 0.2547 193/500 [==========>...................] - ETA: 1:16 - loss: 1.5148 - regression_loss: 1.2599 - classification_loss: 0.2550 194/500 [==========>...................] - ETA: 1:16 - loss: 1.5173 - regression_loss: 1.2619 - classification_loss: 0.2554 195/500 [==========>...................] - ETA: 1:16 - loss: 1.5160 - regression_loss: 1.2610 - classification_loss: 0.2550 196/500 [==========>...................] - ETA: 1:15 - loss: 1.5182 - regression_loss: 1.2626 - classification_loss: 0.2556 197/500 [==========>...................] - ETA: 1:15 - loss: 1.5202 - regression_loss: 1.2648 - classification_loss: 0.2554 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5267 - regression_loss: 1.2685 - classification_loss: 0.2581 199/500 [==========>...................] - ETA: 1:15 - loss: 1.5284 - regression_loss: 1.2696 - classification_loss: 0.2588 200/500 [===========>..................] - ETA: 1:14 - loss: 1.5283 - regression_loss: 1.2698 - classification_loss: 0.2585 201/500 [===========>..................] - ETA: 1:14 - loss: 1.5251 - regression_loss: 1.2666 - classification_loss: 0.2585 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5256 - regression_loss: 1.2669 - classification_loss: 0.2586 203/500 [===========>..................] - ETA: 1:14 - loss: 1.5246 - regression_loss: 1.2662 - classification_loss: 0.2583 204/500 [===========>..................] - ETA: 1:13 - loss: 1.5221 - regression_loss: 1.2642 - classification_loss: 0.2579 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5233 - regression_loss: 1.2653 - classification_loss: 0.2580 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.5237 - regression_loss: 1.2657 - classification_loss: 0.2581 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5214 - regression_loss: 1.2638 - classification_loss: 0.2576 208/500 [===========>..................] - ETA: 1:12 - loss: 1.5199 - regression_loss: 1.2628 - classification_loss: 0.2571 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5188 - regression_loss: 1.2623 - classification_loss: 0.2565 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5160 - regression_loss: 1.2600 - classification_loss: 0.2560 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5177 - regression_loss: 1.2614 - classification_loss: 0.2562 212/500 [===========>..................] - ETA: 1:11 - loss: 1.5158 - regression_loss: 1.2600 - classification_loss: 0.2558 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5167 - regression_loss: 1.2603 - classification_loss: 0.2564 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5159 - regression_loss: 1.2598 - classification_loss: 0.2561 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5151 - regression_loss: 1.2592 - classification_loss: 0.2558 216/500 [===========>..................] - ETA: 1:10 - loss: 1.5159 - regression_loss: 1.2599 - classification_loss: 0.2560 217/500 [============>.................] - ETA: 1:10 - loss: 1.5147 - regression_loss: 1.2587 - classification_loss: 0.2560 218/500 [============>.................] - ETA: 1:10 - loss: 1.5151 - regression_loss: 1.2588 - classification_loss: 0.2563 219/500 [============>.................] - ETA: 1:10 - loss: 1.5174 - regression_loss: 1.2607 - classification_loss: 0.2567 220/500 [============>.................] - ETA: 1:09 - loss: 1.5162 - regression_loss: 1.2599 - classification_loss: 0.2563 221/500 [============>.................] - ETA: 1:09 - loss: 1.5169 - regression_loss: 1.2603 - classification_loss: 0.2566 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.5163 - regression_loss: 1.2601 - classification_loss: 0.2563 223/500 [============>.................] - ETA: 1:09 - loss: 1.5176 - regression_loss: 1.2610 - classification_loss: 0.2566 224/500 [============>.................] - ETA: 1:08 - loss: 1.5169 - regression_loss: 1.2604 - classification_loss: 0.2564 225/500 [============>.................] - ETA: 1:08 - loss: 1.5197 - regression_loss: 1.2628 - classification_loss: 0.2569 226/500 [============>.................] - ETA: 1:08 - loss: 1.5220 - regression_loss: 1.2646 - classification_loss: 0.2574 227/500 [============>.................] - ETA: 1:08 - loss: 1.5212 - regression_loss: 1.2640 - classification_loss: 0.2573 228/500 [============>.................] - ETA: 1:07 - loss: 1.5222 - regression_loss: 1.2649 - classification_loss: 0.2573 229/500 [============>.................] - ETA: 1:07 - loss: 1.5220 - regression_loss: 1.2648 - classification_loss: 0.2571 230/500 [============>.................] - ETA: 1:07 - loss: 1.5221 - regression_loss: 1.2654 - classification_loss: 0.2567 231/500 [============>.................] - ETA: 1:07 - loss: 1.5243 - regression_loss: 1.2676 - classification_loss: 0.2567 232/500 [============>.................] - ETA: 1:06 - loss: 1.5251 - regression_loss: 1.2683 - classification_loss: 0.2567 233/500 [============>.................] - ETA: 1:06 - loss: 1.5262 - regression_loss: 1.2695 - classification_loss: 0.2567 234/500 [=============>................] - ETA: 1:06 - loss: 1.5278 - regression_loss: 1.2706 - classification_loss: 0.2572 235/500 [=============>................] - ETA: 1:06 - loss: 1.5300 - regression_loss: 1.2724 - classification_loss: 0.2576 236/500 [=============>................] - ETA: 1:05 - loss: 1.5299 - regression_loss: 1.2726 - classification_loss: 0.2573 237/500 [=============>................] - ETA: 1:05 - loss: 1.5306 - regression_loss: 1.2730 - classification_loss: 0.2576 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.5301 - regression_loss: 1.2725 - classification_loss: 0.2577 239/500 [=============>................] - ETA: 1:05 - loss: 1.5293 - regression_loss: 1.2713 - classification_loss: 0.2580 240/500 [=============>................] - ETA: 1:04 - loss: 1.5289 - regression_loss: 1.2712 - classification_loss: 0.2577 241/500 [=============>................] - ETA: 1:04 - loss: 1.5280 - regression_loss: 1.2705 - classification_loss: 0.2575 242/500 [=============>................] - ETA: 1:04 - loss: 1.5294 - regression_loss: 1.2716 - classification_loss: 0.2578 243/500 [=============>................] - ETA: 1:04 - loss: 1.5283 - regression_loss: 1.2710 - classification_loss: 0.2574 244/500 [=============>................] - ETA: 1:03 - loss: 1.5252 - regression_loss: 1.2687 - classification_loss: 0.2565 245/500 [=============>................] - ETA: 1:03 - loss: 1.5268 - regression_loss: 1.2706 - classification_loss: 0.2562 246/500 [=============>................] - ETA: 1:03 - loss: 1.5243 - regression_loss: 1.2687 - classification_loss: 0.2556 247/500 [=============>................] - ETA: 1:03 - loss: 1.5222 - regression_loss: 1.2671 - classification_loss: 0.2551 248/500 [=============>................] - ETA: 1:02 - loss: 1.5210 - regression_loss: 1.2664 - classification_loss: 0.2546 249/500 [=============>................] - ETA: 1:02 - loss: 1.5173 - regression_loss: 1.2635 - classification_loss: 0.2539 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5191 - regression_loss: 1.2647 - classification_loss: 0.2544 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5210 - regression_loss: 1.2661 - classification_loss: 0.2549 252/500 [==============>...............] - ETA: 1:01 - loss: 1.5203 - regression_loss: 1.2657 - classification_loss: 0.2546 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5215 - regression_loss: 1.2669 - classification_loss: 0.2547 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.5231 - regression_loss: 1.2684 - classification_loss: 0.2547 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5205 - regression_loss: 1.2662 - classification_loss: 0.2542 256/500 [==============>...............] - ETA: 1:00 - loss: 1.5221 - regression_loss: 1.2675 - classification_loss: 0.2546 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5208 - regression_loss: 1.2663 - classification_loss: 0.2545 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5220 - regression_loss: 1.2674 - classification_loss: 0.2546 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5227 - regression_loss: 1.2682 - classification_loss: 0.2546 260/500 [==============>...............] - ETA: 59s - loss: 1.5223 - regression_loss: 1.2680 - classification_loss: 0.2543  261/500 [==============>...............] - ETA: 59s - loss: 1.5196 - regression_loss: 1.2659 - classification_loss: 0.2537 262/500 [==============>...............] - ETA: 59s - loss: 1.5194 - regression_loss: 1.2657 - classification_loss: 0.2537 263/500 [==============>...............] - ETA: 59s - loss: 1.5206 - regression_loss: 1.2669 - classification_loss: 0.2537 264/500 [==============>...............] - ETA: 59s - loss: 1.5209 - regression_loss: 1.2672 - classification_loss: 0.2537 265/500 [==============>...............] - ETA: 58s - loss: 1.5204 - regression_loss: 1.2669 - classification_loss: 0.2535 266/500 [==============>...............] - ETA: 58s - loss: 1.5210 - regression_loss: 1.2674 - classification_loss: 0.2537 267/500 [===============>..............] - ETA: 58s - loss: 1.5205 - regression_loss: 1.2671 - classification_loss: 0.2534 268/500 [===============>..............] - ETA: 58s - loss: 1.5211 - regression_loss: 1.2680 - classification_loss: 0.2530 269/500 [===============>..............] - ETA: 57s - loss: 1.5185 - regression_loss: 1.2661 - classification_loss: 0.2525 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5178 - regression_loss: 1.2654 - classification_loss: 0.2523 271/500 [===============>..............] - ETA: 57s - loss: 1.5172 - regression_loss: 1.2650 - classification_loss: 0.2522 272/500 [===============>..............] - ETA: 57s - loss: 1.5182 - regression_loss: 1.2660 - classification_loss: 0.2522 273/500 [===============>..............] - ETA: 56s - loss: 1.5183 - regression_loss: 1.2663 - classification_loss: 0.2521 274/500 [===============>..............] - ETA: 56s - loss: 1.5212 - regression_loss: 1.2678 - classification_loss: 0.2533 275/500 [===============>..............] - ETA: 56s - loss: 1.5202 - regression_loss: 1.2671 - classification_loss: 0.2531 276/500 [===============>..............] - ETA: 56s - loss: 1.5191 - regression_loss: 1.2662 - classification_loss: 0.2529 277/500 [===============>..............] - ETA: 55s - loss: 1.5179 - regression_loss: 1.2653 - classification_loss: 0.2527 278/500 [===============>..............] - ETA: 55s - loss: 1.5150 - regression_loss: 1.2630 - classification_loss: 0.2520 279/500 [===============>..............] - ETA: 55s - loss: 1.5163 - regression_loss: 1.2641 - classification_loss: 0.2522 280/500 [===============>..............] - ETA: 54s - loss: 1.5124 - regression_loss: 1.2608 - classification_loss: 0.2515 281/500 [===============>..............] - ETA: 54s - loss: 1.5124 - regression_loss: 1.2609 - classification_loss: 0.2515 282/500 [===============>..............] - ETA: 54s - loss: 1.5140 - regression_loss: 1.2622 - classification_loss: 0.2517 283/500 [===============>..............] - ETA: 54s - loss: 1.5148 - regression_loss: 1.2631 - classification_loss: 0.2518 284/500 [================>.............] - ETA: 54s - loss: 1.5149 - regression_loss: 1.2631 - classification_loss: 0.2519 285/500 [================>.............] - ETA: 53s - loss: 1.5118 - regression_loss: 1.2606 - classification_loss: 0.2513 286/500 [================>.............] 
- ETA: 53s - loss: 1.5118 - regression_loss: 1.2606 - classification_loss: 0.2512 287/500 [================>.............] - ETA: 53s - loss: 1.5092 - regression_loss: 1.2584 - classification_loss: 0.2508 288/500 [================>.............] - ETA: 53s - loss: 1.5103 - regression_loss: 1.2594 - classification_loss: 0.2509 289/500 [================>.............] - ETA: 52s - loss: 1.5106 - regression_loss: 1.2599 - classification_loss: 0.2507 290/500 [================>.............] - ETA: 52s - loss: 1.5110 - regression_loss: 1.2601 - classification_loss: 0.2509 291/500 [================>.............] - ETA: 52s - loss: 1.5120 - regression_loss: 1.2609 - classification_loss: 0.2511 292/500 [================>.............] - ETA: 52s - loss: 1.5131 - regression_loss: 1.2617 - classification_loss: 0.2514 293/500 [================>.............] - ETA: 51s - loss: 1.5126 - regression_loss: 1.2613 - classification_loss: 0.2513 294/500 [================>.............] - ETA: 51s - loss: 1.5141 - regression_loss: 1.2628 - classification_loss: 0.2513 295/500 [================>.............] - ETA: 51s - loss: 1.5153 - regression_loss: 1.2636 - classification_loss: 0.2517 296/500 [================>.............] - ETA: 51s - loss: 1.5158 - regression_loss: 1.2639 - classification_loss: 0.2519 297/500 [================>.............] - ETA: 50s - loss: 1.5143 - regression_loss: 1.2628 - classification_loss: 0.2515 298/500 [================>.............] - ETA: 50s - loss: 1.5134 - regression_loss: 1.2622 - classification_loss: 0.2512 299/500 [================>.............] - ETA: 50s - loss: 1.5137 - regression_loss: 1.2625 - classification_loss: 0.2512 300/500 [=================>............] - ETA: 50s - loss: 1.5137 - regression_loss: 1.2625 - classification_loss: 0.2512 301/500 [=================>............] - ETA: 49s - loss: 1.5139 - regression_loss: 1.2626 - classification_loss: 0.2513 302/500 [=================>............] 
- ETA: 49s - loss: 1.5159 - regression_loss: 1.2645 - classification_loss: 0.2514 303/500 [=================>............] - ETA: 49s - loss: 1.5179 - regression_loss: 1.2664 - classification_loss: 0.2515 304/500 [=================>............] - ETA: 49s - loss: 1.5151 - regression_loss: 1.2643 - classification_loss: 0.2509 305/500 [=================>............] - ETA: 48s - loss: 1.5135 - regression_loss: 1.2630 - classification_loss: 0.2505 306/500 [=================>............] - ETA: 48s - loss: 1.5123 - regression_loss: 1.2621 - classification_loss: 0.2502 307/500 [=================>............] - ETA: 48s - loss: 1.5128 - regression_loss: 1.2624 - classification_loss: 0.2504 308/500 [=================>............] - ETA: 48s - loss: 1.5137 - regression_loss: 1.2631 - classification_loss: 0.2506 309/500 [=================>............] - ETA: 47s - loss: 1.5140 - regression_loss: 1.2634 - classification_loss: 0.2506 310/500 [=================>............] - ETA: 47s - loss: 1.5143 - regression_loss: 1.2637 - classification_loss: 0.2507 311/500 [=================>............] - ETA: 47s - loss: 1.5157 - regression_loss: 1.2651 - classification_loss: 0.2506 312/500 [=================>............] - ETA: 46s - loss: 1.5140 - regression_loss: 1.2637 - classification_loss: 0.2503 313/500 [=================>............] - ETA: 46s - loss: 1.5132 - regression_loss: 1.2631 - classification_loss: 0.2501 314/500 [=================>............] - ETA: 46s - loss: 1.5115 - regression_loss: 1.2617 - classification_loss: 0.2498 315/500 [=================>............] - ETA: 46s - loss: 1.5101 - regression_loss: 1.2605 - classification_loss: 0.2496 316/500 [=================>............] - ETA: 45s - loss: 1.5101 - regression_loss: 1.2606 - classification_loss: 0.2496 317/500 [==================>...........] - ETA: 45s - loss: 1.5108 - regression_loss: 1.2614 - classification_loss: 0.2494 318/500 [==================>...........] 
[... per-batch progress updates for epoch 100, batches 319–499, elided ...]
500/500 [==============================] - 125s 250ms/step - loss: 1.5082 - regression_loss: 1.2589 - classification_loss: 0.2492
1172 instances of class plum with average precision: 0.6651
mAP: 0.6651
Epoch 00100: saving model to ./training/snapshots/resnet50_pascal_100.h5
Epoch 101/150
[... per-batch progress updates for epoch 101, batches 1–153, elided ...]
- ETA: 1:27 - loss: 1.4602 - regression_loss: 1.2275 - classification_loss: 0.2327 154/500 [========>.....................] - ETA: 1:26 - loss: 1.4607 - regression_loss: 1.2275 - classification_loss: 0.2332 155/500 [========>.....................] - ETA: 1:26 - loss: 1.4612 - regression_loss: 1.2282 - classification_loss: 0.2329 156/500 [========>.....................] - ETA: 1:26 - loss: 1.4654 - regression_loss: 1.2309 - classification_loss: 0.2345 157/500 [========>.....................] - ETA: 1:26 - loss: 1.4650 - regression_loss: 1.2307 - classification_loss: 0.2343 158/500 [========>.....................] - ETA: 1:25 - loss: 1.4752 - regression_loss: 1.2400 - classification_loss: 0.2352 159/500 [========>.....................] - ETA: 1:25 - loss: 1.4754 - regression_loss: 1.2404 - classification_loss: 0.2350 160/500 [========>.....................] - ETA: 1:25 - loss: 1.4729 - regression_loss: 1.2378 - classification_loss: 0.2351 161/500 [========>.....................] - ETA: 1:25 - loss: 1.4758 - regression_loss: 1.2406 - classification_loss: 0.2351 162/500 [========>.....................] - ETA: 1:24 - loss: 1.4762 - regression_loss: 1.2410 - classification_loss: 0.2352 163/500 [========>.....................] - ETA: 1:24 - loss: 1.4773 - regression_loss: 1.2418 - classification_loss: 0.2355 164/500 [========>.....................] - ETA: 1:24 - loss: 1.4800 - regression_loss: 1.2444 - classification_loss: 0.2356 165/500 [========>.....................] - ETA: 1:23 - loss: 1.4739 - regression_loss: 1.2392 - classification_loss: 0.2346 166/500 [========>.....................] - ETA: 1:23 - loss: 1.4704 - regression_loss: 1.2362 - classification_loss: 0.2342 167/500 [=========>....................] - ETA: 1:23 - loss: 1.4663 - regression_loss: 1.2331 - classification_loss: 0.2332 168/500 [=========>....................] - ETA: 1:23 - loss: 1.4617 - regression_loss: 1.2295 - classification_loss: 0.2323 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.4588 - regression_loss: 1.2272 - classification_loss: 0.2317 170/500 [=========>....................] - ETA: 1:22 - loss: 1.4603 - regression_loss: 1.2285 - classification_loss: 0.2318 171/500 [=========>....................] - ETA: 1:22 - loss: 1.4654 - regression_loss: 1.2318 - classification_loss: 0.2336 172/500 [=========>....................] - ETA: 1:22 - loss: 1.4657 - regression_loss: 1.2319 - classification_loss: 0.2338 173/500 [=========>....................] - ETA: 1:21 - loss: 1.4659 - regression_loss: 1.2319 - classification_loss: 0.2340 174/500 [=========>....................] - ETA: 1:21 - loss: 1.4677 - regression_loss: 1.2333 - classification_loss: 0.2344 175/500 [=========>....................] - ETA: 1:21 - loss: 1.4659 - regression_loss: 1.2317 - classification_loss: 0.2342 176/500 [=========>....................] - ETA: 1:21 - loss: 1.4653 - regression_loss: 1.2310 - classification_loss: 0.2343 177/500 [=========>....................] - ETA: 1:21 - loss: 1.4608 - regression_loss: 1.2271 - classification_loss: 0.2337 178/500 [=========>....................] - ETA: 1:20 - loss: 1.4569 - regression_loss: 1.2235 - classification_loss: 0.2334 179/500 [=========>....................] - ETA: 1:20 - loss: 1.4577 - regression_loss: 1.2241 - classification_loss: 0.2336 180/500 [=========>....................] - ETA: 1:20 - loss: 1.4591 - regression_loss: 1.2252 - classification_loss: 0.2339 181/500 [=========>....................] - ETA: 1:19 - loss: 1.4581 - regression_loss: 1.2247 - classification_loss: 0.2334 182/500 [=========>....................] - ETA: 1:19 - loss: 1.4598 - regression_loss: 1.2264 - classification_loss: 0.2333 183/500 [=========>....................] - ETA: 1:19 - loss: 1.4627 - regression_loss: 1.2275 - classification_loss: 0.2351 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4605 - regression_loss: 1.2256 - classification_loss: 0.2349 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.4616 - regression_loss: 1.2266 - classification_loss: 0.2350 186/500 [==========>...................] - ETA: 1:18 - loss: 1.4608 - regression_loss: 1.2260 - classification_loss: 0.2348 187/500 [==========>...................] - ETA: 1:18 - loss: 1.4587 - regression_loss: 1.2230 - classification_loss: 0.2356 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4589 - regression_loss: 1.2234 - classification_loss: 0.2355 189/500 [==========>...................] - ETA: 1:17 - loss: 1.4638 - regression_loss: 1.2278 - classification_loss: 0.2360 190/500 [==========>...................] - ETA: 1:17 - loss: 1.4641 - regression_loss: 1.2281 - classification_loss: 0.2360 191/500 [==========>...................] - ETA: 1:17 - loss: 1.4679 - regression_loss: 1.2315 - classification_loss: 0.2364 192/500 [==========>...................] - ETA: 1:17 - loss: 1.4672 - regression_loss: 1.2310 - classification_loss: 0.2362 193/500 [==========>...................] - ETA: 1:16 - loss: 1.4651 - regression_loss: 1.2291 - classification_loss: 0.2360 194/500 [==========>...................] - ETA: 1:16 - loss: 1.4651 - regression_loss: 1.2290 - classification_loss: 0.2361 195/500 [==========>...................] - ETA: 1:16 - loss: 1.4683 - regression_loss: 1.2314 - classification_loss: 0.2369 196/500 [==========>...................] - ETA: 1:16 - loss: 1.4688 - regression_loss: 1.2323 - classification_loss: 0.2364 197/500 [==========>...................] - ETA: 1:15 - loss: 1.4711 - regression_loss: 1.2344 - classification_loss: 0.2367 198/500 [==========>...................] - ETA: 1:15 - loss: 1.4698 - regression_loss: 1.2333 - classification_loss: 0.2365 199/500 [==========>...................] - ETA: 1:15 - loss: 1.4711 - regression_loss: 1.2345 - classification_loss: 0.2367 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4724 - regression_loss: 1.2355 - classification_loss: 0.2370 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.4705 - regression_loss: 1.2338 - classification_loss: 0.2367 202/500 [===========>..................] - ETA: 1:14 - loss: 1.4702 - regression_loss: 1.2340 - classification_loss: 0.2362 203/500 [===========>..................] - ETA: 1:14 - loss: 1.4724 - regression_loss: 1.2359 - classification_loss: 0.2365 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4710 - regression_loss: 1.2347 - classification_loss: 0.2362 205/500 [===========>..................] - ETA: 1:13 - loss: 1.4712 - regression_loss: 1.2352 - classification_loss: 0.2361 206/500 [===========>..................] - ETA: 1:13 - loss: 1.4712 - regression_loss: 1.2352 - classification_loss: 0.2361 207/500 [===========>..................] - ETA: 1:13 - loss: 1.4717 - regression_loss: 1.2358 - classification_loss: 0.2360 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4715 - regression_loss: 1.2356 - classification_loss: 0.2358 209/500 [===========>..................] - ETA: 1:12 - loss: 1.4736 - regression_loss: 1.2367 - classification_loss: 0.2370 210/500 [===========>..................] - ETA: 1:12 - loss: 1.4730 - regression_loss: 1.2363 - classification_loss: 0.2366 211/500 [===========>..................] - ETA: 1:12 - loss: 1.4712 - regression_loss: 1.2350 - classification_loss: 0.2362 212/500 [===========>..................] - ETA: 1:12 - loss: 1.4702 - regression_loss: 1.2343 - classification_loss: 0.2358 213/500 [===========>..................] - ETA: 1:11 - loss: 1.4691 - regression_loss: 1.2336 - classification_loss: 0.2355 214/500 [===========>..................] - ETA: 1:11 - loss: 1.4678 - regression_loss: 1.2323 - classification_loss: 0.2355 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4693 - regression_loss: 1.2341 - classification_loss: 0.2351 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4733 - regression_loss: 1.2367 - classification_loss: 0.2366 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.4720 - regression_loss: 1.2359 - classification_loss: 0.2362 218/500 [============>.................] - ETA: 1:10 - loss: 1.4683 - regression_loss: 1.2329 - classification_loss: 0.2354 219/500 [============>.................] - ETA: 1:10 - loss: 1.4706 - regression_loss: 1.2346 - classification_loss: 0.2360 220/500 [============>.................] - ETA: 1:10 - loss: 1.4698 - regression_loss: 1.2340 - classification_loss: 0.2359 221/500 [============>.................] - ETA: 1:10 - loss: 1.4712 - regression_loss: 1.2350 - classification_loss: 0.2362 222/500 [============>.................] - ETA: 1:09 - loss: 1.4729 - regression_loss: 1.2367 - classification_loss: 0.2362 223/500 [============>.................] - ETA: 1:09 - loss: 1.4722 - regression_loss: 1.2362 - classification_loss: 0.2360 224/500 [============>.................] - ETA: 1:09 - loss: 1.4738 - regression_loss: 1.2376 - classification_loss: 0.2362 225/500 [============>.................] - ETA: 1:09 - loss: 1.4713 - regression_loss: 1.2353 - classification_loss: 0.2360 226/500 [============>.................] - ETA: 1:08 - loss: 1.4711 - regression_loss: 1.2353 - classification_loss: 0.2358 227/500 [============>.................] - ETA: 1:08 - loss: 1.4675 - regression_loss: 1.2323 - classification_loss: 0.2352 228/500 [============>.................] - ETA: 1:08 - loss: 1.4697 - regression_loss: 1.2343 - classification_loss: 0.2355 229/500 [============>.................] - ETA: 1:08 - loss: 1.4700 - regression_loss: 1.2344 - classification_loss: 0.2356 230/500 [============>.................] - ETA: 1:07 - loss: 1.4692 - regression_loss: 1.2335 - classification_loss: 0.2357 231/500 [============>.................] - ETA: 1:07 - loss: 1.4713 - regression_loss: 1.2351 - classification_loss: 0.2362 232/500 [============>.................] - ETA: 1:07 - loss: 1.4717 - regression_loss: 1.2355 - classification_loss: 0.2363 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.4705 - regression_loss: 1.2347 - classification_loss: 0.2358 234/500 [=============>................] - ETA: 1:06 - loss: 1.4705 - regression_loss: 1.2347 - classification_loss: 0.2358 235/500 [=============>................] - ETA: 1:06 - loss: 1.4710 - regression_loss: 1.2356 - classification_loss: 0.2354 236/500 [=============>................] - ETA: 1:06 - loss: 1.4711 - regression_loss: 1.2357 - classification_loss: 0.2354 237/500 [=============>................] - ETA: 1:06 - loss: 1.4680 - regression_loss: 1.2330 - classification_loss: 0.2350 238/500 [=============>................] - ETA: 1:05 - loss: 1.4673 - regression_loss: 1.2324 - classification_loss: 0.2350 239/500 [=============>................] - ETA: 1:05 - loss: 1.4665 - regression_loss: 1.2317 - classification_loss: 0.2349 240/500 [=============>................] - ETA: 1:05 - loss: 1.4669 - regression_loss: 1.2321 - classification_loss: 0.2348 241/500 [=============>................] - ETA: 1:05 - loss: 1.4682 - regression_loss: 1.2332 - classification_loss: 0.2349 242/500 [=============>................] - ETA: 1:04 - loss: 1.4644 - regression_loss: 1.2302 - classification_loss: 0.2342 243/500 [=============>................] - ETA: 1:04 - loss: 1.4665 - regression_loss: 1.2319 - classification_loss: 0.2346 244/500 [=============>................] - ETA: 1:04 - loss: 1.4664 - regression_loss: 1.2318 - classification_loss: 0.2346 245/500 [=============>................] - ETA: 1:04 - loss: 1.4677 - regression_loss: 1.2330 - classification_loss: 0.2348 246/500 [=============>................] - ETA: 1:03 - loss: 1.4689 - regression_loss: 1.2343 - classification_loss: 0.2346 247/500 [=============>................] - ETA: 1:03 - loss: 1.4686 - regression_loss: 1.2339 - classification_loss: 0.2347 248/500 [=============>................] - ETA: 1:03 - loss: 1.4708 - regression_loss: 1.2356 - classification_loss: 0.2352 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.4724 - regression_loss: 1.2366 - classification_loss: 0.2358 250/500 [==============>...............] - ETA: 1:02 - loss: 1.4687 - regression_loss: 1.2335 - classification_loss: 0.2352 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4683 - regression_loss: 1.2333 - classification_loss: 0.2350 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4696 - regression_loss: 1.2345 - classification_loss: 0.2351 253/500 [==============>...............] - ETA: 1:02 - loss: 1.4677 - regression_loss: 1.2330 - classification_loss: 0.2347 254/500 [==============>...............] - ETA: 1:01 - loss: 1.4680 - regression_loss: 1.2333 - classification_loss: 0.2346 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4648 - regression_loss: 1.2308 - classification_loss: 0.2340 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4645 - regression_loss: 1.2303 - classification_loss: 0.2342 257/500 [==============>...............] - ETA: 1:01 - loss: 1.4659 - regression_loss: 1.2316 - classification_loss: 0.2343 258/500 [==============>...............] - ETA: 1:00 - loss: 1.4669 - regression_loss: 1.2324 - classification_loss: 0.2345 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4669 - regression_loss: 1.2324 - classification_loss: 0.2345 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4672 - regression_loss: 1.2328 - classification_loss: 0.2344 261/500 [==============>...............] - ETA: 1:00 - loss: 1.4683 - regression_loss: 1.2335 - classification_loss: 0.2348 262/500 [==============>...............] - ETA: 59s - loss: 1.4662 - regression_loss: 1.2317 - classification_loss: 0.2345  263/500 [==============>...............] - ETA: 59s - loss: 1.4681 - regression_loss: 1.2333 - classification_loss: 0.2348 264/500 [==============>...............] - ETA: 59s - loss: 1.4689 - regression_loss: 1.2341 - classification_loss: 0.2349 265/500 [==============>...............] 
- ETA: 59s - loss: 1.4673 - regression_loss: 1.2327 - classification_loss: 0.2346 266/500 [==============>...............] - ETA: 58s - loss: 1.4657 - regression_loss: 1.2311 - classification_loss: 0.2346 267/500 [===============>..............] - ETA: 58s - loss: 1.4636 - regression_loss: 1.2293 - classification_loss: 0.2343 268/500 [===============>..............] - ETA: 58s - loss: 1.4630 - regression_loss: 1.2289 - classification_loss: 0.2341 269/500 [===============>..............] - ETA: 58s - loss: 1.4632 - regression_loss: 1.2289 - classification_loss: 0.2343 270/500 [===============>..............] - ETA: 57s - loss: 1.4636 - regression_loss: 1.2292 - classification_loss: 0.2344 271/500 [===============>..............] - ETA: 57s - loss: 1.4650 - regression_loss: 1.2305 - classification_loss: 0.2345 272/500 [===============>..............] - ETA: 57s - loss: 1.4664 - regression_loss: 1.2317 - classification_loss: 0.2347 273/500 [===============>..............] - ETA: 57s - loss: 1.4667 - regression_loss: 1.2320 - classification_loss: 0.2347 274/500 [===============>..............] - ETA: 56s - loss: 1.4637 - regression_loss: 1.2296 - classification_loss: 0.2342 275/500 [===============>..............] - ETA: 56s - loss: 1.4605 - regression_loss: 1.2268 - classification_loss: 0.2336 276/500 [===============>..............] - ETA: 56s - loss: 1.4610 - regression_loss: 1.2273 - classification_loss: 0.2337 277/500 [===============>..............] - ETA: 55s - loss: 1.4623 - regression_loss: 1.2286 - classification_loss: 0.2338 278/500 [===============>..............] - ETA: 55s - loss: 1.4630 - regression_loss: 1.2294 - classification_loss: 0.2337 279/500 [===============>..............] - ETA: 55s - loss: 1.4621 - regression_loss: 1.2285 - classification_loss: 0.2335 280/500 [===============>..............] - ETA: 55s - loss: 1.4604 - regression_loss: 1.2273 - classification_loss: 0.2332 281/500 [===============>..............] 
- ETA: 54s - loss: 1.4616 - regression_loss: 1.2282 - classification_loss: 0.2334 282/500 [===============>..............] - ETA: 54s - loss: 1.4631 - regression_loss: 1.2294 - classification_loss: 0.2337 283/500 [===============>..............] - ETA: 54s - loss: 1.4653 - regression_loss: 1.2310 - classification_loss: 0.2343 284/500 [================>.............] - ETA: 54s - loss: 1.4661 - regression_loss: 1.2318 - classification_loss: 0.2343 285/500 [================>.............] - ETA: 53s - loss: 1.4630 - regression_loss: 1.2293 - classification_loss: 0.2337 286/500 [================>.............] - ETA: 53s - loss: 1.4612 - regression_loss: 1.2280 - classification_loss: 0.2332 287/500 [================>.............] - ETA: 53s - loss: 1.4606 - regression_loss: 1.2273 - classification_loss: 0.2332 288/500 [================>.............] - ETA: 53s - loss: 1.4603 - regression_loss: 1.2269 - classification_loss: 0.2333 289/500 [================>.............] - ETA: 52s - loss: 1.4629 - regression_loss: 1.2291 - classification_loss: 0.2338 290/500 [================>.............] - ETA: 52s - loss: 1.4598 - regression_loss: 1.2265 - classification_loss: 0.2333 291/500 [================>.............] - ETA: 52s - loss: 1.4623 - regression_loss: 1.2282 - classification_loss: 0.2340 292/500 [================>.............] - ETA: 52s - loss: 1.4620 - regression_loss: 1.2281 - classification_loss: 0.2339 293/500 [================>.............] - ETA: 51s - loss: 1.4590 - regression_loss: 1.2257 - classification_loss: 0.2333 294/500 [================>.............] - ETA: 51s - loss: 1.4606 - regression_loss: 1.2271 - classification_loss: 0.2335 295/500 [================>.............] - ETA: 51s - loss: 1.4615 - regression_loss: 1.2276 - classification_loss: 0.2339 296/500 [================>.............] - ETA: 51s - loss: 1.4609 - regression_loss: 1.2273 - classification_loss: 0.2335 297/500 [================>.............] 
- ETA: 50s - loss: 1.4617 - regression_loss: 1.2276 - classification_loss: 0.2341 298/500 [================>.............] - ETA: 50s - loss: 1.4621 - regression_loss: 1.2279 - classification_loss: 0.2342 299/500 [================>.............] - ETA: 50s - loss: 1.4637 - regression_loss: 1.2289 - classification_loss: 0.2348 300/500 [=================>............] - ETA: 50s - loss: 1.4630 - regression_loss: 1.2283 - classification_loss: 0.2346 301/500 [=================>............] - ETA: 49s - loss: 1.4642 - regression_loss: 1.2294 - classification_loss: 0.2348 302/500 [=================>............] - ETA: 49s - loss: 1.4678 - regression_loss: 1.2323 - classification_loss: 0.2355 303/500 [=================>............] - ETA: 49s - loss: 1.4708 - regression_loss: 1.2351 - classification_loss: 0.2358 304/500 [=================>............] - ETA: 49s - loss: 1.4715 - regression_loss: 1.2357 - classification_loss: 0.2357 305/500 [=================>............] - ETA: 48s - loss: 1.4752 - regression_loss: 1.2395 - classification_loss: 0.2357 306/500 [=================>............] - ETA: 48s - loss: 1.4751 - regression_loss: 1.2394 - classification_loss: 0.2358 307/500 [=================>............] - ETA: 48s - loss: 1.4758 - regression_loss: 1.2400 - classification_loss: 0.2358 308/500 [=================>............] - ETA: 48s - loss: 1.4751 - regression_loss: 1.2397 - classification_loss: 0.2355 309/500 [=================>............] - ETA: 47s - loss: 1.4756 - regression_loss: 1.2401 - classification_loss: 0.2355 310/500 [=================>............] - ETA: 47s - loss: 1.4755 - regression_loss: 1.2401 - classification_loss: 0.2354 311/500 [=================>............] - ETA: 47s - loss: 1.4760 - regression_loss: 1.2406 - classification_loss: 0.2354 312/500 [=================>............] - ETA: 47s - loss: 1.4761 - regression_loss: 1.2408 - classification_loss: 0.2353 313/500 [=================>............] 
- ETA: 46s - loss: 1.4768 - regression_loss: 1.2413 - classification_loss: 0.2354 314/500 [=================>............] - ETA: 46s - loss: 1.4776 - regression_loss: 1.2419 - classification_loss: 0.2356 315/500 [=================>............] - ETA: 46s - loss: 1.4784 - regression_loss: 1.2426 - classification_loss: 0.2358 316/500 [=================>............] - ETA: 46s - loss: 1.4783 - regression_loss: 1.2426 - classification_loss: 0.2357 317/500 [==================>...........] - ETA: 45s - loss: 1.4786 - regression_loss: 1.2429 - classification_loss: 0.2357 318/500 [==================>...........] - ETA: 45s - loss: 1.4796 - regression_loss: 1.2435 - classification_loss: 0.2361 319/500 [==================>...........] - ETA: 45s - loss: 1.4799 - regression_loss: 1.2438 - classification_loss: 0.2361 320/500 [==================>...........] - ETA: 45s - loss: 1.4801 - regression_loss: 1.2439 - classification_loss: 0.2362 321/500 [==================>...........] - ETA: 44s - loss: 1.4791 - regression_loss: 1.2429 - classification_loss: 0.2362 322/500 [==================>...........] - ETA: 44s - loss: 1.4773 - regression_loss: 1.2414 - classification_loss: 0.2360 323/500 [==================>...........] - ETA: 44s - loss: 1.4780 - regression_loss: 1.2409 - classification_loss: 0.2371 324/500 [==================>...........] - ETA: 44s - loss: 1.4767 - regression_loss: 1.2397 - classification_loss: 0.2370 325/500 [==================>...........] - ETA: 43s - loss: 1.4780 - regression_loss: 1.2407 - classification_loss: 0.2372 326/500 [==================>...........] - ETA: 43s - loss: 1.4776 - regression_loss: 1.2405 - classification_loss: 0.2371 327/500 [==================>...........] - ETA: 43s - loss: 1.4771 - regression_loss: 1.2402 - classification_loss: 0.2369 328/500 [==================>...........] - ETA: 43s - loss: 1.4745 - regression_loss: 1.2380 - classification_loss: 0.2364 329/500 [==================>...........] 
- ETA: 42s - loss: 1.4738 - regression_loss: 1.2373 - classification_loss: 0.2365 330/500 [==================>...........] - ETA: 42s - loss: 1.4753 - regression_loss: 1.2385 - classification_loss: 0.2368 331/500 [==================>...........] - ETA: 42s - loss: 1.4735 - regression_loss: 1.2371 - classification_loss: 0.2364 332/500 [==================>...........] - ETA: 42s - loss: 1.4729 - regression_loss: 1.2367 - classification_loss: 0.2363 333/500 [==================>...........] - ETA: 41s - loss: 1.4714 - regression_loss: 1.2355 - classification_loss: 0.2359 334/500 [===================>..........] - ETA: 41s - loss: 1.4724 - regression_loss: 1.2363 - classification_loss: 0.2361 335/500 [===================>..........] - ETA: 41s - loss: 1.4736 - regression_loss: 1.2372 - classification_loss: 0.2364 336/500 [===================>..........] - ETA: 41s - loss: 1.4745 - regression_loss: 1.2381 - classification_loss: 0.2364 337/500 [===================>..........] - ETA: 40s - loss: 1.4719 - regression_loss: 1.2359 - classification_loss: 0.2360 338/500 [===================>..........] - ETA: 40s - loss: 1.4699 - regression_loss: 1.2345 - classification_loss: 0.2355 339/500 [===================>..........] - ETA: 40s - loss: 1.4669 - regression_loss: 1.2319 - classification_loss: 0.2349 340/500 [===================>..........] - ETA: 40s - loss: 1.4653 - regression_loss: 1.2307 - classification_loss: 0.2346 341/500 [===================>..........] - ETA: 39s - loss: 1.4657 - regression_loss: 1.2308 - classification_loss: 0.2349 342/500 [===================>..........] - ETA: 39s - loss: 1.4656 - regression_loss: 1.2308 - classification_loss: 0.2347 343/500 [===================>..........] - ETA: 39s - loss: 1.4657 - regression_loss: 1.2310 - classification_loss: 0.2347 344/500 [===================>..........] - ETA: 39s - loss: 1.4634 - regression_loss: 1.2291 - classification_loss: 0.2343 345/500 [===================>..........] 
- ETA: 38s - loss: 1.4626 - regression_loss: 1.2285 - classification_loss: 0.2341 346/500 [===================>..........] - ETA: 38s - loss: 1.4618 - regression_loss: 1.2277 - classification_loss: 0.2340 347/500 [===================>..........] - ETA: 38s - loss: 1.4638 - regression_loss: 1.2294 - classification_loss: 0.2343 348/500 [===================>..........] - ETA: 38s - loss: 1.4636 - regression_loss: 1.2292 - classification_loss: 0.2344 349/500 [===================>..........] - ETA: 37s - loss: 1.4621 - regression_loss: 1.2281 - classification_loss: 0.2340 350/500 [====================>.........] - ETA: 37s - loss: 1.4629 - regression_loss: 1.2288 - classification_loss: 0.2341 351/500 [====================>.........] - ETA: 37s - loss: 1.4644 - regression_loss: 1.2299 - classification_loss: 0.2344 352/500 [====================>.........] - ETA: 37s - loss: 1.4647 - regression_loss: 1.2302 - classification_loss: 0.2345 353/500 [====================>.........] - ETA: 36s - loss: 1.4653 - regression_loss: 1.2309 - classification_loss: 0.2344 354/500 [====================>.........] - ETA: 36s - loss: 1.4669 - regression_loss: 1.2319 - classification_loss: 0.2350 355/500 [====================>.........] - ETA: 36s - loss: 1.4671 - regression_loss: 1.2321 - classification_loss: 0.2351 356/500 [====================>.........] - ETA: 36s - loss: 1.4662 - regression_loss: 1.2314 - classification_loss: 0.2348 357/500 [====================>.........] - ETA: 35s - loss: 1.4668 - regression_loss: 1.2317 - classification_loss: 0.2350 358/500 [====================>.........] - ETA: 35s - loss: 1.4664 - regression_loss: 1.2312 - classification_loss: 0.2352 359/500 [====================>.........] - ETA: 35s - loss: 1.4666 - regression_loss: 1.2315 - classification_loss: 0.2351 360/500 [====================>.........] - ETA: 35s - loss: 1.4684 - regression_loss: 1.2329 - classification_loss: 0.2355 361/500 [====================>.........] 
  [per-step progress updates for steps 362–499 of epoch 101 truncated; running loss held near 1.46–1.47 throughout]
500/500 [==============================] - 125s 251ms/step - loss: 1.4690 - regression_loss: 1.2323 - classification_loss: 0.2367
1172 instances of class plum with average precision: 0.6433
mAP: 0.6433
Epoch 00101: saving model to ./training/snapshots/resnet50_pascal_101.h5
Epoch 102/150
  [per-step progress updates for steps 1–196 of epoch 102 truncated; running loss settled from ~1.88 at step 1 to ~1.51 by step 196, ETA falling from 2:02 to 1:16]
- ETA: 1:16 - loss: 1.5116 - regression_loss: 1.2646 - classification_loss: 0.2471 197/500 [==========>...................] - ETA: 1:15 - loss: 1.5116 - regression_loss: 1.2646 - classification_loss: 0.2470 198/500 [==========>...................] - ETA: 1:15 - loss: 1.5120 - regression_loss: 1.2649 - classification_loss: 0.2472 199/500 [==========>...................] - ETA: 1:15 - loss: 1.5118 - regression_loss: 1.2647 - classification_loss: 0.2470 200/500 [===========>..................] - ETA: 1:15 - loss: 1.5090 - regression_loss: 1.2628 - classification_loss: 0.2462 201/500 [===========>..................] - ETA: 1:14 - loss: 1.5101 - regression_loss: 1.2636 - classification_loss: 0.2465 202/500 [===========>..................] - ETA: 1:14 - loss: 1.5099 - regression_loss: 1.2635 - classification_loss: 0.2464 203/500 [===========>..................] - ETA: 1:14 - loss: 1.5126 - regression_loss: 1.2658 - classification_loss: 0.2468 204/500 [===========>..................] - ETA: 1:14 - loss: 1.5111 - regression_loss: 1.2642 - classification_loss: 0.2468 205/500 [===========>..................] - ETA: 1:13 - loss: 1.5104 - regression_loss: 1.2638 - classification_loss: 0.2466 206/500 [===========>..................] - ETA: 1:13 - loss: 1.5106 - regression_loss: 1.2638 - classification_loss: 0.2467 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5068 - regression_loss: 1.2604 - classification_loss: 0.2464 208/500 [===========>..................] - ETA: 1:13 - loss: 1.5062 - regression_loss: 1.2599 - classification_loss: 0.2464 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5035 - regression_loss: 1.2575 - classification_loss: 0.2460 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5018 - regression_loss: 1.2562 - classification_loss: 0.2456 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5016 - regression_loss: 1.2562 - classification_loss: 0.2454 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.4976 - regression_loss: 1.2526 - classification_loss: 0.2451 213/500 [===========>..................] - ETA: 1:11 - loss: 1.4987 - regression_loss: 1.2534 - classification_loss: 0.2453 214/500 [===========>..................] - ETA: 1:11 - loss: 1.4945 - regression_loss: 1.2499 - classification_loss: 0.2446 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4935 - regression_loss: 1.2483 - classification_loss: 0.2453 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4928 - regression_loss: 1.2477 - classification_loss: 0.2450 217/500 [============>.................] - ETA: 1:10 - loss: 1.4898 - regression_loss: 1.2452 - classification_loss: 0.2445 218/500 [============>.................] - ETA: 1:10 - loss: 1.4918 - regression_loss: 1.2469 - classification_loss: 0.2449 219/500 [============>.................] - ETA: 1:10 - loss: 1.4938 - regression_loss: 1.2487 - classification_loss: 0.2451 220/500 [============>.................] - ETA: 1:10 - loss: 1.4932 - regression_loss: 1.2481 - classification_loss: 0.2450 221/500 [============>.................] - ETA: 1:09 - loss: 1.4941 - regression_loss: 1.2492 - classification_loss: 0.2450 222/500 [============>.................] - ETA: 1:09 - loss: 1.4963 - regression_loss: 1.2508 - classification_loss: 0.2455 223/500 [============>.................] - ETA: 1:09 - loss: 1.4979 - regression_loss: 1.2523 - classification_loss: 0.2457 224/500 [============>.................] - ETA: 1:09 - loss: 1.4990 - regression_loss: 1.2532 - classification_loss: 0.2458 225/500 [============>.................] - ETA: 1:08 - loss: 1.4986 - regression_loss: 1.2530 - classification_loss: 0.2456 226/500 [============>.................] - ETA: 1:08 - loss: 1.5006 - regression_loss: 1.2546 - classification_loss: 0.2461 227/500 [============>.................] - ETA: 1:08 - loss: 1.5035 - regression_loss: 1.2569 - classification_loss: 0.2466 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.5029 - regression_loss: 1.2563 - classification_loss: 0.2466 229/500 [============>.................] - ETA: 1:07 - loss: 1.5022 - regression_loss: 1.2557 - classification_loss: 0.2465 230/500 [============>.................] - ETA: 1:07 - loss: 1.5044 - regression_loss: 1.2580 - classification_loss: 0.2464 231/500 [============>.................] - ETA: 1:07 - loss: 1.5057 - regression_loss: 1.2593 - classification_loss: 0.2465 232/500 [============>.................] - ETA: 1:07 - loss: 1.5107 - regression_loss: 1.2635 - classification_loss: 0.2472 233/500 [============>.................] - ETA: 1:06 - loss: 1.5109 - regression_loss: 1.2628 - classification_loss: 0.2481 234/500 [=============>................] - ETA: 1:06 - loss: 1.5097 - regression_loss: 1.2620 - classification_loss: 0.2477 235/500 [=============>................] - ETA: 1:06 - loss: 1.5106 - regression_loss: 1.2628 - classification_loss: 0.2477 236/500 [=============>................] - ETA: 1:06 - loss: 1.5113 - regression_loss: 1.2633 - classification_loss: 0.2480 237/500 [=============>................] - ETA: 1:05 - loss: 1.5110 - regression_loss: 1.2631 - classification_loss: 0.2479 238/500 [=============>................] - ETA: 1:05 - loss: 1.5141 - regression_loss: 1.2651 - classification_loss: 0.2490 239/500 [=============>................] - ETA: 1:05 - loss: 1.5136 - regression_loss: 1.2645 - classification_loss: 0.2490 240/500 [=============>................] - ETA: 1:05 - loss: 1.5145 - regression_loss: 1.2652 - classification_loss: 0.2493 241/500 [=============>................] - ETA: 1:04 - loss: 1.5129 - regression_loss: 1.2641 - classification_loss: 0.2488 242/500 [=============>................] - ETA: 1:04 - loss: 1.5095 - regression_loss: 1.2612 - classification_loss: 0.2482 243/500 [=============>................] - ETA: 1:04 - loss: 1.5103 - regression_loss: 1.2620 - classification_loss: 0.2483 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.5129 - regression_loss: 1.2640 - classification_loss: 0.2488 245/500 [=============>................] - ETA: 1:03 - loss: 1.5129 - regression_loss: 1.2640 - classification_loss: 0.2489 246/500 [=============>................] - ETA: 1:03 - loss: 1.5121 - regression_loss: 1.2631 - classification_loss: 0.2490 247/500 [=============>................] - ETA: 1:03 - loss: 1.5105 - regression_loss: 1.2620 - classification_loss: 0.2485 248/500 [=============>................] - ETA: 1:03 - loss: 1.5075 - regression_loss: 1.2594 - classification_loss: 0.2481 249/500 [=============>................] - ETA: 1:02 - loss: 1.5101 - regression_loss: 1.2616 - classification_loss: 0.2485 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5091 - regression_loss: 1.2611 - classification_loss: 0.2479 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5087 - regression_loss: 1.2608 - classification_loss: 0.2479 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5096 - regression_loss: 1.2615 - classification_loss: 0.2480 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5102 - regression_loss: 1.2621 - classification_loss: 0.2481 254/500 [==============>...............] - ETA: 1:01 - loss: 1.5087 - regression_loss: 1.2609 - classification_loss: 0.2478 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5082 - regression_loss: 1.2607 - classification_loss: 0.2475 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5062 - regression_loss: 1.2589 - classification_loss: 0.2473 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5055 - regression_loss: 1.2583 - classification_loss: 0.2472 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5053 - regression_loss: 1.2582 - classification_loss: 0.2470 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5028 - regression_loss: 1.2564 - classification_loss: 0.2463 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.5032 - regression_loss: 1.2569 - classification_loss: 0.2463 261/500 [==============>...............] - ETA: 59s - loss: 1.5051 - regression_loss: 1.2579 - classification_loss: 0.2473  262/500 [==============>...............] - ETA: 59s - loss: 1.5044 - regression_loss: 1.2573 - classification_loss: 0.2471 263/500 [==============>...............] - ETA: 59s - loss: 1.5030 - regression_loss: 1.2564 - classification_loss: 0.2466 264/500 [==============>...............] - ETA: 59s - loss: 1.4995 - regression_loss: 1.2532 - classification_loss: 0.2463 265/500 [==============>...............] - ETA: 58s - loss: 1.5041 - regression_loss: 1.2569 - classification_loss: 0.2472 266/500 [==============>...............] - ETA: 58s - loss: 1.5044 - regression_loss: 1.2573 - classification_loss: 0.2472 267/500 [===============>..............] - ETA: 58s - loss: 1.5038 - regression_loss: 1.2568 - classification_loss: 0.2470 268/500 [===============>..............] - ETA: 58s - loss: 1.5022 - regression_loss: 1.2555 - classification_loss: 0.2467 269/500 [===============>..............] - ETA: 57s - loss: 1.5031 - regression_loss: 1.2562 - classification_loss: 0.2469 270/500 [===============>..............] - ETA: 57s - loss: 1.5042 - regression_loss: 1.2571 - classification_loss: 0.2471 271/500 [===============>..............] - ETA: 57s - loss: 1.5058 - regression_loss: 1.2584 - classification_loss: 0.2474 272/500 [===============>..............] - ETA: 57s - loss: 1.5057 - regression_loss: 1.2584 - classification_loss: 0.2473 273/500 [===============>..............] - ETA: 56s - loss: 1.5058 - regression_loss: 1.2585 - classification_loss: 0.2473 274/500 [===============>..............] - ETA: 56s - loss: 1.5072 - regression_loss: 1.2597 - classification_loss: 0.2475 275/500 [===============>..............] - ETA: 56s - loss: 1.5082 - regression_loss: 1.2605 - classification_loss: 0.2478 276/500 [===============>..............] 
- ETA: 56s - loss: 1.5057 - regression_loss: 1.2584 - classification_loss: 0.2473 277/500 [===============>..............] - ETA: 55s - loss: 1.5062 - regression_loss: 1.2588 - classification_loss: 0.2474 278/500 [===============>..............] - ETA: 55s - loss: 1.5074 - regression_loss: 1.2600 - classification_loss: 0.2474 279/500 [===============>..............] - ETA: 55s - loss: 1.5065 - regression_loss: 1.2594 - classification_loss: 0.2471 280/500 [===============>..............] - ETA: 55s - loss: 1.5064 - regression_loss: 1.2594 - classification_loss: 0.2470 281/500 [===============>..............] - ETA: 54s - loss: 1.5060 - regression_loss: 1.2591 - classification_loss: 0.2469 282/500 [===============>..............] - ETA: 54s - loss: 1.5053 - regression_loss: 1.2583 - classification_loss: 0.2469 283/500 [===============>..............] - ETA: 54s - loss: 1.5054 - regression_loss: 1.2582 - classification_loss: 0.2472 284/500 [================>.............] - ETA: 54s - loss: 1.5047 - regression_loss: 1.2576 - classification_loss: 0.2471 285/500 [================>.............] - ETA: 53s - loss: 1.5056 - regression_loss: 1.2582 - classification_loss: 0.2474 286/500 [================>.............] - ETA: 53s - loss: 1.5074 - regression_loss: 1.2598 - classification_loss: 0.2475 287/500 [================>.............] - ETA: 53s - loss: 1.5077 - regression_loss: 1.2603 - classification_loss: 0.2474 288/500 [================>.............] - ETA: 53s - loss: 1.5086 - regression_loss: 1.2611 - classification_loss: 0.2475 289/500 [================>.............] - ETA: 52s - loss: 1.5075 - regression_loss: 1.2603 - classification_loss: 0.2472 290/500 [================>.............] - ETA: 52s - loss: 1.5045 - regression_loss: 1.2578 - classification_loss: 0.2466 291/500 [================>.............] - ETA: 52s - loss: 1.5065 - regression_loss: 1.2595 - classification_loss: 0.2470 292/500 [================>.............] 
- ETA: 52s - loss: 1.5053 - regression_loss: 1.2584 - classification_loss: 0.2469 293/500 [================>.............] - ETA: 51s - loss: 1.5039 - regression_loss: 1.2574 - classification_loss: 0.2465 294/500 [================>.............] - ETA: 51s - loss: 1.5043 - regression_loss: 1.2575 - classification_loss: 0.2467 295/500 [================>.............] - ETA: 51s - loss: 1.5038 - regression_loss: 1.2573 - classification_loss: 0.2465 296/500 [================>.............] - ETA: 51s - loss: 1.5015 - regression_loss: 1.2555 - classification_loss: 0.2460 297/500 [================>.............] - ETA: 50s - loss: 1.4988 - regression_loss: 1.2534 - classification_loss: 0.2454 298/500 [================>.............] - ETA: 50s - loss: 1.4984 - regression_loss: 1.2533 - classification_loss: 0.2451 299/500 [================>.............] - ETA: 50s - loss: 1.4984 - regression_loss: 1.2535 - classification_loss: 0.2450 300/500 [=================>............] - ETA: 50s - loss: 1.4987 - regression_loss: 1.2537 - classification_loss: 0.2450 301/500 [=================>............] - ETA: 49s - loss: 1.4975 - regression_loss: 1.2527 - classification_loss: 0.2448 302/500 [=================>............] - ETA: 49s - loss: 1.4966 - regression_loss: 1.2516 - classification_loss: 0.2450 303/500 [=================>............] - ETA: 49s - loss: 1.4966 - regression_loss: 1.2517 - classification_loss: 0.2449 304/500 [=================>............] - ETA: 49s - loss: 1.4963 - regression_loss: 1.2518 - classification_loss: 0.2445 305/500 [=================>............] - ETA: 48s - loss: 1.4965 - regression_loss: 1.2521 - classification_loss: 0.2444 306/500 [=================>............] - ETA: 48s - loss: 1.4946 - regression_loss: 1.2507 - classification_loss: 0.2439 307/500 [=================>............] - ETA: 48s - loss: 1.4966 - regression_loss: 1.2520 - classification_loss: 0.2447 308/500 [=================>............] 
- ETA: 48s - loss: 1.4983 - regression_loss: 1.2531 - classification_loss: 0.2452 309/500 [=================>............] - ETA: 47s - loss: 1.5007 - regression_loss: 1.2552 - classification_loss: 0.2454 310/500 [=================>............] - ETA: 47s - loss: 1.5017 - regression_loss: 1.2562 - classification_loss: 0.2455 311/500 [=================>............] - ETA: 47s - loss: 1.5051 - regression_loss: 1.2588 - classification_loss: 0.2464 312/500 [=================>............] - ETA: 47s - loss: 1.5100 - regression_loss: 1.2625 - classification_loss: 0.2475 313/500 [=================>............] - ETA: 46s - loss: 1.5079 - regression_loss: 1.2608 - classification_loss: 0.2471 314/500 [=================>............] - ETA: 46s - loss: 1.5094 - regression_loss: 1.2615 - classification_loss: 0.2479 315/500 [=================>............] - ETA: 46s - loss: 1.5084 - regression_loss: 1.2609 - classification_loss: 0.2475 316/500 [=================>............] - ETA: 46s - loss: 1.5102 - regression_loss: 1.2624 - classification_loss: 0.2478 317/500 [==================>...........] - ETA: 45s - loss: 1.5098 - regression_loss: 1.2619 - classification_loss: 0.2479 318/500 [==================>...........] - ETA: 45s - loss: 1.5111 - regression_loss: 1.2635 - classification_loss: 0.2477 319/500 [==================>...........] - ETA: 45s - loss: 1.5091 - regression_loss: 1.2619 - classification_loss: 0.2472 320/500 [==================>...........] - ETA: 45s - loss: 1.5074 - regression_loss: 1.2606 - classification_loss: 0.2467 321/500 [==================>...........] - ETA: 44s - loss: 1.5060 - regression_loss: 1.2596 - classification_loss: 0.2464 322/500 [==================>...........] - ETA: 44s - loss: 1.5063 - regression_loss: 1.2599 - classification_loss: 0.2464 323/500 [==================>...........] - ETA: 44s - loss: 1.5045 - regression_loss: 1.2582 - classification_loss: 0.2463 324/500 [==================>...........] 
- ETA: 44s - loss: 1.5026 - regression_loss: 1.2566 - classification_loss: 0.2460 325/500 [==================>...........] - ETA: 43s - loss: 1.4997 - regression_loss: 1.2543 - classification_loss: 0.2454 326/500 [==================>...........] - ETA: 43s - loss: 1.4979 - regression_loss: 1.2529 - classification_loss: 0.2450 327/500 [==================>...........] - ETA: 43s - loss: 1.4978 - regression_loss: 1.2528 - classification_loss: 0.2450 328/500 [==================>...........] - ETA: 43s - loss: 1.4981 - regression_loss: 1.2532 - classification_loss: 0.2450 329/500 [==================>...........] - ETA: 42s - loss: 1.4961 - regression_loss: 1.2515 - classification_loss: 0.2446 330/500 [==================>...........] - ETA: 42s - loss: 1.4969 - regression_loss: 1.2522 - classification_loss: 0.2446 331/500 [==================>...........] - ETA: 42s - loss: 1.4981 - regression_loss: 1.2533 - classification_loss: 0.2449 332/500 [==================>...........] - ETA: 42s - loss: 1.4981 - regression_loss: 1.2535 - classification_loss: 0.2446 333/500 [==================>...........] - ETA: 41s - loss: 1.4983 - regression_loss: 1.2535 - classification_loss: 0.2448 334/500 [===================>..........] - ETA: 41s - loss: 1.4982 - regression_loss: 1.2535 - classification_loss: 0.2447 335/500 [===================>..........] - ETA: 41s - loss: 1.4998 - regression_loss: 1.2546 - classification_loss: 0.2451 336/500 [===================>..........] - ETA: 41s - loss: 1.4977 - regression_loss: 1.2530 - classification_loss: 0.2447 337/500 [===================>..........] - ETA: 40s - loss: 1.4973 - regression_loss: 1.2529 - classification_loss: 0.2444 338/500 [===================>..........] - ETA: 40s - loss: 1.4971 - regression_loss: 1.2529 - classification_loss: 0.2442 339/500 [===================>..........] - ETA: 40s - loss: 1.4958 - regression_loss: 1.2517 - classification_loss: 0.2441 340/500 [===================>..........] 
- ETA: 40s - loss: 1.4955 - regression_loss: 1.2514 - classification_loss: 0.2440 341/500 [===================>..........] - ETA: 39s - loss: 1.4970 - regression_loss: 1.2525 - classification_loss: 0.2445 342/500 [===================>..........] - ETA: 39s - loss: 1.4982 - regression_loss: 1.2534 - classification_loss: 0.2449 343/500 [===================>..........] - ETA: 39s - loss: 1.4990 - regression_loss: 1.2541 - classification_loss: 0.2449 344/500 [===================>..........] - ETA: 39s - loss: 1.5003 - regression_loss: 1.2553 - classification_loss: 0.2449 345/500 [===================>..........] - ETA: 38s - loss: 1.5002 - regression_loss: 1.2554 - classification_loss: 0.2449 346/500 [===================>..........] - ETA: 38s - loss: 1.4984 - regression_loss: 1.2540 - classification_loss: 0.2445 347/500 [===================>..........] - ETA: 38s - loss: 1.4994 - regression_loss: 1.2548 - classification_loss: 0.2447 348/500 [===================>..........] - ETA: 38s - loss: 1.4993 - regression_loss: 1.2547 - classification_loss: 0.2446 349/500 [===================>..........] - ETA: 37s - loss: 1.5005 - regression_loss: 1.2556 - classification_loss: 0.2449 350/500 [====================>.........] - ETA: 37s - loss: 1.5003 - regression_loss: 1.2556 - classification_loss: 0.2448 351/500 [====================>.........] - ETA: 37s - loss: 1.5003 - regression_loss: 1.2556 - classification_loss: 0.2448 352/500 [====================>.........] - ETA: 37s - loss: 1.5010 - regression_loss: 1.2561 - classification_loss: 0.2449 353/500 [====================>.........] - ETA: 36s - loss: 1.5024 - regression_loss: 1.2572 - classification_loss: 0.2452 354/500 [====================>.........] - ETA: 36s - loss: 1.5022 - regression_loss: 1.2571 - classification_loss: 0.2451 355/500 [====================>.........] - ETA: 36s - loss: 1.5036 - regression_loss: 1.2583 - classification_loss: 0.2453 356/500 [====================>.........] 
- ETA: 36s - loss: 1.5031 - regression_loss: 1.2580 - classification_loss: 0.2452 357/500 [====================>.........] - ETA: 35s - loss: 1.5036 - regression_loss: 1.2585 - classification_loss: 0.2451 358/500 [====================>.........] - ETA: 35s - loss: 1.5039 - regression_loss: 1.2588 - classification_loss: 0.2451 359/500 [====================>.........] - ETA: 35s - loss: 1.5059 - regression_loss: 1.2603 - classification_loss: 0.2456 360/500 [====================>.........] - ETA: 35s - loss: 1.5074 - regression_loss: 1.2616 - classification_loss: 0.2459 361/500 [====================>.........] - ETA: 34s - loss: 1.5077 - regression_loss: 1.2618 - classification_loss: 0.2458 362/500 [====================>.........] - ETA: 34s - loss: 1.5078 - regression_loss: 1.2619 - classification_loss: 0.2459 363/500 [====================>.........] - ETA: 34s - loss: 1.5086 - regression_loss: 1.2626 - classification_loss: 0.2460 364/500 [====================>.........] - ETA: 34s - loss: 1.5086 - regression_loss: 1.2627 - classification_loss: 0.2459 365/500 [====================>.........] - ETA: 33s - loss: 1.5094 - regression_loss: 1.2634 - classification_loss: 0.2460 366/500 [====================>.........] - ETA: 33s - loss: 1.5080 - regression_loss: 1.2623 - classification_loss: 0.2458 367/500 [=====================>........] - ETA: 33s - loss: 1.5086 - regression_loss: 1.2628 - classification_loss: 0.2458 368/500 [=====================>........] - ETA: 33s - loss: 1.5083 - regression_loss: 1.2624 - classification_loss: 0.2459 369/500 [=====================>........] - ETA: 32s - loss: 1.5106 - regression_loss: 1.2645 - classification_loss: 0.2461 370/500 [=====================>........] - ETA: 32s - loss: 1.5110 - regression_loss: 1.2649 - classification_loss: 0.2461 371/500 [=====================>........] - ETA: 32s - loss: 1.5110 - regression_loss: 1.2649 - classification_loss: 0.2461 372/500 [=====================>........] 
- ETA: 32s - loss: 1.5111 - regression_loss: 1.2650 - classification_loss: 0.2461 373/500 [=====================>........] - ETA: 31s - loss: 1.5108 - regression_loss: 1.2648 - classification_loss: 0.2460 374/500 [=====================>........] - ETA: 31s - loss: 1.5108 - regression_loss: 1.2648 - classification_loss: 0.2460 375/500 [=====================>........] - ETA: 31s - loss: 1.5107 - regression_loss: 1.2646 - classification_loss: 0.2461 376/500 [=====================>........] - ETA: 31s - loss: 1.5097 - regression_loss: 1.2637 - classification_loss: 0.2460 377/500 [=====================>........] - ETA: 30s - loss: 1.5103 - regression_loss: 1.2644 - classification_loss: 0.2459 378/500 [=====================>........] - ETA: 30s - loss: 1.5093 - regression_loss: 1.2635 - classification_loss: 0.2458 379/500 [=====================>........] - ETA: 30s - loss: 1.5092 - regression_loss: 1.2634 - classification_loss: 0.2458 380/500 [=====================>........] - ETA: 30s - loss: 1.5092 - regression_loss: 1.2632 - classification_loss: 0.2460 381/500 [=====================>........] - ETA: 29s - loss: 1.5087 - regression_loss: 1.2629 - classification_loss: 0.2458 382/500 [=====================>........] - ETA: 29s - loss: 1.5091 - regression_loss: 1.2633 - classification_loss: 0.2458 383/500 [=====================>........] - ETA: 29s - loss: 1.5090 - regression_loss: 1.2632 - classification_loss: 0.2457 384/500 [======================>.......] - ETA: 29s - loss: 1.5106 - regression_loss: 1.2647 - classification_loss: 0.2459 385/500 [======================>.......] - ETA: 28s - loss: 1.5113 - regression_loss: 1.2653 - classification_loss: 0.2460 386/500 [======================>.......] - ETA: 28s - loss: 1.5112 - regression_loss: 1.2653 - classification_loss: 0.2459 387/500 [======================>.......] - ETA: 28s - loss: 1.5117 - regression_loss: 1.2657 - classification_loss: 0.2460 388/500 [======================>.......] 
- ETA: 28s - loss: 1.5128 - regression_loss: 1.2668 - classification_loss: 0.2460 389/500 [======================>.......] - ETA: 27s - loss: 1.5135 - regression_loss: 1.2671 - classification_loss: 0.2464 390/500 [======================>.......] - ETA: 27s - loss: 1.5145 - regression_loss: 1.2679 - classification_loss: 0.2467 391/500 [======================>.......] - ETA: 27s - loss: 1.5148 - regression_loss: 1.2681 - classification_loss: 0.2467 392/500 [======================>.......] - ETA: 27s - loss: 1.5143 - regression_loss: 1.2676 - classification_loss: 0.2468 393/500 [======================>.......] - ETA: 26s - loss: 1.5137 - regression_loss: 1.2669 - classification_loss: 0.2468 394/500 [======================>.......] - ETA: 26s - loss: 1.5148 - regression_loss: 1.2677 - classification_loss: 0.2470 395/500 [======================>.......] - ETA: 26s - loss: 1.5131 - regression_loss: 1.2663 - classification_loss: 0.2467 396/500 [======================>.......] - ETA: 26s - loss: 1.5125 - regression_loss: 1.2659 - classification_loss: 0.2466 397/500 [======================>.......] - ETA: 25s - loss: 1.5117 - regression_loss: 1.2654 - classification_loss: 0.2464 398/500 [======================>.......] - ETA: 25s - loss: 1.5110 - regression_loss: 1.2645 - classification_loss: 0.2465 399/500 [======================>.......] - ETA: 25s - loss: 1.5123 - regression_loss: 1.2657 - classification_loss: 0.2467 400/500 [=======================>......] - ETA: 25s - loss: 1.5134 - regression_loss: 1.2665 - classification_loss: 0.2470 401/500 [=======================>......] - ETA: 24s - loss: 1.5139 - regression_loss: 1.2669 - classification_loss: 0.2470 402/500 [=======================>......] - ETA: 24s - loss: 1.5136 - regression_loss: 1.2666 - classification_loss: 0.2469 403/500 [=======================>......] - ETA: 24s - loss: 1.5120 - regression_loss: 1.2655 - classification_loss: 0.2466 404/500 [=======================>......] 
[per-batch progress-bar output elided; epoch 102, steps 405–499: loss fluctuated in the 1.51–1.52 range]
500/500 [==============================] - 125s 251ms/step - loss: 1.5136 - regression_loss: 1.2656 - classification_loss: 0.2481
1172 instances of class plum with average precision: 0.6741
mAP: 0.6741
Epoch 00102: saving model to ./training/snapshots/resnet50_pascal_102.h5
Epoch 103/150
[per-batch progress-bar output elided; epoch 103, steps 1–238: loss settled around 1.44–1.47, regression_loss ~1.21–1.23, classification_loss ~0.23–0.24; epoch continues below]
- ETA: 1:05 - loss: 1.4589 - regression_loss: 1.2235 - classification_loss: 0.2354 239/500 [=============>................] - ETA: 1:05 - loss: 1.4595 - regression_loss: 1.2238 - classification_loss: 0.2356 240/500 [=============>................] - ETA: 1:05 - loss: 1.4606 - regression_loss: 1.2249 - classification_loss: 0.2357 241/500 [=============>................] - ETA: 1:04 - loss: 1.4608 - regression_loss: 1.2251 - classification_loss: 0.2357 242/500 [=============>................] - ETA: 1:04 - loss: 1.4612 - regression_loss: 1.2255 - classification_loss: 0.2357 243/500 [=============>................] - ETA: 1:04 - loss: 1.4588 - regression_loss: 1.2239 - classification_loss: 0.2349 244/500 [=============>................] - ETA: 1:04 - loss: 1.4577 - regression_loss: 1.2228 - classification_loss: 0.2349 245/500 [=============>................] - ETA: 1:03 - loss: 1.4543 - regression_loss: 1.2199 - classification_loss: 0.2343 246/500 [=============>................] - ETA: 1:03 - loss: 1.4540 - regression_loss: 1.2199 - classification_loss: 0.2341 247/500 [=============>................] - ETA: 1:03 - loss: 1.4524 - regression_loss: 1.2187 - classification_loss: 0.2337 248/500 [=============>................] - ETA: 1:03 - loss: 1.4541 - regression_loss: 1.2202 - classification_loss: 0.2340 249/500 [=============>................] - ETA: 1:02 - loss: 1.4520 - regression_loss: 1.2185 - classification_loss: 0.2335 250/500 [==============>...............] - ETA: 1:02 - loss: 1.4525 - regression_loss: 1.2190 - classification_loss: 0.2335 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4488 - regression_loss: 1.2160 - classification_loss: 0.2328 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4485 - regression_loss: 1.2160 - classification_loss: 0.2325 253/500 [==============>...............] - ETA: 1:01 - loss: 1.4509 - regression_loss: 1.2181 - classification_loss: 0.2327 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.4501 - regression_loss: 1.2178 - classification_loss: 0.2323 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4522 - regression_loss: 1.2197 - classification_loss: 0.2324 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4484 - regression_loss: 1.2168 - classification_loss: 0.2317 257/500 [==============>...............] - ETA: 1:00 - loss: 1.4498 - regression_loss: 1.2179 - classification_loss: 0.2318 258/500 [==============>...............] - ETA: 1:00 - loss: 1.4474 - regression_loss: 1.2162 - classification_loss: 0.2312 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4491 - regression_loss: 1.2174 - classification_loss: 0.2317 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4508 - regression_loss: 1.2188 - classification_loss: 0.2321 261/500 [==============>...............] - ETA: 59s - loss: 1.4503 - regression_loss: 1.2181 - classification_loss: 0.2323  262/500 [==============>...............] - ETA: 59s - loss: 1.4496 - regression_loss: 1.2174 - classification_loss: 0.2322 263/500 [==============>...............] - ETA: 59s - loss: 1.4510 - regression_loss: 1.2186 - classification_loss: 0.2325 264/500 [==============>...............] - ETA: 59s - loss: 1.4496 - regression_loss: 1.2175 - classification_loss: 0.2321 265/500 [==============>...............] - ETA: 58s - loss: 1.4510 - regression_loss: 1.2189 - classification_loss: 0.2320 266/500 [==============>...............] - ETA: 58s - loss: 1.4485 - regression_loss: 1.2169 - classification_loss: 0.2316 267/500 [===============>..............] - ETA: 58s - loss: 1.4476 - regression_loss: 1.2153 - classification_loss: 0.2324 268/500 [===============>..............] - ETA: 58s - loss: 1.4482 - regression_loss: 1.2159 - classification_loss: 0.2323 269/500 [===============>..............] - ETA: 57s - loss: 1.4471 - regression_loss: 1.2150 - classification_loss: 0.2321 270/500 [===============>..............] 
- ETA: 57s - loss: 1.4464 - regression_loss: 1.2141 - classification_loss: 0.2323 271/500 [===============>..............] - ETA: 57s - loss: 1.4470 - regression_loss: 1.2145 - classification_loss: 0.2325 272/500 [===============>..............] - ETA: 57s - loss: 1.4475 - regression_loss: 1.2151 - classification_loss: 0.2325 273/500 [===============>..............] - ETA: 56s - loss: 1.4479 - regression_loss: 1.2155 - classification_loss: 0.2325 274/500 [===============>..............] - ETA: 56s - loss: 1.4489 - regression_loss: 1.2164 - classification_loss: 0.2325 275/500 [===============>..............] - ETA: 56s - loss: 1.4464 - regression_loss: 1.2145 - classification_loss: 0.2319 276/500 [===============>..............] - ETA: 56s - loss: 1.4483 - regression_loss: 1.2160 - classification_loss: 0.2324 277/500 [===============>..............] - ETA: 55s - loss: 1.4473 - regression_loss: 1.2154 - classification_loss: 0.2320 278/500 [===============>..............] - ETA: 55s - loss: 1.4483 - regression_loss: 1.2161 - classification_loss: 0.2322 279/500 [===============>..............] - ETA: 55s - loss: 1.4498 - regression_loss: 1.2170 - classification_loss: 0.2328 280/500 [===============>..............] - ETA: 55s - loss: 1.4489 - regression_loss: 1.2161 - classification_loss: 0.2329 281/500 [===============>..............] - ETA: 54s - loss: 1.4489 - regression_loss: 1.2162 - classification_loss: 0.2327 282/500 [===============>..............] - ETA: 54s - loss: 1.4490 - regression_loss: 1.2163 - classification_loss: 0.2327 283/500 [===============>..............] - ETA: 54s - loss: 1.4506 - regression_loss: 1.2175 - classification_loss: 0.2331 284/500 [================>.............] - ETA: 54s - loss: 1.4502 - regression_loss: 1.2173 - classification_loss: 0.2329 285/500 [================>.............] - ETA: 53s - loss: 1.4503 - regression_loss: 1.2175 - classification_loss: 0.2328 286/500 [================>.............] 
- ETA: 53s - loss: 1.4511 - regression_loss: 1.2183 - classification_loss: 0.2328 287/500 [================>.............] - ETA: 53s - loss: 1.4513 - regression_loss: 1.2185 - classification_loss: 0.2327 288/500 [================>.............] - ETA: 53s - loss: 1.4489 - regression_loss: 1.2164 - classification_loss: 0.2325 289/500 [================>.............] - ETA: 52s - loss: 1.4496 - regression_loss: 1.2166 - classification_loss: 0.2330 290/500 [================>.............] - ETA: 52s - loss: 1.4510 - regression_loss: 1.2174 - classification_loss: 0.2335 291/500 [================>.............] - ETA: 52s - loss: 1.4479 - regression_loss: 1.2148 - classification_loss: 0.2330 292/500 [================>.............] - ETA: 52s - loss: 1.4491 - regression_loss: 1.2160 - classification_loss: 0.2331 293/500 [================>.............] - ETA: 51s - loss: 1.4503 - regression_loss: 1.2171 - classification_loss: 0.2332 294/500 [================>.............] - ETA: 51s - loss: 1.4519 - regression_loss: 1.2182 - classification_loss: 0.2336 295/500 [================>.............] - ETA: 51s - loss: 1.4506 - regression_loss: 1.2171 - classification_loss: 0.2335 296/500 [================>.............] - ETA: 51s - loss: 1.4523 - regression_loss: 1.2181 - classification_loss: 0.2342 297/500 [================>.............] - ETA: 50s - loss: 1.4553 - regression_loss: 1.2206 - classification_loss: 0.2347 298/500 [================>.............] - ETA: 50s - loss: 1.4549 - regression_loss: 1.2202 - classification_loss: 0.2346 299/500 [================>.............] - ETA: 50s - loss: 1.4548 - regression_loss: 1.2202 - classification_loss: 0.2346 300/500 [=================>............] - ETA: 50s - loss: 1.4554 - regression_loss: 1.2209 - classification_loss: 0.2345 301/500 [=================>............] - ETA: 49s - loss: 1.4565 - regression_loss: 1.2218 - classification_loss: 0.2347 302/500 [=================>............] 
- ETA: 49s - loss: 1.4568 - regression_loss: 1.2221 - classification_loss: 0.2347 303/500 [=================>............] - ETA: 49s - loss: 1.4573 - regression_loss: 1.2225 - classification_loss: 0.2347 304/500 [=================>............] - ETA: 49s - loss: 1.4576 - regression_loss: 1.2229 - classification_loss: 0.2347 305/500 [=================>............] - ETA: 48s - loss: 1.4576 - regression_loss: 1.2229 - classification_loss: 0.2347 306/500 [=================>............] - ETA: 48s - loss: 1.4575 - regression_loss: 1.2227 - classification_loss: 0.2348 307/500 [=================>............] - ETA: 48s - loss: 1.4575 - regression_loss: 1.2227 - classification_loss: 0.2348 308/500 [=================>............] - ETA: 48s - loss: 1.4576 - regression_loss: 1.2229 - classification_loss: 0.2347 309/500 [=================>............] - ETA: 47s - loss: 1.4569 - regression_loss: 1.2222 - classification_loss: 0.2347 310/500 [=================>............] - ETA: 47s - loss: 1.4571 - regression_loss: 1.2224 - classification_loss: 0.2347 311/500 [=================>............] - ETA: 47s - loss: 1.4564 - regression_loss: 1.2218 - classification_loss: 0.2347 312/500 [=================>............] - ETA: 47s - loss: 1.4579 - regression_loss: 1.2230 - classification_loss: 0.2349 313/500 [=================>............] - ETA: 46s - loss: 1.4590 - regression_loss: 1.2240 - classification_loss: 0.2350 314/500 [=================>............] - ETA: 46s - loss: 1.4604 - regression_loss: 1.2253 - classification_loss: 0.2351 315/500 [=================>............] - ETA: 46s - loss: 1.4585 - regression_loss: 1.2238 - classification_loss: 0.2347 316/500 [=================>............] - ETA: 46s - loss: 1.4586 - regression_loss: 1.2241 - classification_loss: 0.2346 317/500 [==================>...........] - ETA: 45s - loss: 1.4579 - regression_loss: 1.2236 - classification_loss: 0.2343 318/500 [==================>...........] 
- ETA: 45s - loss: 1.4583 - regression_loss: 1.2241 - classification_loss: 0.2342 319/500 [==================>...........] - ETA: 45s - loss: 1.4587 - regression_loss: 1.2245 - classification_loss: 0.2343 320/500 [==================>...........] - ETA: 45s - loss: 1.4589 - regression_loss: 1.2246 - classification_loss: 0.2343 321/500 [==================>...........] - ETA: 44s - loss: 1.4583 - regression_loss: 1.2241 - classification_loss: 0.2342 322/500 [==================>...........] - ETA: 44s - loss: 1.4598 - regression_loss: 1.2252 - classification_loss: 0.2346 323/500 [==================>...........] - ETA: 44s - loss: 1.4614 - regression_loss: 1.2266 - classification_loss: 0.2348 324/500 [==================>...........] - ETA: 44s - loss: 1.4581 - regression_loss: 1.2238 - classification_loss: 0.2343 325/500 [==================>...........] - ETA: 43s - loss: 1.4591 - regression_loss: 1.2246 - classification_loss: 0.2345 326/500 [==================>...........] - ETA: 43s - loss: 1.4584 - regression_loss: 1.2241 - classification_loss: 0.2343 327/500 [==================>...........] - ETA: 43s - loss: 1.4570 - regression_loss: 1.2231 - classification_loss: 0.2339 328/500 [==================>...........] - ETA: 43s - loss: 1.4557 - regression_loss: 1.2222 - classification_loss: 0.2335 329/500 [==================>...........] - ETA: 42s - loss: 1.4559 - regression_loss: 1.2224 - classification_loss: 0.2334 330/500 [==================>...........] - ETA: 42s - loss: 1.4548 - regression_loss: 1.2216 - classification_loss: 0.2332 331/500 [==================>...........] - ETA: 42s - loss: 1.4544 - regression_loss: 1.2214 - classification_loss: 0.2330 332/500 [==================>...........] - ETA: 42s - loss: 1.4539 - regression_loss: 1.2209 - classification_loss: 0.2330 333/500 [==================>...........] - ETA: 41s - loss: 1.4536 - regression_loss: 1.2209 - classification_loss: 0.2328 334/500 [===================>..........] 
- ETA: 41s - loss: 1.4508 - regression_loss: 1.2187 - classification_loss: 0.2322 335/500 [===================>..........] - ETA: 41s - loss: 1.4507 - regression_loss: 1.2185 - classification_loss: 0.2322 336/500 [===================>..........] - ETA: 41s - loss: 1.4513 - regression_loss: 1.2189 - classification_loss: 0.2323 337/500 [===================>..........] - ETA: 40s - loss: 1.4511 - regression_loss: 1.2188 - classification_loss: 0.2323 338/500 [===================>..........] - ETA: 40s - loss: 1.4534 - regression_loss: 1.2205 - classification_loss: 0.2329 339/500 [===================>..........] - ETA: 40s - loss: 1.4541 - regression_loss: 1.2210 - classification_loss: 0.2331 340/500 [===================>..........] - ETA: 40s - loss: 1.4558 - regression_loss: 1.2222 - classification_loss: 0.2336 341/500 [===================>..........] - ETA: 39s - loss: 1.4534 - regression_loss: 1.2203 - classification_loss: 0.2332 342/500 [===================>..........] - ETA: 39s - loss: 1.4533 - regression_loss: 1.2201 - classification_loss: 0.2332 343/500 [===================>..........] - ETA: 39s - loss: 1.4539 - regression_loss: 1.2207 - classification_loss: 0.2333 344/500 [===================>..........] - ETA: 39s - loss: 1.4537 - regression_loss: 1.2207 - classification_loss: 0.2330 345/500 [===================>..........] - ETA: 38s - loss: 1.4548 - regression_loss: 1.2219 - classification_loss: 0.2330 346/500 [===================>..........] - ETA: 38s - loss: 1.4552 - regression_loss: 1.2221 - classification_loss: 0.2331 347/500 [===================>..........] - ETA: 38s - loss: 1.4543 - regression_loss: 1.2214 - classification_loss: 0.2329 348/500 [===================>..........] - ETA: 38s - loss: 1.4530 - regression_loss: 1.2204 - classification_loss: 0.2326 349/500 [===================>..........] - ETA: 37s - loss: 1.4537 - regression_loss: 1.2208 - classification_loss: 0.2329 350/500 [====================>.........] 
- ETA: 37s - loss: 1.4550 - regression_loss: 1.2218 - classification_loss: 0.2332 351/500 [====================>.........] - ETA: 37s - loss: 1.4531 - regression_loss: 1.2200 - classification_loss: 0.2331 352/500 [====================>.........] - ETA: 37s - loss: 1.4521 - regression_loss: 1.2193 - classification_loss: 0.2328 353/500 [====================>.........] - ETA: 36s - loss: 1.4491 - regression_loss: 1.2168 - classification_loss: 0.2324 354/500 [====================>.........] - ETA: 36s - loss: 1.4482 - regression_loss: 1.2159 - classification_loss: 0.2323 355/500 [====================>.........] - ETA: 36s - loss: 1.4483 - regression_loss: 1.2161 - classification_loss: 0.2322 356/500 [====================>.........] - ETA: 36s - loss: 1.4467 - regression_loss: 1.2148 - classification_loss: 0.2319 357/500 [====================>.........] - ETA: 35s - loss: 1.4456 - regression_loss: 1.2140 - classification_loss: 0.2316 358/500 [====================>.........] - ETA: 35s - loss: 1.4459 - regression_loss: 1.2143 - classification_loss: 0.2317 359/500 [====================>.........] - ETA: 35s - loss: 1.4439 - regression_loss: 1.2127 - classification_loss: 0.2312 360/500 [====================>.........] - ETA: 35s - loss: 1.4433 - regression_loss: 1.2122 - classification_loss: 0.2311 361/500 [====================>.........] - ETA: 34s - loss: 1.4420 - regression_loss: 1.2112 - classification_loss: 0.2308 362/500 [====================>.........] - ETA: 34s - loss: 1.4429 - regression_loss: 1.2119 - classification_loss: 0.2310 363/500 [====================>.........] - ETA: 34s - loss: 1.4422 - regression_loss: 1.2113 - classification_loss: 0.2310 364/500 [====================>.........] - ETA: 34s - loss: 1.4396 - regression_loss: 1.2091 - classification_loss: 0.2306 365/500 [====================>.........] - ETA: 33s - loss: 1.4399 - regression_loss: 1.2094 - classification_loss: 0.2306 366/500 [====================>.........] 
- ETA: 33s - loss: 1.4402 - regression_loss: 1.2097 - classification_loss: 0.2304 367/500 [=====================>........] - ETA: 33s - loss: 1.4416 - regression_loss: 1.2109 - classification_loss: 0.2306 368/500 [=====================>........] - ETA: 33s - loss: 1.4420 - regression_loss: 1.2114 - classification_loss: 0.2306 369/500 [=====================>........] - ETA: 32s - loss: 1.4423 - regression_loss: 1.2116 - classification_loss: 0.2307 370/500 [=====================>........] - ETA: 32s - loss: 1.4438 - regression_loss: 1.2128 - classification_loss: 0.2311 371/500 [=====================>........] - ETA: 32s - loss: 1.4437 - regression_loss: 1.2127 - classification_loss: 0.2310 372/500 [=====================>........] - ETA: 32s - loss: 1.4435 - regression_loss: 1.2126 - classification_loss: 0.2309 373/500 [=====================>........] - ETA: 31s - loss: 1.4434 - regression_loss: 1.2124 - classification_loss: 0.2310 374/500 [=====================>........] - ETA: 31s - loss: 1.4432 - regression_loss: 1.2123 - classification_loss: 0.2308 375/500 [=====================>........] - ETA: 31s - loss: 1.4402 - regression_loss: 1.2099 - classification_loss: 0.2303 376/500 [=====================>........] - ETA: 31s - loss: 1.4394 - regression_loss: 1.2092 - classification_loss: 0.2302 377/500 [=====================>........] - ETA: 30s - loss: 1.4384 - regression_loss: 1.2084 - classification_loss: 0.2300 378/500 [=====================>........] - ETA: 30s - loss: 1.4393 - regression_loss: 1.2090 - classification_loss: 0.2303 379/500 [=====================>........] - ETA: 30s - loss: 1.4396 - regression_loss: 1.2092 - classification_loss: 0.2303 380/500 [=====================>........] - ETA: 30s - loss: 1.4372 - regression_loss: 1.2072 - classification_loss: 0.2300 381/500 [=====================>........] - ETA: 29s - loss: 1.4365 - regression_loss: 1.2068 - classification_loss: 0.2297 382/500 [=====================>........] 
- ETA: 29s - loss: 1.4372 - regression_loss: 1.2074 - classification_loss: 0.2298 383/500 [=====================>........] - ETA: 29s - loss: 1.4373 - regression_loss: 1.2074 - classification_loss: 0.2299 384/500 [======================>.......] - ETA: 29s - loss: 1.4379 - regression_loss: 1.2079 - classification_loss: 0.2299 385/500 [======================>.......] - ETA: 28s - loss: 1.4374 - regression_loss: 1.2077 - classification_loss: 0.2297 386/500 [======================>.......] - ETA: 28s - loss: 1.4384 - regression_loss: 1.2083 - classification_loss: 0.2300 387/500 [======================>.......] - ETA: 28s - loss: 1.4383 - regression_loss: 1.2083 - classification_loss: 0.2300 388/500 [======================>.......] - ETA: 28s - loss: 1.4376 - regression_loss: 1.2079 - classification_loss: 0.2297 389/500 [======================>.......] - ETA: 27s - loss: 1.4388 - regression_loss: 1.2088 - classification_loss: 0.2300 390/500 [======================>.......] - ETA: 27s - loss: 1.4389 - regression_loss: 1.2090 - classification_loss: 0.2299 391/500 [======================>.......] - ETA: 27s - loss: 1.4423 - regression_loss: 1.2115 - classification_loss: 0.2308 392/500 [======================>.......] - ETA: 27s - loss: 1.4429 - regression_loss: 1.2119 - classification_loss: 0.2310 393/500 [======================>.......] - ETA: 26s - loss: 1.4440 - regression_loss: 1.2129 - classification_loss: 0.2312 394/500 [======================>.......] - ETA: 26s - loss: 1.4439 - regression_loss: 1.2128 - classification_loss: 0.2311 395/500 [======================>.......] - ETA: 26s - loss: 1.4445 - regression_loss: 1.2131 - classification_loss: 0.2314 396/500 [======================>.......] - ETA: 26s - loss: 1.4461 - regression_loss: 1.2144 - classification_loss: 0.2317 397/500 [======================>.......] - ETA: 25s - loss: 1.4471 - regression_loss: 1.2152 - classification_loss: 0.2319 398/500 [======================>.......] 
- ETA: 25s - loss: 1.4478 - regression_loss: 1.2159 - classification_loss: 0.2319 399/500 [======================>.......] - ETA: 25s - loss: 1.4465 - regression_loss: 1.2150 - classification_loss: 0.2315 400/500 [=======================>......] - ETA: 25s - loss: 1.4475 - regression_loss: 1.2156 - classification_loss: 0.2319 401/500 [=======================>......] - ETA: 24s - loss: 1.4472 - regression_loss: 1.2154 - classification_loss: 0.2317 402/500 [=======================>......] - ETA: 24s - loss: 1.4485 - regression_loss: 1.2166 - classification_loss: 0.2319 403/500 [=======================>......] - ETA: 24s - loss: 1.4492 - regression_loss: 1.2173 - classification_loss: 0.2320 404/500 [=======================>......] - ETA: 24s - loss: 1.4484 - regression_loss: 1.2167 - classification_loss: 0.2317 405/500 [=======================>......] - ETA: 23s - loss: 1.4485 - regression_loss: 1.2165 - classification_loss: 0.2319 406/500 [=======================>......] - ETA: 23s - loss: 1.4481 - regression_loss: 1.2163 - classification_loss: 0.2318 407/500 [=======================>......] - ETA: 23s - loss: 1.4486 - regression_loss: 1.2168 - classification_loss: 0.2318 408/500 [=======================>......] - ETA: 23s - loss: 1.4499 - regression_loss: 1.2177 - classification_loss: 0.2322 409/500 [=======================>......] - ETA: 22s - loss: 1.4478 - regression_loss: 1.2159 - classification_loss: 0.2319 410/500 [=======================>......] - ETA: 22s - loss: 1.4459 - regression_loss: 1.2143 - classification_loss: 0.2316 411/500 [=======================>......] - ETA: 22s - loss: 1.4466 - regression_loss: 1.2146 - classification_loss: 0.2320 412/500 [=======================>......] - ETA: 22s - loss: 1.4470 - regression_loss: 1.2149 - classification_loss: 0.2321 413/500 [=======================>......] - ETA: 21s - loss: 1.4445 - regression_loss: 1.2127 - classification_loss: 0.2318 414/500 [=======================>......] 
- ETA: 21s - loss: 1.4459 - regression_loss: 1.2136 - classification_loss: 0.2323 415/500 [=======================>......] - ETA: 21s - loss: 1.4458 - regression_loss: 1.2135 - classification_loss: 0.2323 416/500 [=======================>......] - ETA: 21s - loss: 1.4456 - regression_loss: 1.2135 - classification_loss: 0.2321 417/500 [========================>.....] - ETA: 20s - loss: 1.4457 - regression_loss: 1.2138 - classification_loss: 0.2319 418/500 [========================>.....] - ETA: 20s - loss: 1.4457 - regression_loss: 1.2138 - classification_loss: 0.2320 419/500 [========================>.....] - ETA: 20s - loss: 1.4452 - regression_loss: 1.2132 - classification_loss: 0.2320 420/500 [========================>.....] - ETA: 20s - loss: 1.4444 - regression_loss: 1.2126 - classification_loss: 0.2318 421/500 [========================>.....] - ETA: 19s - loss: 1.4444 - regression_loss: 1.2126 - classification_loss: 0.2317 422/500 [========================>.....] - ETA: 19s - loss: 1.4455 - regression_loss: 1.2136 - classification_loss: 0.2319 423/500 [========================>.....] - ETA: 19s - loss: 1.4462 - regression_loss: 1.2143 - classification_loss: 0.2319 424/500 [========================>.....] - ETA: 19s - loss: 1.4473 - regression_loss: 1.2152 - classification_loss: 0.2321 425/500 [========================>.....] - ETA: 18s - loss: 1.4479 - regression_loss: 1.2157 - classification_loss: 0.2322 426/500 [========================>.....] - ETA: 18s - loss: 1.4479 - regression_loss: 1.2158 - classification_loss: 0.2321 427/500 [========================>.....] - ETA: 18s - loss: 1.4479 - regression_loss: 1.2158 - classification_loss: 0.2322 428/500 [========================>.....] - ETA: 18s - loss: 1.4468 - regression_loss: 1.2148 - classification_loss: 0.2320 429/500 [========================>.....] - ETA: 17s - loss: 1.4469 - regression_loss: 1.2149 - classification_loss: 0.2321 430/500 [========================>.....] 
- ETA: 17s - loss: 1.4452 - regression_loss: 1.2135 - classification_loss: 0.2317 431/500 [========================>.....] - ETA: 17s - loss: 1.4452 - regression_loss: 1.2135 - classification_loss: 0.2317 432/500 [========================>.....] - ETA: 17s - loss: 1.4441 - regression_loss: 1.2125 - classification_loss: 0.2315 433/500 [========================>.....] - ETA: 16s - loss: 1.4454 - regression_loss: 1.2135 - classification_loss: 0.2319 434/500 [=========================>....] - ETA: 16s - loss: 1.4459 - regression_loss: 1.2140 - classification_loss: 0.2319 435/500 [=========================>....] - ETA: 16s - loss: 1.4472 - regression_loss: 1.2153 - classification_loss: 0.2319 436/500 [=========================>....] - ETA: 16s - loss: 1.4472 - regression_loss: 1.2154 - classification_loss: 0.2318 437/500 [=========================>....] - ETA: 15s - loss: 1.4478 - regression_loss: 1.2159 - classification_loss: 0.2319 438/500 [=========================>....] - ETA: 15s - loss: 1.4477 - regression_loss: 1.2158 - classification_loss: 0.2319 439/500 [=========================>....] - ETA: 15s - loss: 1.4493 - regression_loss: 1.2173 - classification_loss: 0.2321 440/500 [=========================>....] - ETA: 15s - loss: 1.4487 - regression_loss: 1.2167 - classification_loss: 0.2320 441/500 [=========================>....] - ETA: 14s - loss: 1.4484 - regression_loss: 1.2164 - classification_loss: 0.2320 442/500 [=========================>....] - ETA: 14s - loss: 1.4480 - regression_loss: 1.2161 - classification_loss: 0.2319 443/500 [=========================>....] - ETA: 14s - loss: 1.4469 - regression_loss: 1.2154 - classification_loss: 0.2314 444/500 [=========================>....] - ETA: 14s - loss: 1.4464 - regression_loss: 1.2148 - classification_loss: 0.2316 445/500 [=========================>....] - ETA: 13s - loss: 1.4487 - regression_loss: 1.2165 - classification_loss: 0.2322 446/500 [=========================>....] 
[per-batch progress-bar updates for epoch 103 omitted]
500/500 [==============================] - 125s 250ms/step - loss: 1.4548 - regression_loss: 1.2213 - classification_loss: 0.2335
1172 instances of class plum with average precision: 0.6837
mAP: 0.6837
Epoch 00103: saving model to ./training/snapshots/resnet50_pascal_103.h5
Epoch 104/150
[per-batch progress-bar updates for epoch 104 omitted]
- ETA: 55s - loss: 1.4755 - regression_loss: 1.2303 - classification_loss: 0.2452 282/500 [===============>..............] - ETA: 55s - loss: 1.4756 - regression_loss: 1.2305 - classification_loss: 0.2452 283/500 [===============>..............] - ETA: 54s - loss: 1.4759 - regression_loss: 1.2305 - classification_loss: 0.2454 284/500 [================>.............] - ETA: 54s - loss: 1.4771 - regression_loss: 1.2315 - classification_loss: 0.2456 285/500 [================>.............] - ETA: 54s - loss: 1.4791 - regression_loss: 1.2332 - classification_loss: 0.2459 286/500 [================>.............] - ETA: 54s - loss: 1.4796 - regression_loss: 1.2338 - classification_loss: 0.2458 287/500 [================>.............] - ETA: 53s - loss: 1.4807 - regression_loss: 1.2353 - classification_loss: 0.2454 288/500 [================>.............] - ETA: 53s - loss: 1.4798 - regression_loss: 1.2345 - classification_loss: 0.2452 289/500 [================>.............] - ETA: 53s - loss: 1.4793 - regression_loss: 1.2342 - classification_loss: 0.2451 290/500 [================>.............] - ETA: 53s - loss: 1.4792 - regression_loss: 1.2341 - classification_loss: 0.2451 291/500 [================>.............] - ETA: 52s - loss: 1.4803 - regression_loss: 1.2351 - classification_loss: 0.2452 292/500 [================>.............] - ETA: 52s - loss: 1.4791 - regression_loss: 1.2341 - classification_loss: 0.2450 293/500 [================>.............] - ETA: 52s - loss: 1.4802 - regression_loss: 1.2352 - classification_loss: 0.2450 294/500 [================>.............] - ETA: 52s - loss: 1.4809 - regression_loss: 1.2359 - classification_loss: 0.2450 295/500 [================>.............] - ETA: 51s - loss: 1.4814 - regression_loss: 1.2364 - classification_loss: 0.2450 296/500 [================>.............] - ETA: 51s - loss: 1.4826 - regression_loss: 1.2374 - classification_loss: 0.2452 297/500 [================>.............] 
- ETA: 51s - loss: 1.4829 - regression_loss: 1.2378 - classification_loss: 0.2451 298/500 [================>.............] - ETA: 51s - loss: 1.4842 - regression_loss: 1.2389 - classification_loss: 0.2453 299/500 [================>.............] - ETA: 50s - loss: 1.4847 - regression_loss: 1.2394 - classification_loss: 0.2453 300/500 [=================>............] - ETA: 50s - loss: 1.4832 - regression_loss: 1.2381 - classification_loss: 0.2450 301/500 [=================>............] - ETA: 50s - loss: 1.4839 - regression_loss: 1.2388 - classification_loss: 0.2450 302/500 [=================>............] - ETA: 50s - loss: 1.4834 - regression_loss: 1.2385 - classification_loss: 0.2449 303/500 [=================>............] - ETA: 49s - loss: 1.4828 - regression_loss: 1.2379 - classification_loss: 0.2449 304/500 [=================>............] - ETA: 49s - loss: 1.4821 - regression_loss: 1.2374 - classification_loss: 0.2447 305/500 [=================>............] - ETA: 49s - loss: 1.4824 - regression_loss: 1.2376 - classification_loss: 0.2448 306/500 [=================>............] - ETA: 49s - loss: 1.4826 - regression_loss: 1.2377 - classification_loss: 0.2449 307/500 [=================>............] - ETA: 48s - loss: 1.4848 - regression_loss: 1.2396 - classification_loss: 0.2452 308/500 [=================>............] - ETA: 48s - loss: 1.4854 - regression_loss: 1.2403 - classification_loss: 0.2451 309/500 [=================>............] - ETA: 48s - loss: 1.4826 - regression_loss: 1.2378 - classification_loss: 0.2447 310/500 [=================>............] - ETA: 48s - loss: 1.4825 - regression_loss: 1.2380 - classification_loss: 0.2446 311/500 [=================>............] - ETA: 47s - loss: 1.4834 - regression_loss: 1.2385 - classification_loss: 0.2449 312/500 [=================>............] - ETA: 47s - loss: 1.4827 - regression_loss: 1.2381 - classification_loss: 0.2446 313/500 [=================>............] 
- ETA: 47s - loss: 1.4829 - regression_loss: 1.2385 - classification_loss: 0.2444 314/500 [=================>............] - ETA: 47s - loss: 1.4822 - regression_loss: 1.2382 - classification_loss: 0.2441 315/500 [=================>............] - ETA: 46s - loss: 1.4797 - regression_loss: 1.2362 - classification_loss: 0.2435 316/500 [=================>............] - ETA: 46s - loss: 1.4812 - regression_loss: 1.2374 - classification_loss: 0.2438 317/500 [==================>...........] - ETA: 46s - loss: 1.4811 - regression_loss: 1.2374 - classification_loss: 0.2437 318/500 [==================>...........] - ETA: 46s - loss: 1.4794 - regression_loss: 1.2362 - classification_loss: 0.2433 319/500 [==================>...........] - ETA: 45s - loss: 1.4784 - regression_loss: 1.2354 - classification_loss: 0.2431 320/500 [==================>...........] - ETA: 45s - loss: 1.4798 - regression_loss: 1.2363 - classification_loss: 0.2435 321/500 [==================>...........] - ETA: 45s - loss: 1.4796 - regression_loss: 1.2362 - classification_loss: 0.2435 322/500 [==================>...........] - ETA: 45s - loss: 1.4801 - regression_loss: 1.2366 - classification_loss: 0.2434 323/500 [==================>...........] - ETA: 44s - loss: 1.4774 - regression_loss: 1.2346 - classification_loss: 0.2428 324/500 [==================>...........] - ETA: 44s - loss: 1.4788 - regression_loss: 1.2356 - classification_loss: 0.2432 325/500 [==================>...........] - ETA: 44s - loss: 1.4780 - regression_loss: 1.2350 - classification_loss: 0.2430 326/500 [==================>...........] - ETA: 44s - loss: 1.4766 - regression_loss: 1.2338 - classification_loss: 0.2428 327/500 [==================>...........] - ETA: 43s - loss: 1.4767 - regression_loss: 1.2339 - classification_loss: 0.2428 328/500 [==================>...........] - ETA: 43s - loss: 1.4782 - regression_loss: 1.2353 - classification_loss: 0.2429 329/500 [==================>...........] 
- ETA: 43s - loss: 1.4780 - regression_loss: 1.2352 - classification_loss: 0.2428 330/500 [==================>...........] - ETA: 42s - loss: 1.4785 - regression_loss: 1.2357 - classification_loss: 0.2428 331/500 [==================>...........] - ETA: 42s - loss: 1.4787 - regression_loss: 1.2359 - classification_loss: 0.2427 332/500 [==================>...........] - ETA: 42s - loss: 1.4787 - regression_loss: 1.2360 - classification_loss: 0.2427 333/500 [==================>...........] - ETA: 42s - loss: 1.4763 - regression_loss: 1.2340 - classification_loss: 0.2423 334/500 [===================>..........] - ETA: 41s - loss: 1.4777 - regression_loss: 1.2352 - classification_loss: 0.2425 335/500 [===================>..........] - ETA: 41s - loss: 1.4751 - regression_loss: 1.2331 - classification_loss: 0.2420 336/500 [===================>..........] - ETA: 41s - loss: 1.4763 - regression_loss: 1.2337 - classification_loss: 0.2426 337/500 [===================>..........] - ETA: 41s - loss: 1.4758 - regression_loss: 1.2332 - classification_loss: 0.2426 338/500 [===================>..........] - ETA: 40s - loss: 1.4756 - regression_loss: 1.2332 - classification_loss: 0.2424 339/500 [===================>..........] - ETA: 40s - loss: 1.4759 - regression_loss: 1.2334 - classification_loss: 0.2426 340/500 [===================>..........] - ETA: 40s - loss: 1.4765 - regression_loss: 1.2336 - classification_loss: 0.2428 341/500 [===================>..........] - ETA: 40s - loss: 1.4763 - regression_loss: 1.2335 - classification_loss: 0.2427 342/500 [===================>..........] - ETA: 39s - loss: 1.4771 - regression_loss: 1.2343 - classification_loss: 0.2428 343/500 [===================>..........] - ETA: 39s - loss: 1.4766 - regression_loss: 1.2340 - classification_loss: 0.2427 344/500 [===================>..........] - ETA: 39s - loss: 1.4763 - regression_loss: 1.2337 - classification_loss: 0.2426 345/500 [===================>..........] 
- ETA: 39s - loss: 1.4739 - regression_loss: 1.2316 - classification_loss: 0.2422 346/500 [===================>..........] - ETA: 38s - loss: 1.4738 - regression_loss: 1.2317 - classification_loss: 0.2420 347/500 [===================>..........] - ETA: 38s - loss: 1.4749 - regression_loss: 1.2326 - classification_loss: 0.2423 348/500 [===================>..........] - ETA: 38s - loss: 1.4749 - regression_loss: 1.2326 - classification_loss: 0.2423 349/500 [===================>..........] - ETA: 38s - loss: 1.4750 - regression_loss: 1.2326 - classification_loss: 0.2424 350/500 [====================>.........] - ETA: 37s - loss: 1.4756 - regression_loss: 1.2331 - classification_loss: 0.2426 351/500 [====================>.........] - ETA: 37s - loss: 1.4762 - regression_loss: 1.2336 - classification_loss: 0.2426 352/500 [====================>.........] - ETA: 37s - loss: 1.4775 - regression_loss: 1.2350 - classification_loss: 0.2425 353/500 [====================>.........] - ETA: 37s - loss: 1.4787 - regression_loss: 1.2362 - classification_loss: 0.2426 354/500 [====================>.........] - ETA: 36s - loss: 1.4781 - regression_loss: 1.2356 - classification_loss: 0.2424 355/500 [====================>.........] - ETA: 36s - loss: 1.4780 - regression_loss: 1.2358 - classification_loss: 0.2422 356/500 [====================>.........] - ETA: 36s - loss: 1.4779 - regression_loss: 1.2359 - classification_loss: 0.2420 357/500 [====================>.........] - ETA: 36s - loss: 1.4785 - regression_loss: 1.2365 - classification_loss: 0.2419 358/500 [====================>.........] - ETA: 35s - loss: 1.4762 - regression_loss: 1.2348 - classification_loss: 0.2415 359/500 [====================>.........] - ETA: 35s - loss: 1.4767 - regression_loss: 1.2351 - classification_loss: 0.2415 360/500 [====================>.........] - ETA: 35s - loss: 1.4781 - regression_loss: 1.2363 - classification_loss: 0.2418 361/500 [====================>.........] 
- ETA: 35s - loss: 1.4785 - regression_loss: 1.2367 - classification_loss: 0.2418 362/500 [====================>.........] - ETA: 34s - loss: 1.4784 - regression_loss: 1.2365 - classification_loss: 0.2419 363/500 [====================>.........] - ETA: 34s - loss: 1.4803 - regression_loss: 1.2381 - classification_loss: 0.2422 364/500 [====================>.........] - ETA: 34s - loss: 1.4803 - regression_loss: 1.2382 - classification_loss: 0.2421 365/500 [====================>.........] - ETA: 34s - loss: 1.4811 - regression_loss: 1.2384 - classification_loss: 0.2427 366/500 [====================>.........] - ETA: 33s - loss: 1.4809 - regression_loss: 1.2383 - classification_loss: 0.2426 367/500 [=====================>........] - ETA: 33s - loss: 1.4823 - regression_loss: 1.2394 - classification_loss: 0.2429 368/500 [=====================>........] - ETA: 33s - loss: 1.4834 - regression_loss: 1.2403 - classification_loss: 0.2430 369/500 [=====================>........] - ETA: 33s - loss: 1.4825 - regression_loss: 1.2395 - classification_loss: 0.2429 370/500 [=====================>........] - ETA: 32s - loss: 1.4816 - regression_loss: 1.2388 - classification_loss: 0.2428 371/500 [=====================>........] - ETA: 32s - loss: 1.4814 - regression_loss: 1.2388 - classification_loss: 0.2426 372/500 [=====================>........] - ETA: 32s - loss: 1.4791 - regression_loss: 1.2369 - classification_loss: 0.2422 373/500 [=====================>........] - ETA: 32s - loss: 1.4788 - regression_loss: 1.2369 - classification_loss: 0.2419 374/500 [=====================>........] - ETA: 31s - loss: 1.4782 - regression_loss: 1.2365 - classification_loss: 0.2418 375/500 [=====================>........] - ETA: 31s - loss: 1.4781 - regression_loss: 1.2362 - classification_loss: 0.2418 376/500 [=====================>........] - ETA: 31s - loss: 1.4765 - regression_loss: 1.2352 - classification_loss: 0.2414 377/500 [=====================>........] 
- ETA: 31s - loss: 1.4764 - regression_loss: 1.2347 - classification_loss: 0.2417 378/500 [=====================>........] - ETA: 30s - loss: 1.4760 - regression_loss: 1.2344 - classification_loss: 0.2416 379/500 [=====================>........] - ETA: 30s - loss: 1.4763 - regression_loss: 1.2346 - classification_loss: 0.2417 380/500 [=====================>........] - ETA: 30s - loss: 1.4766 - regression_loss: 1.2346 - classification_loss: 0.2420 381/500 [=====================>........] - ETA: 30s - loss: 1.4748 - regression_loss: 1.2332 - classification_loss: 0.2416 382/500 [=====================>........] - ETA: 29s - loss: 1.4763 - regression_loss: 1.2346 - classification_loss: 0.2417 383/500 [=====================>........] - ETA: 29s - loss: 1.4791 - regression_loss: 1.2364 - classification_loss: 0.2427 384/500 [======================>.......] - ETA: 29s - loss: 1.4798 - regression_loss: 1.2370 - classification_loss: 0.2428 385/500 [======================>.......] - ETA: 29s - loss: 1.4802 - regression_loss: 1.2374 - classification_loss: 0.2428 386/500 [======================>.......] - ETA: 28s - loss: 1.4799 - regression_loss: 1.2371 - classification_loss: 0.2428 387/500 [======================>.......] - ETA: 28s - loss: 1.4790 - regression_loss: 1.2363 - classification_loss: 0.2426 388/500 [======================>.......] - ETA: 28s - loss: 1.4814 - regression_loss: 1.2376 - classification_loss: 0.2438 389/500 [======================>.......] - ETA: 27s - loss: 1.4812 - regression_loss: 1.2376 - classification_loss: 0.2437 390/500 [======================>.......] - ETA: 27s - loss: 1.4798 - regression_loss: 1.2365 - classification_loss: 0.2433 391/500 [======================>.......] - ETA: 27s - loss: 1.4786 - regression_loss: 1.2354 - classification_loss: 0.2433 392/500 [======================>.......] - ETA: 27s - loss: 1.4786 - regression_loss: 1.2342 - classification_loss: 0.2443 393/500 [======================>.......] 
- ETA: 26s - loss: 1.4783 - regression_loss: 1.2340 - classification_loss: 0.2443 394/500 [======================>.......] - ETA: 26s - loss: 1.4784 - regression_loss: 1.2343 - classification_loss: 0.2442 395/500 [======================>.......] - ETA: 26s - loss: 1.4774 - regression_loss: 1.2333 - classification_loss: 0.2441 396/500 [======================>.......] - ETA: 26s - loss: 1.4758 - regression_loss: 1.2321 - classification_loss: 0.2437 397/500 [======================>.......] - ETA: 25s - loss: 1.4771 - regression_loss: 1.2332 - classification_loss: 0.2439 398/500 [======================>.......] - ETA: 25s - loss: 1.4768 - regression_loss: 1.2329 - classification_loss: 0.2438 399/500 [======================>.......] - ETA: 25s - loss: 1.4756 - regression_loss: 1.2321 - classification_loss: 0.2435 400/500 [=======================>......] - ETA: 25s - loss: 1.4759 - regression_loss: 1.2323 - classification_loss: 0.2436 401/500 [=======================>......] - ETA: 24s - loss: 1.4762 - regression_loss: 1.2326 - classification_loss: 0.2437 402/500 [=======================>......] - ETA: 24s - loss: 1.4762 - regression_loss: 1.2326 - classification_loss: 0.2437 403/500 [=======================>......] - ETA: 24s - loss: 1.4805 - regression_loss: 1.2364 - classification_loss: 0.2441 404/500 [=======================>......] - ETA: 24s - loss: 1.4797 - regression_loss: 1.2356 - classification_loss: 0.2441 405/500 [=======================>......] - ETA: 23s - loss: 1.4805 - regression_loss: 1.2363 - classification_loss: 0.2441 406/500 [=======================>......] - ETA: 23s - loss: 1.4798 - regression_loss: 1.2358 - classification_loss: 0.2440 407/500 [=======================>......] - ETA: 23s - loss: 1.4800 - regression_loss: 1.2360 - classification_loss: 0.2440 408/500 [=======================>......] - ETA: 23s - loss: 1.4814 - regression_loss: 1.2369 - classification_loss: 0.2445 409/500 [=======================>......] 
- ETA: 22s - loss: 1.4800 - regression_loss: 1.2359 - classification_loss: 0.2442 410/500 [=======================>......] - ETA: 22s - loss: 1.4800 - regression_loss: 1.2358 - classification_loss: 0.2442 411/500 [=======================>......] - ETA: 22s - loss: 1.4800 - regression_loss: 1.2361 - classification_loss: 0.2440 412/500 [=======================>......] - ETA: 22s - loss: 1.4819 - regression_loss: 1.2376 - classification_loss: 0.2443 413/500 [=======================>......] - ETA: 21s - loss: 1.4831 - regression_loss: 1.2385 - classification_loss: 0.2446 414/500 [=======================>......] - ETA: 21s - loss: 1.4813 - regression_loss: 1.2372 - classification_loss: 0.2441 415/500 [=======================>......] - ETA: 21s - loss: 1.4809 - regression_loss: 1.2369 - classification_loss: 0.2440 416/500 [=======================>......] - ETA: 21s - loss: 1.4811 - regression_loss: 1.2371 - classification_loss: 0.2439 417/500 [========================>.....] - ETA: 20s - loss: 1.4815 - regression_loss: 1.2376 - classification_loss: 0.2439 418/500 [========================>.....] - ETA: 20s - loss: 1.4813 - regression_loss: 1.2375 - classification_loss: 0.2438 419/500 [========================>.....] - ETA: 20s - loss: 1.4809 - regression_loss: 1.2373 - classification_loss: 0.2436 420/500 [========================>.....] - ETA: 20s - loss: 1.4795 - regression_loss: 1.2361 - classification_loss: 0.2434 421/500 [========================>.....] - ETA: 19s - loss: 1.4782 - regression_loss: 1.2349 - classification_loss: 0.2433 422/500 [========================>.....] - ETA: 19s - loss: 1.4809 - regression_loss: 1.2371 - classification_loss: 0.2438 423/500 [========================>.....] - ETA: 19s - loss: 1.4813 - regression_loss: 1.2375 - classification_loss: 0.2439 424/500 [========================>.....] - ETA: 19s - loss: 1.4820 - regression_loss: 1.2379 - classification_loss: 0.2440 425/500 [========================>.....] 
- ETA: 18s - loss: 1.4833 - regression_loss: 1.2390 - classification_loss: 0.2443 426/500 [========================>.....] - ETA: 18s - loss: 1.4831 - regression_loss: 1.2389 - classification_loss: 0.2442 427/500 [========================>.....] - ETA: 18s - loss: 1.4821 - regression_loss: 1.2382 - classification_loss: 0.2440 428/500 [========================>.....] - ETA: 18s - loss: 1.4817 - regression_loss: 1.2378 - classification_loss: 0.2439 429/500 [========================>.....] - ETA: 17s - loss: 1.4800 - regression_loss: 1.2365 - classification_loss: 0.2435 430/500 [========================>.....] - ETA: 17s - loss: 1.4801 - regression_loss: 1.2367 - classification_loss: 0.2434 431/500 [========================>.....] - ETA: 17s - loss: 1.4804 - regression_loss: 1.2366 - classification_loss: 0.2438 432/500 [========================>.....] - ETA: 17s - loss: 1.4789 - regression_loss: 1.2355 - classification_loss: 0.2434 433/500 [========================>.....] - ETA: 16s - loss: 1.4787 - regression_loss: 1.2355 - classification_loss: 0.2432 434/500 [=========================>....] - ETA: 16s - loss: 1.4788 - regression_loss: 1.2356 - classification_loss: 0.2432 435/500 [=========================>....] - ETA: 16s - loss: 1.4789 - regression_loss: 1.2358 - classification_loss: 0.2431 436/500 [=========================>....] - ETA: 16s - loss: 1.4792 - regression_loss: 1.2361 - classification_loss: 0.2431 437/500 [=========================>....] - ETA: 15s - loss: 1.4802 - regression_loss: 1.2369 - classification_loss: 0.2432 438/500 [=========================>....] - ETA: 15s - loss: 1.4795 - regression_loss: 1.2365 - classification_loss: 0.2431 439/500 [=========================>....] - ETA: 15s - loss: 1.4783 - regression_loss: 1.2354 - classification_loss: 0.2429 440/500 [=========================>....] - ETA: 15s - loss: 1.4793 - regression_loss: 1.2362 - classification_loss: 0.2431 441/500 [=========================>....] 
- ETA: 14s - loss: 1.4785 - regression_loss: 1.2356 - classification_loss: 0.2428 442/500 [=========================>....] - ETA: 14s - loss: 1.4796 - regression_loss: 1.2364 - classification_loss: 0.2431 443/500 [=========================>....] - ETA: 14s - loss: 1.4793 - regression_loss: 1.2362 - classification_loss: 0.2431 444/500 [=========================>....] - ETA: 14s - loss: 1.4772 - regression_loss: 1.2346 - classification_loss: 0.2427 445/500 [=========================>....] - ETA: 13s - loss: 1.4775 - regression_loss: 1.2347 - classification_loss: 0.2428 446/500 [=========================>....] - ETA: 13s - loss: 1.4759 - regression_loss: 1.2334 - classification_loss: 0.2425 447/500 [=========================>....] - ETA: 13s - loss: 1.4754 - regression_loss: 1.2331 - classification_loss: 0.2423 448/500 [=========================>....] - ETA: 13s - loss: 1.4763 - regression_loss: 1.2340 - classification_loss: 0.2423 449/500 [=========================>....] - ETA: 12s - loss: 1.4764 - regression_loss: 1.2341 - classification_loss: 0.2422 450/500 [==========================>...] - ETA: 12s - loss: 1.4773 - regression_loss: 1.2348 - classification_loss: 0.2425 451/500 [==========================>...] - ETA: 12s - loss: 1.4769 - regression_loss: 1.2346 - classification_loss: 0.2423 452/500 [==========================>...] - ETA: 12s - loss: 1.4786 - regression_loss: 1.2359 - classification_loss: 0.2426 453/500 [==========================>...] - ETA: 11s - loss: 1.4797 - regression_loss: 1.2371 - classification_loss: 0.2426 454/500 [==========================>...] - ETA: 11s - loss: 1.4799 - regression_loss: 1.2373 - classification_loss: 0.2425 455/500 [==========================>...] - ETA: 11s - loss: 1.4800 - regression_loss: 1.2376 - classification_loss: 0.2424 456/500 [==========================>...] - ETA: 11s - loss: 1.4808 - regression_loss: 1.2383 - classification_loss: 0.2425 457/500 [==========================>...] 
- ETA: 10s - loss: 1.4805 - regression_loss: 1.2380 - classification_loss: 0.2425 458/500 [==========================>...] - ETA: 10s - loss: 1.4793 - regression_loss: 1.2372 - classification_loss: 0.2422 459/500 [==========================>...] - ETA: 10s - loss: 1.4797 - regression_loss: 1.2376 - classification_loss: 0.2421 460/500 [==========================>...] - ETA: 10s - loss: 1.4801 - regression_loss: 1.2378 - classification_loss: 0.2423 461/500 [==========================>...] - ETA: 9s - loss: 1.4796 - regression_loss: 1.2376 - classification_loss: 0.2420  462/500 [==========================>...] - ETA: 9s - loss: 1.4801 - regression_loss: 1.2373 - classification_loss: 0.2428 463/500 [==========================>...] - ETA: 9s - loss: 1.4813 - regression_loss: 1.2383 - classification_loss: 0.2431 464/500 [==========================>...] - ETA: 9s - loss: 1.4799 - regression_loss: 1.2371 - classification_loss: 0.2428 465/500 [==========================>...] - ETA: 8s - loss: 1.4804 - regression_loss: 1.2375 - classification_loss: 0.2430 466/500 [==========================>...] - ETA: 8s - loss: 1.4793 - regression_loss: 1.2367 - classification_loss: 0.2426 467/500 [===========================>..] - ETA: 8s - loss: 1.4803 - regression_loss: 1.2375 - classification_loss: 0.2428 468/500 [===========================>..] - ETA: 8s - loss: 1.4809 - regression_loss: 1.2379 - classification_loss: 0.2429 469/500 [===========================>..] - ETA: 7s - loss: 1.4794 - regression_loss: 1.2369 - classification_loss: 0.2425 470/500 [===========================>..] - ETA: 7s - loss: 1.4794 - regression_loss: 1.2369 - classification_loss: 0.2425 471/500 [===========================>..] - ETA: 7s - loss: 1.4788 - regression_loss: 1.2365 - classification_loss: 0.2422 472/500 [===========================>..] - ETA: 7s - loss: 1.4792 - regression_loss: 1.2369 - classification_loss: 0.2423 473/500 [===========================>..] 
- ETA: 6s - loss: 1.4779 - regression_loss: 1.2358 - classification_loss: 0.2421 474/500 [===========================>..] - ETA: 6s - loss: 1.4767 - regression_loss: 1.2349 - classification_loss: 0.2418 475/500 [===========================>..] - ETA: 6s - loss: 1.4774 - regression_loss: 1.2353 - classification_loss: 0.2420 476/500 [===========================>..] - ETA: 6s - loss: 1.4779 - regression_loss: 1.2357 - classification_loss: 0.2422 477/500 [===========================>..] - ETA: 5s - loss: 1.4788 - regression_loss: 1.2365 - classification_loss: 0.2423 478/500 [===========================>..] - ETA: 5s - loss: 1.4794 - regression_loss: 1.2371 - classification_loss: 0.2423 479/500 [===========================>..] - ETA: 5s - loss: 1.4803 - regression_loss: 1.2377 - classification_loss: 0.2425 480/500 [===========================>..] - ETA: 5s - loss: 1.4785 - regression_loss: 1.2362 - classification_loss: 0.2423 481/500 [===========================>..] - ETA: 4s - loss: 1.4769 - regression_loss: 1.2349 - classification_loss: 0.2420 482/500 [===========================>..] - ETA: 4s - loss: 1.4768 - regression_loss: 1.2348 - classification_loss: 0.2420 483/500 [===========================>..] - ETA: 4s - loss: 1.4762 - regression_loss: 1.2344 - classification_loss: 0.2418 484/500 [============================>.] - ETA: 4s - loss: 1.4768 - regression_loss: 1.2347 - classification_loss: 0.2421 485/500 [============================>.] - ETA: 3s - loss: 1.4771 - regression_loss: 1.2350 - classification_loss: 0.2421 486/500 [============================>.] - ETA: 3s - loss: 1.4770 - regression_loss: 1.2350 - classification_loss: 0.2420 487/500 [============================>.] - ETA: 3s - loss: 1.4772 - regression_loss: 1.2353 - classification_loss: 0.2419 488/500 [============================>.] - ETA: 3s - loss: 1.4773 - regression_loss: 1.2354 - classification_loss: 0.2418 489/500 [============================>.] 
[per-step progress for steps 490–499 of epoch 104 elided; running loss ≈ 1.47]
500/500 [==============================] - 126s 252ms/step - loss: 1.4724 - regression_loss: 1.2316 - classification_loss: 0.2408
1172 instances of class plum with average precision: 0.6755
mAP: 0.6755
Epoch 00104: saving model to ./training/snapshots/resnet50_pascal_104.h5
Epoch 105/150
[per-step progress bars for steps 1–324 elided; running loss fluctuates between ~1.45 and ~1.69 over the first few steps, then settles near 1.47 (regression_loss ≈ 1.23, classification_loss ≈ 0.24) by step 324]
- ETA: 44s - loss: 1.4687 - regression_loss: 1.2295 - classification_loss: 0.2393 325/500 [==================>...........] - ETA: 43s - loss: 1.4667 - regression_loss: 1.2279 - classification_loss: 0.2387 326/500 [==================>...........] - ETA: 43s - loss: 1.4667 - regression_loss: 1.2280 - classification_loss: 0.2387 327/500 [==================>...........] - ETA: 43s - loss: 1.4683 - regression_loss: 1.2294 - classification_loss: 0.2389 328/500 [==================>...........] - ETA: 43s - loss: 1.4674 - regression_loss: 1.2288 - classification_loss: 0.2386 329/500 [==================>...........] - ETA: 42s - loss: 1.4679 - regression_loss: 1.2291 - classification_loss: 0.2388 330/500 [==================>...........] - ETA: 42s - loss: 1.4654 - regression_loss: 1.2271 - classification_loss: 0.2384 331/500 [==================>...........] - ETA: 42s - loss: 1.4656 - regression_loss: 1.2272 - classification_loss: 0.2385 332/500 [==================>...........] - ETA: 42s - loss: 1.4649 - regression_loss: 1.2267 - classification_loss: 0.2382 333/500 [==================>...........] - ETA: 41s - loss: 1.4654 - regression_loss: 1.2271 - classification_loss: 0.2382 334/500 [===================>..........] - ETA: 41s - loss: 1.4656 - regression_loss: 1.2275 - classification_loss: 0.2382 335/500 [===================>..........] - ETA: 41s - loss: 1.4659 - regression_loss: 1.2277 - classification_loss: 0.2382 336/500 [===================>..........] - ETA: 41s - loss: 1.4648 - regression_loss: 1.2267 - classification_loss: 0.2380 337/500 [===================>..........] - ETA: 40s - loss: 1.4662 - regression_loss: 1.2279 - classification_loss: 0.2383 338/500 [===================>..........] - ETA: 40s - loss: 1.4643 - regression_loss: 1.2263 - classification_loss: 0.2380 339/500 [===================>..........] - ETA: 40s - loss: 1.4631 - regression_loss: 1.2253 - classification_loss: 0.2378 340/500 [===================>..........] 
- ETA: 40s - loss: 1.4626 - regression_loss: 1.2250 - classification_loss: 0.2377 341/500 [===================>..........] - ETA: 39s - loss: 1.4639 - regression_loss: 1.2259 - classification_loss: 0.2379 342/500 [===================>..........] - ETA: 39s - loss: 1.4641 - regression_loss: 1.2262 - classification_loss: 0.2380 343/500 [===================>..........] - ETA: 39s - loss: 1.4627 - regression_loss: 1.2250 - classification_loss: 0.2377 344/500 [===================>..........] - ETA: 39s - loss: 1.4618 - regression_loss: 1.2244 - classification_loss: 0.2373 345/500 [===================>..........] - ETA: 38s - loss: 1.4619 - regression_loss: 1.2245 - classification_loss: 0.2375 346/500 [===================>..........] - ETA: 38s - loss: 1.4627 - regression_loss: 1.2251 - classification_loss: 0.2376 347/500 [===================>..........] - ETA: 38s - loss: 1.4609 - regression_loss: 1.2235 - classification_loss: 0.2374 348/500 [===================>..........] - ETA: 38s - loss: 1.4623 - regression_loss: 1.2248 - classification_loss: 0.2376 349/500 [===================>..........] - ETA: 37s - loss: 1.4636 - regression_loss: 1.2257 - classification_loss: 0.2379 350/500 [====================>.........] - ETA: 37s - loss: 1.4644 - regression_loss: 1.2263 - classification_loss: 0.2382 351/500 [====================>.........] - ETA: 37s - loss: 1.4650 - regression_loss: 1.2268 - classification_loss: 0.2382 352/500 [====================>.........] - ETA: 37s - loss: 1.4651 - regression_loss: 1.2270 - classification_loss: 0.2381 353/500 [====================>.........] - ETA: 36s - loss: 1.4657 - regression_loss: 1.2275 - classification_loss: 0.2382 354/500 [====================>.........] - ETA: 36s - loss: 1.4670 - regression_loss: 1.2286 - classification_loss: 0.2384 355/500 [====================>.........] - ETA: 36s - loss: 1.4660 - regression_loss: 1.2281 - classification_loss: 0.2379 356/500 [====================>.........] 
- ETA: 36s - loss: 1.4664 - regression_loss: 1.2286 - classification_loss: 0.2378 357/500 [====================>.........] - ETA: 35s - loss: 1.4672 - regression_loss: 1.2292 - classification_loss: 0.2380 358/500 [====================>.........] - ETA: 35s - loss: 1.4670 - regression_loss: 1.2290 - classification_loss: 0.2381 359/500 [====================>.........] - ETA: 35s - loss: 1.4671 - regression_loss: 1.2290 - classification_loss: 0.2382 360/500 [====================>.........] - ETA: 35s - loss: 1.4676 - regression_loss: 1.2294 - classification_loss: 0.2382 361/500 [====================>.........] - ETA: 34s - loss: 1.4664 - regression_loss: 1.2280 - classification_loss: 0.2384 362/500 [====================>.........] - ETA: 34s - loss: 1.4659 - regression_loss: 1.2273 - classification_loss: 0.2386 363/500 [====================>.........] - ETA: 34s - loss: 1.4674 - regression_loss: 1.2287 - classification_loss: 0.2387 364/500 [====================>.........] - ETA: 34s - loss: 1.4682 - regression_loss: 1.2290 - classification_loss: 0.2393 365/500 [====================>.........] - ETA: 33s - loss: 1.4688 - regression_loss: 1.2293 - classification_loss: 0.2395 366/500 [====================>.........] - ETA: 33s - loss: 1.4665 - regression_loss: 1.2274 - classification_loss: 0.2391 367/500 [=====================>........] - ETA: 33s - loss: 1.4647 - regression_loss: 1.2259 - classification_loss: 0.2388 368/500 [=====================>........] - ETA: 33s - loss: 1.4648 - regression_loss: 1.2261 - classification_loss: 0.2387 369/500 [=====================>........] - ETA: 32s - loss: 1.4647 - regression_loss: 1.2259 - classification_loss: 0.2388 370/500 [=====================>........] - ETA: 32s - loss: 1.4655 - regression_loss: 1.2267 - classification_loss: 0.2388 371/500 [=====================>........] - ETA: 32s - loss: 1.4664 - regression_loss: 1.2275 - classification_loss: 0.2389 372/500 [=====================>........] 
- ETA: 32s - loss: 1.4665 - regression_loss: 1.2275 - classification_loss: 0.2390 373/500 [=====================>........] - ETA: 31s - loss: 1.4651 - regression_loss: 1.2265 - classification_loss: 0.2386 374/500 [=====================>........] - ETA: 31s - loss: 1.4636 - regression_loss: 1.2252 - classification_loss: 0.2383 375/500 [=====================>........] - ETA: 31s - loss: 1.4652 - regression_loss: 1.2264 - classification_loss: 0.2388 376/500 [=====================>........] - ETA: 31s - loss: 1.4665 - regression_loss: 1.2274 - classification_loss: 0.2391 377/500 [=====================>........] - ETA: 30s - loss: 1.4673 - regression_loss: 1.2282 - classification_loss: 0.2392 378/500 [=====================>........] - ETA: 30s - loss: 1.4675 - regression_loss: 1.2282 - classification_loss: 0.2394 379/500 [=====================>........] - ETA: 30s - loss: 1.4681 - regression_loss: 1.2284 - classification_loss: 0.2396 380/500 [=====================>........] - ETA: 30s - loss: 1.4691 - regression_loss: 1.2294 - classification_loss: 0.2397 381/500 [=====================>........] - ETA: 29s - loss: 1.4683 - regression_loss: 1.2288 - classification_loss: 0.2395 382/500 [=====================>........] - ETA: 29s - loss: 1.4691 - regression_loss: 1.2295 - classification_loss: 0.2397 383/500 [=====================>........] - ETA: 29s - loss: 1.4686 - regression_loss: 1.2291 - classification_loss: 0.2395 384/500 [======================>.......] - ETA: 29s - loss: 1.4676 - regression_loss: 1.2282 - classification_loss: 0.2394 385/500 [======================>.......] - ETA: 28s - loss: 1.4654 - regression_loss: 1.2264 - classification_loss: 0.2390 386/500 [======================>.......] - ETA: 28s - loss: 1.4641 - regression_loss: 1.2253 - classification_loss: 0.2388 387/500 [======================>.......] - ETA: 28s - loss: 1.4643 - regression_loss: 1.2256 - classification_loss: 0.2387 388/500 [======================>.......] 
- ETA: 28s - loss: 1.4626 - regression_loss: 1.2243 - classification_loss: 0.2384 389/500 [======================>.......] - ETA: 27s - loss: 1.4632 - regression_loss: 1.2247 - classification_loss: 0.2384 390/500 [======================>.......] - ETA: 27s - loss: 1.4648 - regression_loss: 1.2258 - classification_loss: 0.2390 391/500 [======================>.......] - ETA: 27s - loss: 1.4630 - regression_loss: 1.2241 - classification_loss: 0.2388 392/500 [======================>.......] - ETA: 27s - loss: 1.4624 - regression_loss: 1.2237 - classification_loss: 0.2386 393/500 [======================>.......] - ETA: 26s - loss: 1.4636 - regression_loss: 1.2246 - classification_loss: 0.2390 394/500 [======================>.......] - ETA: 26s - loss: 1.4639 - regression_loss: 1.2247 - classification_loss: 0.2392 395/500 [======================>.......] - ETA: 26s - loss: 1.4622 - regression_loss: 1.2231 - classification_loss: 0.2390 396/500 [======================>.......] - ETA: 26s - loss: 1.4620 - regression_loss: 1.2231 - classification_loss: 0.2389 397/500 [======================>.......] - ETA: 25s - loss: 1.4605 - regression_loss: 1.2218 - classification_loss: 0.2386 398/500 [======================>.......] - ETA: 25s - loss: 1.4623 - regression_loss: 1.2229 - classification_loss: 0.2395 399/500 [======================>.......] - ETA: 25s - loss: 1.4618 - regression_loss: 1.2220 - classification_loss: 0.2397 400/500 [=======================>......] - ETA: 25s - loss: 1.4620 - regression_loss: 1.2223 - classification_loss: 0.2397 401/500 [=======================>......] - ETA: 24s - loss: 1.4620 - regression_loss: 1.2223 - classification_loss: 0.2397 402/500 [=======================>......] - ETA: 24s - loss: 1.4618 - regression_loss: 1.2222 - classification_loss: 0.2396 403/500 [=======================>......] - ETA: 24s - loss: 1.4626 - regression_loss: 1.2229 - classification_loss: 0.2397 404/500 [=======================>......] 
- ETA: 24s - loss: 1.4626 - regression_loss: 1.2229 - classification_loss: 0.2397 405/500 [=======================>......] - ETA: 23s - loss: 1.4621 - regression_loss: 1.2226 - classification_loss: 0.2395 406/500 [=======================>......] - ETA: 23s - loss: 1.4622 - regression_loss: 1.2227 - classification_loss: 0.2395 407/500 [=======================>......] - ETA: 23s - loss: 1.4621 - regression_loss: 1.2226 - classification_loss: 0.2395 408/500 [=======================>......] - ETA: 23s - loss: 1.4609 - regression_loss: 1.2217 - classification_loss: 0.2392 409/500 [=======================>......] - ETA: 22s - loss: 1.4621 - regression_loss: 1.2226 - classification_loss: 0.2395 410/500 [=======================>......] - ETA: 22s - loss: 1.4615 - regression_loss: 1.2220 - classification_loss: 0.2396 411/500 [=======================>......] - ETA: 22s - loss: 1.4613 - regression_loss: 1.2219 - classification_loss: 0.2394 412/500 [=======================>......] - ETA: 22s - loss: 1.4615 - regression_loss: 1.2220 - classification_loss: 0.2395 413/500 [=======================>......] - ETA: 21s - loss: 1.4621 - regression_loss: 1.2224 - classification_loss: 0.2397 414/500 [=======================>......] - ETA: 21s - loss: 1.4621 - regression_loss: 1.2225 - classification_loss: 0.2396 415/500 [=======================>......] - ETA: 21s - loss: 1.4625 - regression_loss: 1.2229 - classification_loss: 0.2396 416/500 [=======================>......] - ETA: 21s - loss: 1.4630 - regression_loss: 1.2233 - classification_loss: 0.2398 417/500 [========================>.....] - ETA: 20s - loss: 1.4633 - regression_loss: 1.2236 - classification_loss: 0.2397 418/500 [========================>.....] - ETA: 20s - loss: 1.4657 - regression_loss: 1.2257 - classification_loss: 0.2400 419/500 [========================>.....] - ETA: 20s - loss: 1.4655 - regression_loss: 1.2255 - classification_loss: 0.2400 420/500 [========================>.....] 
- ETA: 20s - loss: 1.4672 - regression_loss: 1.2268 - classification_loss: 0.2405 421/500 [========================>.....] - ETA: 19s - loss: 1.4678 - regression_loss: 1.2270 - classification_loss: 0.2408 422/500 [========================>.....] - ETA: 19s - loss: 1.4670 - regression_loss: 1.2263 - classification_loss: 0.2407 423/500 [========================>.....] - ETA: 19s - loss: 1.4670 - regression_loss: 1.2263 - classification_loss: 0.2407 424/500 [========================>.....] - ETA: 19s - loss: 1.4675 - regression_loss: 1.2268 - classification_loss: 0.2408 425/500 [========================>.....] - ETA: 18s - loss: 1.4682 - regression_loss: 1.2273 - classification_loss: 0.2409 426/500 [========================>.....] - ETA: 18s - loss: 1.4682 - regression_loss: 1.2273 - classification_loss: 0.2409 427/500 [========================>.....] - ETA: 18s - loss: 1.4673 - regression_loss: 1.2265 - classification_loss: 0.2408 428/500 [========================>.....] - ETA: 18s - loss: 1.4676 - regression_loss: 1.2268 - classification_loss: 0.2408 429/500 [========================>.....] - ETA: 17s - loss: 1.4677 - regression_loss: 1.2269 - classification_loss: 0.2408 430/500 [========================>.....] - ETA: 17s - loss: 1.4674 - regression_loss: 1.2266 - classification_loss: 0.2407 431/500 [========================>.....] - ETA: 17s - loss: 1.4662 - regression_loss: 1.2258 - classification_loss: 0.2404 432/500 [========================>.....] - ETA: 17s - loss: 1.4671 - regression_loss: 1.2266 - classification_loss: 0.2405 433/500 [========================>.....] - ETA: 16s - loss: 1.4689 - regression_loss: 1.2282 - classification_loss: 0.2407 434/500 [=========================>....] - ETA: 16s - loss: 1.4677 - regression_loss: 1.2272 - classification_loss: 0.2405 435/500 [=========================>....] - ETA: 16s - loss: 1.4662 - regression_loss: 1.2260 - classification_loss: 0.2402 436/500 [=========================>....] 
- ETA: 16s - loss: 1.4662 - regression_loss: 1.2259 - classification_loss: 0.2403 437/500 [=========================>....] - ETA: 15s - loss: 1.4671 - regression_loss: 1.2266 - classification_loss: 0.2405 438/500 [=========================>....] - ETA: 15s - loss: 1.4668 - regression_loss: 1.2264 - classification_loss: 0.2404 439/500 [=========================>....] - ETA: 15s - loss: 1.4669 - regression_loss: 1.2264 - classification_loss: 0.2404 440/500 [=========================>....] - ETA: 15s - loss: 1.4652 - regression_loss: 1.2251 - classification_loss: 0.2401 441/500 [=========================>....] - ETA: 14s - loss: 1.4646 - regression_loss: 1.2247 - classification_loss: 0.2400 442/500 [=========================>....] - ETA: 14s - loss: 1.4654 - regression_loss: 1.2255 - classification_loss: 0.2399 443/500 [=========================>....] - ETA: 14s - loss: 1.4674 - regression_loss: 1.2268 - classification_loss: 0.2406 444/500 [=========================>....] - ETA: 14s - loss: 1.4679 - regression_loss: 1.2271 - classification_loss: 0.2407 445/500 [=========================>....] - ETA: 13s - loss: 1.4681 - regression_loss: 1.2272 - classification_loss: 0.2409 446/500 [=========================>....] - ETA: 13s - loss: 1.4675 - regression_loss: 1.2268 - classification_loss: 0.2408 447/500 [=========================>....] - ETA: 13s - loss: 1.4677 - regression_loss: 1.2270 - classification_loss: 0.2407 448/500 [=========================>....] - ETA: 13s - loss: 1.4678 - regression_loss: 1.2272 - classification_loss: 0.2407 449/500 [=========================>....] - ETA: 12s - loss: 1.4678 - regression_loss: 1.2273 - classification_loss: 0.2406 450/500 [==========================>...] - ETA: 12s - loss: 1.4676 - regression_loss: 1.2271 - classification_loss: 0.2405 451/500 [==========================>...] - ETA: 12s - loss: 1.4681 - regression_loss: 1.2275 - classification_loss: 0.2406 452/500 [==========================>...] 
- ETA: 12s - loss: 1.4666 - regression_loss: 1.2264 - classification_loss: 0.2402 453/500 [==========================>...] - ETA: 11s - loss: 1.4684 - regression_loss: 1.2279 - classification_loss: 0.2405 454/500 [==========================>...] - ETA: 11s - loss: 1.4674 - regression_loss: 1.2270 - classification_loss: 0.2404 455/500 [==========================>...] - ETA: 11s - loss: 1.4672 - regression_loss: 1.2270 - classification_loss: 0.2402 456/500 [==========================>...] - ETA: 11s - loss: 1.4667 - regression_loss: 1.2265 - classification_loss: 0.2401 457/500 [==========================>...] - ETA: 10s - loss: 1.4660 - regression_loss: 1.2261 - classification_loss: 0.2400 458/500 [==========================>...] - ETA: 10s - loss: 1.4651 - regression_loss: 1.2252 - classification_loss: 0.2398 459/500 [==========================>...] - ETA: 10s - loss: 1.4664 - regression_loss: 1.2264 - classification_loss: 0.2400 460/500 [==========================>...] - ETA: 10s - loss: 1.4650 - regression_loss: 1.2251 - classification_loss: 0.2399 461/500 [==========================>...] - ETA: 9s - loss: 1.4650 - regression_loss: 1.2251 - classification_loss: 0.2398  462/500 [==========================>...] - ETA: 9s - loss: 1.4650 - regression_loss: 1.2253 - classification_loss: 0.2397 463/500 [==========================>...] - ETA: 9s - loss: 1.4655 - regression_loss: 1.2257 - classification_loss: 0.2398 464/500 [==========================>...] - ETA: 9s - loss: 1.4665 - regression_loss: 1.2264 - classification_loss: 0.2401 465/500 [==========================>...] - ETA: 8s - loss: 1.4670 - regression_loss: 1.2267 - classification_loss: 0.2402 466/500 [==========================>...] - ETA: 8s - loss: 1.4673 - regression_loss: 1.2271 - classification_loss: 0.2402 467/500 [===========================>..] - ETA: 8s - loss: 1.4655 - regression_loss: 1.2257 - classification_loss: 0.2399 468/500 [===========================>..] 
- ETA: 8s - loss: 1.4651 - regression_loss: 1.2253 - classification_loss: 0.2398 469/500 [===========================>..] - ETA: 7s - loss: 1.4651 - regression_loss: 1.2252 - classification_loss: 0.2399 470/500 [===========================>..] - ETA: 7s - loss: 1.4641 - regression_loss: 1.2244 - classification_loss: 0.2397 471/500 [===========================>..] - ETA: 7s - loss: 1.4644 - regression_loss: 1.2247 - classification_loss: 0.2397 472/500 [===========================>..] - ETA: 7s - loss: 1.4644 - regression_loss: 1.2247 - classification_loss: 0.2397 473/500 [===========================>..] - ETA: 6s - loss: 1.4635 - regression_loss: 1.2241 - classification_loss: 0.2394 474/500 [===========================>..] - ETA: 6s - loss: 1.4638 - regression_loss: 1.2245 - classification_loss: 0.2393 475/500 [===========================>..] - ETA: 6s - loss: 1.4618 - regression_loss: 1.2229 - classification_loss: 0.2389 476/500 [===========================>..] - ETA: 6s - loss: 1.4623 - regression_loss: 1.2234 - classification_loss: 0.2389 477/500 [===========================>..] - ETA: 5s - loss: 1.4609 - regression_loss: 1.2222 - classification_loss: 0.2386 478/500 [===========================>..] - ETA: 5s - loss: 1.4612 - regression_loss: 1.2223 - classification_loss: 0.2388 479/500 [===========================>..] - ETA: 5s - loss: 1.4602 - regression_loss: 1.2216 - classification_loss: 0.2386 480/500 [===========================>..] - ETA: 5s - loss: 1.4598 - regression_loss: 1.2212 - classification_loss: 0.2386 481/500 [===========================>..] - ETA: 4s - loss: 1.4595 - regression_loss: 1.2209 - classification_loss: 0.2386 482/500 [===========================>..] - ETA: 4s - loss: 1.4593 - regression_loss: 1.2208 - classification_loss: 0.2385 483/500 [===========================>..] - ETA: 4s - loss: 1.4588 - regression_loss: 1.2205 - classification_loss: 0.2383 484/500 [============================>.] 
- ETA: 4s - loss: 1.4595 - regression_loss: 1.2210 - classification_loss: 0.2385 485/500 [============================>.] - ETA: 3s - loss: 1.4594 - regression_loss: 1.2210 - classification_loss: 0.2384 486/500 [============================>.] - ETA: 3s - loss: 1.4597 - regression_loss: 1.2213 - classification_loss: 0.2384 487/500 [============================>.] - ETA: 3s - loss: 1.4595 - regression_loss: 1.2212 - classification_loss: 0.2383 488/500 [============================>.] - ETA: 3s - loss: 1.4603 - regression_loss: 1.2218 - classification_loss: 0.2384 489/500 [============================>.] - ETA: 2s - loss: 1.4605 - regression_loss: 1.2220 - classification_loss: 0.2386 490/500 [============================>.] - ETA: 2s - loss: 1.4589 - regression_loss: 1.2207 - classification_loss: 0.2382 491/500 [============================>.] - ETA: 2s - loss: 1.4580 - regression_loss: 1.2199 - classification_loss: 0.2381 492/500 [============================>.] - ETA: 2s - loss: 1.4565 - regression_loss: 1.2187 - classification_loss: 0.2378 493/500 [============================>.] - ETA: 1s - loss: 1.4570 - regression_loss: 1.2192 - classification_loss: 0.2378 494/500 [============================>.] - ETA: 1s - loss: 1.4566 - regression_loss: 1.2188 - classification_loss: 0.2378 495/500 [============================>.] - ETA: 1s - loss: 1.4571 - regression_loss: 1.2192 - classification_loss: 0.2378 496/500 [============================>.] - ETA: 1s - loss: 1.4569 - regression_loss: 1.2188 - classification_loss: 0.2381 497/500 [============================>.] - ETA: 0s - loss: 1.4581 - regression_loss: 1.2201 - classification_loss: 0.2380 498/500 [============================>.] - ETA: 0s - loss: 1.4585 - regression_loss: 1.2204 - classification_loss: 0.2381 499/500 [============================>.] 
- ETA: 0s - loss: 1.4596 - regression_loss: 1.2212 - classification_loss: 0.2384 500/500 [==============================] - 126s 251ms/step - loss: 1.4588 - regression_loss: 1.2203 - classification_loss: 0.2385 1172 instances of class plum with average precision: 0.6563 mAP: 0.6563 Epoch 00105: saving model to ./training/snapshots/resnet50_pascal_105.h5 Epoch 106/150 1/500 [..............................] - ETA: 1:55 - loss: 1.1789 - regression_loss: 0.9984 - classification_loss: 0.1805 2/500 [..............................] - ETA: 2:00 - loss: 1.3939 - regression_loss: 1.1884 - classification_loss: 0.2055 3/500 [..............................] - ETA: 2:01 - loss: 1.5092 - regression_loss: 1.2749 - classification_loss: 0.2343 4/500 [..............................] - ETA: 2:02 - loss: 1.4674 - regression_loss: 1.2452 - classification_loss: 0.2222 5/500 [..............................] - ETA: 2:03 - loss: 1.5649 - regression_loss: 1.3391 - classification_loss: 0.2257 6/500 [..............................] - ETA: 2:03 - loss: 1.5093 - regression_loss: 1.2910 - classification_loss: 0.2183 7/500 [..............................] - ETA: 2:03 - loss: 1.5546 - regression_loss: 1.3260 - classification_loss: 0.2285 8/500 [..............................] - ETA: 2:03 - loss: 1.5668 - regression_loss: 1.3348 - classification_loss: 0.2320 9/500 [..............................] - ETA: 2:03 - loss: 1.5602 - regression_loss: 1.3285 - classification_loss: 0.2316 10/500 [..............................] - ETA: 2:02 - loss: 1.5255 - regression_loss: 1.2828 - classification_loss: 0.2426 11/500 [..............................] - ETA: 2:02 - loss: 1.4763 - regression_loss: 1.2422 - classification_loss: 0.2341 12/500 [..............................] - ETA: 2:01 - loss: 1.4436 - regression_loss: 1.2123 - classification_loss: 0.2313 13/500 [..............................] - ETA: 2:01 - loss: 1.4101 - regression_loss: 1.1841 - classification_loss: 0.2260 14/500 [..............................] 
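In these logs, the reported `loss` is simply the sum of `regression_loss` and `classification_loss` (the two heads of the RetinaNet objective). A minimal sketch of that bookkeeping; the helper name is illustrative, not from the keras-retinanet codebase:

```python
def total_loss(regression_loss: float, classification_loss: float) -> float:
    # Keras sums the per-output losses into the single "loss" value
    # shown in the progress bar.
    return regression_loss + classification_loss

# Final values from epoch 105 above:
print(round(total_loss(1.2203, 0.2385), 4))  # → 1.4588
```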
[... per-batch progress updates for epoch 106 condensed; last reported: 93/500 - ETA: 1:42 - loss: 1.4238 - regression_loss: 1.2038 - classification_loss: 0.2200 ...]
- ETA: 1:42 - loss: 1.4238 - regression_loss: 1.2032 - classification_loss: 0.2206 95/500 [====>.........................] - ETA: 1:42 - loss: 1.4236 - regression_loss: 1.2029 - classification_loss: 0.2207 96/500 [====>.........................] - ETA: 1:41 - loss: 1.4300 - regression_loss: 1.2078 - classification_loss: 0.2222 97/500 [====>.........................] - ETA: 1:41 - loss: 1.4316 - regression_loss: 1.2088 - classification_loss: 0.2228 98/500 [====>.........................] - ETA: 1:41 - loss: 1.4380 - regression_loss: 1.2136 - classification_loss: 0.2244 99/500 [====>.........................] - ETA: 1:41 - loss: 1.4402 - regression_loss: 1.2155 - classification_loss: 0.2247 100/500 [=====>........................] - ETA: 1:40 - loss: 1.4411 - regression_loss: 1.2165 - classification_loss: 0.2246 101/500 [=====>........................] - ETA: 1:40 - loss: 1.4428 - regression_loss: 1.2173 - classification_loss: 0.2255 102/500 [=====>........................] - ETA: 1:40 - loss: 1.4440 - regression_loss: 1.2179 - classification_loss: 0.2261 103/500 [=====>........................] - ETA: 1:39 - loss: 1.4441 - regression_loss: 1.2183 - classification_loss: 0.2258 104/500 [=====>........................] - ETA: 1:39 - loss: 1.4459 - regression_loss: 1.2197 - classification_loss: 0.2262 105/500 [=====>........................] - ETA: 1:39 - loss: 1.4496 - regression_loss: 1.2222 - classification_loss: 0.2273 106/500 [=====>........................] - ETA: 1:39 - loss: 1.4462 - regression_loss: 1.2197 - classification_loss: 0.2264 107/500 [=====>........................] - ETA: 1:38 - loss: 1.4428 - regression_loss: 1.2170 - classification_loss: 0.2258 108/500 [=====>........................] - ETA: 1:38 - loss: 1.4450 - regression_loss: 1.2187 - classification_loss: 0.2263 109/500 [=====>........................] - ETA: 1:38 - loss: 1.4370 - regression_loss: 1.2121 - classification_loss: 0.2249 110/500 [=====>........................] 
- ETA: 1:38 - loss: 1.4392 - regression_loss: 1.2140 - classification_loss: 0.2251 111/500 [=====>........................] - ETA: 1:37 - loss: 1.4434 - regression_loss: 1.2163 - classification_loss: 0.2271 112/500 [=====>........................] - ETA: 1:37 - loss: 1.4482 - regression_loss: 1.2197 - classification_loss: 0.2285 113/500 [=====>........................] - ETA: 1:37 - loss: 1.4538 - regression_loss: 1.2261 - classification_loss: 0.2277 114/500 [=====>........................] - ETA: 1:37 - loss: 1.4548 - regression_loss: 1.2269 - classification_loss: 0.2279 115/500 [=====>........................] - ETA: 1:36 - loss: 1.4572 - regression_loss: 1.2291 - classification_loss: 0.2281 116/500 [=====>........................] - ETA: 1:36 - loss: 1.4609 - regression_loss: 1.2321 - classification_loss: 0.2288 117/500 [======>.......................] - ETA: 1:36 - loss: 1.4705 - regression_loss: 1.2389 - classification_loss: 0.2316 118/500 [======>.......................] - ETA: 1:36 - loss: 1.4747 - regression_loss: 1.2424 - classification_loss: 0.2322 119/500 [======>.......................] - ETA: 1:35 - loss: 1.4807 - regression_loss: 1.2469 - classification_loss: 0.2338 120/500 [======>.......................] - ETA: 1:35 - loss: 1.4836 - regression_loss: 1.2496 - classification_loss: 0.2339 121/500 [======>.......................] - ETA: 1:35 - loss: 1.4800 - regression_loss: 1.2457 - classification_loss: 0.2342 122/500 [======>.......................] - ETA: 1:35 - loss: 1.4790 - regression_loss: 1.2448 - classification_loss: 0.2342 123/500 [======>.......................] - ETA: 1:35 - loss: 1.4826 - regression_loss: 1.2479 - classification_loss: 0.2346 124/500 [======>.......................] - ETA: 1:34 - loss: 1.4761 - regression_loss: 1.2427 - classification_loss: 0.2334 125/500 [======>.......................] - ETA: 1:34 - loss: 1.4735 - regression_loss: 1.2403 - classification_loss: 0.2332 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.4774 - regression_loss: 1.2437 - classification_loss: 0.2337 127/500 [======>.......................] - ETA: 1:33 - loss: 1.4792 - regression_loss: 1.2452 - classification_loss: 0.2340 128/500 [======>.......................] - ETA: 1:33 - loss: 1.4747 - regression_loss: 1.2418 - classification_loss: 0.2329 129/500 [======>.......................] - ETA: 1:32 - loss: 1.4756 - regression_loss: 1.2427 - classification_loss: 0.2329 130/500 [======>.......................] - ETA: 1:32 - loss: 1.4745 - regression_loss: 1.2419 - classification_loss: 0.2326 131/500 [======>.......................] - ETA: 1:32 - loss: 1.4699 - regression_loss: 1.2377 - classification_loss: 0.2322 132/500 [======>.......................] - ETA: 1:32 - loss: 1.4679 - regression_loss: 1.2364 - classification_loss: 0.2315 133/500 [======>.......................] - ETA: 1:31 - loss: 1.4639 - regression_loss: 1.2326 - classification_loss: 0.2313 134/500 [=======>......................] - ETA: 1:31 - loss: 1.4670 - regression_loss: 1.2354 - classification_loss: 0.2316 135/500 [=======>......................] - ETA: 1:31 - loss: 1.4680 - regression_loss: 1.2363 - classification_loss: 0.2317 136/500 [=======>......................] - ETA: 1:31 - loss: 1.4673 - regression_loss: 1.2359 - classification_loss: 0.2314 137/500 [=======>......................] - ETA: 1:30 - loss: 1.4637 - regression_loss: 1.2332 - classification_loss: 0.2306 138/500 [=======>......................] - ETA: 1:30 - loss: 1.4639 - regression_loss: 1.2336 - classification_loss: 0.2303 139/500 [=======>......................] - ETA: 1:30 - loss: 1.4669 - regression_loss: 1.2360 - classification_loss: 0.2308 140/500 [=======>......................] - ETA: 1:30 - loss: 1.4674 - regression_loss: 1.2367 - classification_loss: 0.2307 141/500 [=======>......................] - ETA: 1:29 - loss: 1.4708 - regression_loss: 1.2403 - classification_loss: 0.2305 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.4663 - regression_loss: 1.2367 - classification_loss: 0.2296 143/500 [=======>......................] - ETA: 1:29 - loss: 1.4665 - regression_loss: 1.2371 - classification_loss: 0.2294 144/500 [=======>......................] - ETA: 1:29 - loss: 1.4681 - regression_loss: 1.2385 - classification_loss: 0.2295 145/500 [=======>......................] - ETA: 1:28 - loss: 1.4702 - regression_loss: 1.2395 - classification_loss: 0.2308 146/500 [=======>......................] - ETA: 1:28 - loss: 1.4729 - regression_loss: 1.2417 - classification_loss: 0.2312 147/500 [=======>......................] - ETA: 1:28 - loss: 1.4767 - regression_loss: 1.2451 - classification_loss: 0.2316 148/500 [=======>......................] - ETA: 1:28 - loss: 1.4751 - regression_loss: 1.2438 - classification_loss: 0.2313 149/500 [=======>......................] - ETA: 1:27 - loss: 1.4763 - regression_loss: 1.2446 - classification_loss: 0.2317 150/500 [========>.....................] - ETA: 1:27 - loss: 1.4726 - regression_loss: 1.2409 - classification_loss: 0.2317 151/500 [========>.....................] - ETA: 1:27 - loss: 1.4757 - regression_loss: 1.2437 - classification_loss: 0.2319 152/500 [========>.....................] - ETA: 1:27 - loss: 1.4729 - regression_loss: 1.2417 - classification_loss: 0.2312 153/500 [========>.....................] - ETA: 1:26 - loss: 1.4728 - regression_loss: 1.2417 - classification_loss: 0.2311 154/500 [========>.....................] - ETA: 1:26 - loss: 1.4757 - regression_loss: 1.2448 - classification_loss: 0.2309 155/500 [========>.....................] - ETA: 1:26 - loss: 1.4715 - regression_loss: 1.2417 - classification_loss: 0.2298 156/500 [========>.....................] - ETA: 1:26 - loss: 1.4717 - regression_loss: 1.2418 - classification_loss: 0.2299 157/500 [========>.....................] - ETA: 1:26 - loss: 1.4751 - regression_loss: 1.2447 - classification_loss: 0.2304 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.4758 - regression_loss: 1.2454 - classification_loss: 0.2304 159/500 [========>.....................] - ETA: 1:25 - loss: 1.4765 - regression_loss: 1.2462 - classification_loss: 0.2303 160/500 [========>.....................] - ETA: 1:25 - loss: 1.4747 - regression_loss: 1.2447 - classification_loss: 0.2299 161/500 [========>.....................] - ETA: 1:25 - loss: 1.4718 - regression_loss: 1.2426 - classification_loss: 0.2293 162/500 [========>.....................] - ETA: 1:24 - loss: 1.4726 - regression_loss: 1.2430 - classification_loss: 0.2296 163/500 [========>.....................] - ETA: 1:24 - loss: 1.4728 - regression_loss: 1.2434 - classification_loss: 0.2295 164/500 [========>.....................] - ETA: 1:24 - loss: 1.4727 - regression_loss: 1.2427 - classification_loss: 0.2300 165/500 [========>.....................] - ETA: 1:24 - loss: 1.4753 - regression_loss: 1.2448 - classification_loss: 0.2305 166/500 [========>.....................] - ETA: 1:23 - loss: 1.4738 - regression_loss: 1.2439 - classification_loss: 0.2300 167/500 [=========>....................] - ETA: 1:23 - loss: 1.4761 - regression_loss: 1.2453 - classification_loss: 0.2308 168/500 [=========>....................] - ETA: 1:23 - loss: 1.4739 - regression_loss: 1.2436 - classification_loss: 0.2302 169/500 [=========>....................] - ETA: 1:23 - loss: 1.4706 - regression_loss: 1.2408 - classification_loss: 0.2298 170/500 [=========>....................] - ETA: 1:22 - loss: 1.4695 - regression_loss: 1.2400 - classification_loss: 0.2295 171/500 [=========>....................] - ETA: 1:22 - loss: 1.4708 - regression_loss: 1.2413 - classification_loss: 0.2295 172/500 [=========>....................] - ETA: 1:22 - loss: 1.4693 - regression_loss: 1.2399 - classification_loss: 0.2293 173/500 [=========>....................] - ETA: 1:21 - loss: 1.4659 - regression_loss: 1.2370 - classification_loss: 0.2289 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.4671 - regression_loss: 1.2380 - classification_loss: 0.2291 175/500 [=========>....................] - ETA: 1:21 - loss: 1.4682 - regression_loss: 1.2392 - classification_loss: 0.2290 176/500 [=========>....................] - ETA: 1:21 - loss: 1.4704 - regression_loss: 1.2407 - classification_loss: 0.2298 177/500 [=========>....................] - ETA: 1:21 - loss: 1.4643 - regression_loss: 1.2352 - classification_loss: 0.2290 178/500 [=========>....................] - ETA: 1:20 - loss: 1.4598 - regression_loss: 1.2315 - classification_loss: 0.2283 179/500 [=========>....................] - ETA: 1:20 - loss: 1.4575 - regression_loss: 1.2298 - classification_loss: 0.2277 180/500 [=========>....................] - ETA: 1:20 - loss: 1.4572 - regression_loss: 1.2295 - classification_loss: 0.2276 181/500 [=========>....................] - ETA: 1:19 - loss: 1.4549 - regression_loss: 1.2277 - classification_loss: 0.2272 182/500 [=========>....................] - ETA: 1:19 - loss: 1.4519 - regression_loss: 1.2251 - classification_loss: 0.2268 183/500 [=========>....................] - ETA: 1:19 - loss: 1.4526 - regression_loss: 1.2256 - classification_loss: 0.2270 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4521 - regression_loss: 1.2251 - classification_loss: 0.2270 185/500 [==========>...................] - ETA: 1:18 - loss: 1.4536 - regression_loss: 1.2262 - classification_loss: 0.2273 186/500 [==========>...................] - ETA: 1:18 - loss: 1.4480 - regression_loss: 1.2216 - classification_loss: 0.2264 187/500 [==========>...................] - ETA: 1:18 - loss: 1.4511 - regression_loss: 1.2243 - classification_loss: 0.2269 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4473 - regression_loss: 1.2212 - classification_loss: 0.2261 189/500 [==========>...................] - ETA: 1:17 - loss: 1.4484 - regression_loss: 1.2221 - classification_loss: 0.2262 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.4512 - regression_loss: 1.2242 - classification_loss: 0.2269 191/500 [==========>...................] - ETA: 1:17 - loss: 1.4490 - regression_loss: 1.2227 - classification_loss: 0.2263 192/500 [==========>...................] - ETA: 1:17 - loss: 1.4473 - regression_loss: 1.2211 - classification_loss: 0.2263 193/500 [==========>...................] - ETA: 1:16 - loss: 1.4457 - regression_loss: 1.2198 - classification_loss: 0.2259 194/500 [==========>...................] - ETA: 1:16 - loss: 1.4416 - regression_loss: 1.2164 - classification_loss: 0.2252 195/500 [==========>...................] - ETA: 1:16 - loss: 1.4380 - regression_loss: 1.2137 - classification_loss: 0.2243 196/500 [==========>...................] - ETA: 1:16 - loss: 1.4338 - regression_loss: 1.2104 - classification_loss: 0.2234 197/500 [==========>...................] - ETA: 1:15 - loss: 1.4306 - regression_loss: 1.2079 - classification_loss: 0.2226 198/500 [==========>...................] - ETA: 1:15 - loss: 1.4308 - regression_loss: 1.2080 - classification_loss: 0.2228 199/500 [==========>...................] - ETA: 1:15 - loss: 1.4309 - regression_loss: 1.2080 - classification_loss: 0.2229 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4308 - regression_loss: 1.2076 - classification_loss: 0.2232 201/500 [===========>..................] - ETA: 1:14 - loss: 1.4293 - regression_loss: 1.2063 - classification_loss: 0.2230 202/500 [===========>..................] - ETA: 1:14 - loss: 1.4304 - regression_loss: 1.2067 - classification_loss: 0.2237 203/500 [===========>..................] - ETA: 1:14 - loss: 1.4336 - regression_loss: 1.2093 - classification_loss: 0.2243 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4343 - regression_loss: 1.2100 - classification_loss: 0.2242 205/500 [===========>..................] - ETA: 1:13 - loss: 1.4338 - regression_loss: 1.2100 - classification_loss: 0.2237 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.4336 - regression_loss: 1.2099 - classification_loss: 0.2237 207/500 [===========>..................] - ETA: 1:13 - loss: 1.4333 - regression_loss: 1.2092 - classification_loss: 0.2241 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4347 - regression_loss: 1.2108 - classification_loss: 0.2239 209/500 [===========>..................] - ETA: 1:12 - loss: 1.4349 - regression_loss: 1.2110 - classification_loss: 0.2239 210/500 [===========>..................] - ETA: 1:12 - loss: 1.4347 - regression_loss: 1.2110 - classification_loss: 0.2237 211/500 [===========>..................] - ETA: 1:12 - loss: 1.4330 - regression_loss: 1.2094 - classification_loss: 0.2237 212/500 [===========>..................] - ETA: 1:12 - loss: 1.4347 - regression_loss: 1.2107 - classification_loss: 0.2240 213/500 [===========>..................] - ETA: 1:11 - loss: 1.4348 - regression_loss: 1.2108 - classification_loss: 0.2239 214/500 [===========>..................] - ETA: 1:11 - loss: 1.4359 - regression_loss: 1.2106 - classification_loss: 0.2253 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4370 - regression_loss: 1.2113 - classification_loss: 0.2257 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4381 - regression_loss: 1.2122 - classification_loss: 0.2259 217/500 [============>.................] - ETA: 1:10 - loss: 1.4393 - regression_loss: 1.2135 - classification_loss: 0.2258 218/500 [============>.................] - ETA: 1:10 - loss: 1.4460 - regression_loss: 1.2186 - classification_loss: 0.2275 219/500 [============>.................] - ETA: 1:10 - loss: 1.4467 - regression_loss: 1.2191 - classification_loss: 0.2276 220/500 [============>.................] - ETA: 1:10 - loss: 1.4484 - regression_loss: 1.2204 - classification_loss: 0.2280 221/500 [============>.................] - ETA: 1:09 - loss: 1.4480 - regression_loss: 1.2200 - classification_loss: 0.2280 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.4457 - regression_loss: 1.2184 - classification_loss: 0.2273 223/500 [============>.................] - ETA: 1:09 - loss: 1.4463 - regression_loss: 1.2188 - classification_loss: 0.2275 224/500 [============>.................] - ETA: 1:09 - loss: 1.4423 - regression_loss: 1.2156 - classification_loss: 0.2266 225/500 [============>.................] - ETA: 1:08 - loss: 1.4422 - regression_loss: 1.2154 - classification_loss: 0.2267 226/500 [============>.................] - ETA: 1:08 - loss: 1.4413 - regression_loss: 1.2146 - classification_loss: 0.2267 227/500 [============>.................] - ETA: 1:08 - loss: 1.4386 - regression_loss: 1.2125 - classification_loss: 0.2261 228/500 [============>.................] - ETA: 1:08 - loss: 1.4385 - regression_loss: 1.2126 - classification_loss: 0.2259 229/500 [============>.................] - ETA: 1:07 - loss: 1.4416 - regression_loss: 1.2148 - classification_loss: 0.2268 230/500 [============>.................] - ETA: 1:07 - loss: 1.4428 - regression_loss: 1.2157 - classification_loss: 0.2271 231/500 [============>.................] - ETA: 1:07 - loss: 1.4438 - regression_loss: 1.2168 - classification_loss: 0.2269 232/500 [============>.................] - ETA: 1:07 - loss: 1.4435 - regression_loss: 1.2166 - classification_loss: 0.2269 233/500 [============>.................] - ETA: 1:06 - loss: 1.4429 - regression_loss: 1.2163 - classification_loss: 0.2267 234/500 [=============>................] - ETA: 1:06 - loss: 1.4412 - regression_loss: 1.2150 - classification_loss: 0.2262 235/500 [=============>................] - ETA: 1:06 - loss: 1.4395 - regression_loss: 1.2134 - classification_loss: 0.2261 236/500 [=============>................] - ETA: 1:06 - loss: 1.4391 - regression_loss: 1.2131 - classification_loss: 0.2260 237/500 [=============>................] - ETA: 1:05 - loss: 1.4391 - regression_loss: 1.2130 - classification_loss: 0.2261 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.4428 - regression_loss: 1.2163 - classification_loss: 0.2265 239/500 [=============>................] - ETA: 1:05 - loss: 1.4435 - regression_loss: 1.2169 - classification_loss: 0.2266 240/500 [=============>................] - ETA: 1:05 - loss: 1.4442 - regression_loss: 1.2176 - classification_loss: 0.2265 241/500 [=============>................] - ETA: 1:04 - loss: 1.4444 - regression_loss: 1.2180 - classification_loss: 0.2263 242/500 [=============>................] - ETA: 1:04 - loss: 1.4447 - regression_loss: 1.2184 - classification_loss: 0.2263 243/500 [=============>................] - ETA: 1:04 - loss: 1.4451 - regression_loss: 1.2188 - classification_loss: 0.2263 244/500 [=============>................] - ETA: 1:04 - loss: 1.4463 - regression_loss: 1.2198 - classification_loss: 0.2265 245/500 [=============>................] - ETA: 1:03 - loss: 1.4432 - regression_loss: 1.2172 - classification_loss: 0.2260 246/500 [=============>................] - ETA: 1:03 - loss: 1.4415 - regression_loss: 1.2157 - classification_loss: 0.2258 247/500 [=============>................] - ETA: 1:03 - loss: 1.4379 - regression_loss: 1.2127 - classification_loss: 0.2252 248/500 [=============>................] - ETA: 1:03 - loss: 1.4401 - regression_loss: 1.2140 - classification_loss: 0.2262 249/500 [=============>................] - ETA: 1:03 - loss: 1.4406 - regression_loss: 1.2143 - classification_loss: 0.2263 250/500 [==============>...............] - ETA: 1:02 - loss: 1.4428 - regression_loss: 1.2160 - classification_loss: 0.2269 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4413 - regression_loss: 1.2148 - classification_loss: 0.2265 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4438 - regression_loss: 1.2169 - classification_loss: 0.2269 253/500 [==============>...............] - ETA: 1:01 - loss: 1.4452 - regression_loss: 1.2172 - classification_loss: 0.2279 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.4467 - regression_loss: 1.2184 - classification_loss: 0.2284 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4477 - regression_loss: 1.2192 - classification_loss: 0.2285 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4451 - regression_loss: 1.2172 - classification_loss: 0.2279 257/500 [==============>...............] - ETA: 1:00 - loss: 1.4461 - regression_loss: 1.2178 - classification_loss: 0.2283 258/500 [==============>...............] - ETA: 1:00 - loss: 1.4469 - regression_loss: 1.2184 - classification_loss: 0.2285 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4475 - regression_loss: 1.2189 - classification_loss: 0.2285 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4457 - regression_loss: 1.2177 - classification_loss: 0.2280 261/500 [==============>...............] - ETA: 59s - loss: 1.4463 - regression_loss: 1.2183 - classification_loss: 0.2280  262/500 [==============>...............] - ETA: 59s - loss: 1.4468 - regression_loss: 1.2185 - classification_loss: 0.2283 263/500 [==============>...............] - ETA: 59s - loss: 1.4460 - regression_loss: 1.2178 - classification_loss: 0.2282 264/500 [==============>...............] - ETA: 59s - loss: 1.4459 - regression_loss: 1.2179 - classification_loss: 0.2280 265/500 [==============>...............] - ETA: 58s - loss: 1.4470 - regression_loss: 1.2189 - classification_loss: 0.2281 266/500 [==============>...............] - ETA: 58s - loss: 1.4484 - regression_loss: 1.2203 - classification_loss: 0.2281 267/500 [===============>..............] - ETA: 58s - loss: 1.4464 - regression_loss: 1.2187 - classification_loss: 0.2277 268/500 [===============>..............] - ETA: 58s - loss: 1.4463 - regression_loss: 1.2184 - classification_loss: 0.2279 269/500 [===============>..............] - ETA: 57s - loss: 1.4505 - regression_loss: 1.2218 - classification_loss: 0.2287 270/500 [===============>..............] 
- ETA: 57s - loss: 1.4491 - regression_loss: 1.2205 - classification_loss: 0.2286 271/500 [===============>..............] - ETA: 57s - loss: 1.4500 - regression_loss: 1.2213 - classification_loss: 0.2287 272/500 [===============>..............] - ETA: 57s - loss: 1.4518 - regression_loss: 1.2228 - classification_loss: 0.2290 273/500 [===============>..............] - ETA: 56s - loss: 1.4516 - regression_loss: 1.2224 - classification_loss: 0.2291 274/500 [===============>..............] - ETA: 56s - loss: 1.4513 - regression_loss: 1.2222 - classification_loss: 0.2291 275/500 [===============>..............] - ETA: 56s - loss: 1.4528 - regression_loss: 1.2235 - classification_loss: 0.2293 276/500 [===============>..............] - ETA: 56s - loss: 1.4538 - regression_loss: 1.2238 - classification_loss: 0.2300 277/500 [===============>..............] - ETA: 55s - loss: 1.4518 - regression_loss: 1.2222 - classification_loss: 0.2296 278/500 [===============>..............] - ETA: 55s - loss: 1.4527 - regression_loss: 1.2229 - classification_loss: 0.2298 279/500 [===============>..............] - ETA: 55s - loss: 1.4526 - regression_loss: 1.2226 - classification_loss: 0.2300 280/500 [===============>..............] - ETA: 55s - loss: 1.4521 - regression_loss: 1.2222 - classification_loss: 0.2298 281/500 [===============>..............] - ETA: 54s - loss: 1.4534 - regression_loss: 1.2234 - classification_loss: 0.2300 282/500 [===============>..............] - ETA: 54s - loss: 1.4544 - regression_loss: 1.2243 - classification_loss: 0.2301 283/500 [===============>..............] - ETA: 54s - loss: 1.4546 - regression_loss: 1.2245 - classification_loss: 0.2301 284/500 [================>.............] - ETA: 54s - loss: 1.4518 - regression_loss: 1.2219 - classification_loss: 0.2299 285/500 [================>.............] - ETA: 53s - loss: 1.4511 - regression_loss: 1.2214 - classification_loss: 0.2297 286/500 [================>.............] 
- ETA: 53s - loss: 1.4513 - regression_loss: 1.2216 - classification_loss: 0.2298 287/500 [================>.............] - ETA: 53s - loss: 1.4492 - regression_loss: 1.2197 - classification_loss: 0.2294 288/500 [================>.............] - ETA: 53s - loss: 1.4485 - regression_loss: 1.2191 - classification_loss: 0.2294 289/500 [================>.............] - ETA: 52s - loss: 1.4477 - regression_loss: 1.2184 - classification_loss: 0.2292 290/500 [================>.............] - ETA: 52s - loss: 1.4476 - regression_loss: 1.2185 - classification_loss: 0.2291 291/500 [================>.............] - ETA: 52s - loss: 1.4471 - regression_loss: 1.2181 - classification_loss: 0.2290 292/500 [================>.............] - ETA: 52s - loss: 1.4464 - regression_loss: 1.2175 - classification_loss: 0.2289 293/500 [================>.............] - ETA: 51s - loss: 1.4476 - regression_loss: 1.2186 - classification_loss: 0.2290 294/500 [================>.............] - ETA: 51s - loss: 1.4473 - regression_loss: 1.2183 - classification_loss: 0.2290 295/500 [================>.............] - ETA: 51s - loss: 1.4470 - regression_loss: 1.2182 - classification_loss: 0.2288 296/500 [================>.............] - ETA: 50s - loss: 1.4481 - regression_loss: 1.2191 - classification_loss: 0.2290 297/500 [================>.............] - ETA: 50s - loss: 1.4484 - regression_loss: 1.2192 - classification_loss: 0.2292 298/500 [================>.............] - ETA: 50s - loss: 1.4500 - regression_loss: 1.2202 - classification_loss: 0.2297 299/500 [================>.............] - ETA: 50s - loss: 1.4531 - regression_loss: 1.2227 - classification_loss: 0.2304 300/500 [=================>............] - ETA: 49s - loss: 1.4514 - regression_loss: 1.2211 - classification_loss: 0.2303 301/500 [=================>............] - ETA: 49s - loss: 1.4513 - regression_loss: 1.2211 - classification_loss: 0.2302 302/500 [=================>............] 
[per-batch progress-bar updates for Epoch 106/150, batches 302-494, elided; loss held near 1.45 (regression_loss ~1.22, classification_loss ~0.23) throughout]
494/500 [============================>.] - ETA: 1s - loss: 1.4518 - regression_loss: 1.2191 - classification_loss: 0.2327
495/500 [============================>.] - ETA: 1s - loss: 1.4509 - regression_loss: 1.2184 - classification_loss: 0.2325
496/500 [============================>.] - ETA: 0s - loss: 1.4517 - regression_loss: 1.2191 - classification_loss: 0.2326
497/500 [============================>.] - ETA: 0s - loss: 1.4511 - regression_loss: 1.2185 - classification_loss: 0.2326
498/500 [============================>.] - ETA: 0s - loss: 1.4501 - regression_loss: 1.2179 - classification_loss: 0.2323
499/500 [============================>.] - ETA: 0s - loss: 1.4505 - regression_loss: 1.2181 - classification_loss: 0.2324
500/500 [==============================] - 125s 249ms/step - loss: 1.4512 - regression_loss: 1.2187 - classification_loss: 0.2324
1172 instances of class plum with average precision: 0.6863
mAP: 0.6863
Epoch 00106: saving model to ./training/snapshots/resnet50_pascal_106.h5
Epoch 107/150
1/500 [..............................] - ETA: 2:02 - loss: 1.9558 - regression_loss: 1.4761 - classification_loss: 0.4797
2/500 [..............................] - ETA: 2:04 - loss: 1.6817 - regression_loss: 1.3465 - classification_loss: 0.3352
3/500 [..............................] - ETA: 2:04 - loss: 1.7478 - regression_loss: 1.4254 - classification_loss: 0.3224
4/500 [..............................] - ETA: 2:06 - loss: 1.7356 - regression_loss: 1.3882 - classification_loss: 0.3474
5/500 [..............................] - ETA: 2:07 - loss: 1.6911 - regression_loss: 1.3761 - classification_loss: 0.3151
6/500 [..............................] - ETA: 2:06 - loss: 1.5845 - regression_loss: 1.2996 - classification_loss: 0.2848
7/500 [..............................] - ETA: 2:06 - loss: 1.5711 - regression_loss: 1.2924 - classification_loss: 0.2787
8/500 [..............................] - ETA: 2:06 - loss: 1.6162 - regression_loss: 1.3203 - classification_loss: 0.2960
9/500 [..............................]
[per-batch progress-bar updates for Epoch 107/150, batches 9-137, elided; loss settled from ~1.61 to ~1.46 (regression_loss ~1.23, classification_loss ~0.23)]
- ETA: 1:31 - loss: 1.4639 - regression_loss: 1.2295 - classification_loss: 0.2344 138/500 [=======>......................] - ETA: 1:31 - loss: 1.4639 - regression_loss: 1.2297 - classification_loss: 0.2342 139/500 [=======>......................] - ETA: 1:31 - loss: 1.4648 - regression_loss: 1.2297 - classification_loss: 0.2351 140/500 [=======>......................] - ETA: 1:31 - loss: 1.4636 - regression_loss: 1.2278 - classification_loss: 0.2357 141/500 [=======>......................] - ETA: 1:30 - loss: 1.4628 - regression_loss: 1.2275 - classification_loss: 0.2353 142/500 [=======>......................] - ETA: 1:30 - loss: 1.4634 - regression_loss: 1.2277 - classification_loss: 0.2357 143/500 [=======>......................] - ETA: 1:30 - loss: 1.4668 - regression_loss: 1.2306 - classification_loss: 0.2362 144/500 [=======>......................] - ETA: 1:30 - loss: 1.4644 - regression_loss: 1.2280 - classification_loss: 0.2365 145/500 [=======>......................] - ETA: 1:29 - loss: 1.4663 - regression_loss: 1.2294 - classification_loss: 0.2370 146/500 [=======>......................] - ETA: 1:29 - loss: 1.4668 - regression_loss: 1.2299 - classification_loss: 0.2369 147/500 [=======>......................] - ETA: 1:29 - loss: 1.4699 - regression_loss: 1.2325 - classification_loss: 0.2374 148/500 [=======>......................] - ETA: 1:29 - loss: 1.4744 - regression_loss: 1.2356 - classification_loss: 0.2388 149/500 [=======>......................] - ETA: 1:28 - loss: 1.4771 - regression_loss: 1.2377 - classification_loss: 0.2394 150/500 [========>.....................] - ETA: 1:28 - loss: 1.4757 - regression_loss: 1.2365 - classification_loss: 0.2392 151/500 [========>.....................] - ETA: 1:28 - loss: 1.4780 - regression_loss: 1.2387 - classification_loss: 0.2393 152/500 [========>.....................] - ETA: 1:28 - loss: 1.4764 - regression_loss: 1.2376 - classification_loss: 0.2388 153/500 [========>.....................] 
- ETA: 1:27 - loss: 1.4736 - regression_loss: 1.2351 - classification_loss: 0.2385 154/500 [========>.....................] - ETA: 1:27 - loss: 1.4740 - regression_loss: 1.2356 - classification_loss: 0.2384 155/500 [========>.....................] - ETA: 1:27 - loss: 1.4724 - regression_loss: 1.2341 - classification_loss: 0.2383 156/500 [========>.....................] - ETA: 1:26 - loss: 1.4674 - regression_loss: 1.2304 - classification_loss: 0.2370 157/500 [========>.....................] - ETA: 1:26 - loss: 1.4711 - regression_loss: 1.2332 - classification_loss: 0.2379 158/500 [========>.....................] - ETA: 1:26 - loss: 1.4727 - regression_loss: 1.2346 - classification_loss: 0.2381 159/500 [========>.....................] - ETA: 1:26 - loss: 1.4734 - regression_loss: 1.2353 - classification_loss: 0.2382 160/500 [========>.....................] - ETA: 1:25 - loss: 1.4720 - regression_loss: 1.2342 - classification_loss: 0.2378 161/500 [========>.....................] - ETA: 1:25 - loss: 1.4716 - regression_loss: 1.2338 - classification_loss: 0.2378 162/500 [========>.....................] - ETA: 1:25 - loss: 1.4751 - regression_loss: 1.2363 - classification_loss: 0.2388 163/500 [========>.....................] - ETA: 1:24 - loss: 1.4764 - regression_loss: 1.2375 - classification_loss: 0.2389 164/500 [========>.....................] - ETA: 1:24 - loss: 1.4696 - regression_loss: 1.2318 - classification_loss: 0.2378 165/500 [========>.....................] - ETA: 1:24 - loss: 1.4652 - regression_loss: 1.2273 - classification_loss: 0.2379 166/500 [========>.....................] - ETA: 1:24 - loss: 1.4627 - regression_loss: 1.2254 - classification_loss: 0.2372 167/500 [=========>....................] - ETA: 1:23 - loss: 1.4629 - regression_loss: 1.2261 - classification_loss: 0.2368 168/500 [=========>....................] - ETA: 1:23 - loss: 1.4579 - regression_loss: 1.2221 - classification_loss: 0.2357 169/500 [=========>....................] 
- ETA: 1:23 - loss: 1.4541 - regression_loss: 1.2192 - classification_loss: 0.2349 170/500 [=========>....................] - ETA: 1:23 - loss: 1.4521 - regression_loss: 1.2177 - classification_loss: 0.2344 171/500 [=========>....................] - ETA: 1:22 - loss: 1.4542 - regression_loss: 1.2194 - classification_loss: 0.2348 172/500 [=========>....................] - ETA: 1:22 - loss: 1.4552 - regression_loss: 1.2203 - classification_loss: 0.2349 173/500 [=========>....................] - ETA: 1:22 - loss: 1.4561 - regression_loss: 1.2216 - classification_loss: 0.2344 174/500 [=========>....................] - ETA: 1:22 - loss: 1.4573 - regression_loss: 1.2224 - classification_loss: 0.2349 175/500 [=========>....................] - ETA: 1:21 - loss: 1.4576 - regression_loss: 1.2228 - classification_loss: 0.2348 176/500 [=========>....................] - ETA: 1:21 - loss: 1.4576 - regression_loss: 1.2230 - classification_loss: 0.2346 177/500 [=========>....................] - ETA: 1:21 - loss: 1.4578 - regression_loss: 1.2232 - classification_loss: 0.2347 178/500 [=========>....................] - ETA: 1:21 - loss: 1.4589 - regression_loss: 1.2241 - classification_loss: 0.2348 179/500 [=========>....................] - ETA: 1:20 - loss: 1.4584 - regression_loss: 1.2239 - classification_loss: 0.2345 180/500 [=========>....................] - ETA: 1:20 - loss: 1.4569 - regression_loss: 1.2222 - classification_loss: 0.2347 181/500 [=========>....................] - ETA: 1:20 - loss: 1.4539 - regression_loss: 1.2199 - classification_loss: 0.2340 182/500 [=========>....................] - ETA: 1:20 - loss: 1.4563 - regression_loss: 1.2216 - classification_loss: 0.2346 183/500 [=========>....................] - ETA: 1:19 - loss: 1.4559 - regression_loss: 1.2213 - classification_loss: 0.2346 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4581 - regression_loss: 1.2226 - classification_loss: 0.2354 185/500 [==========>...................] 
- ETA: 1:19 - loss: 1.4570 - regression_loss: 1.2218 - classification_loss: 0.2352 186/500 [==========>...................] - ETA: 1:19 - loss: 1.4584 - regression_loss: 1.2227 - classification_loss: 0.2357 187/500 [==========>...................] - ETA: 1:18 - loss: 1.4543 - regression_loss: 1.2193 - classification_loss: 0.2351 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4557 - regression_loss: 1.2204 - classification_loss: 0.2352 189/500 [==========>...................] - ETA: 1:18 - loss: 1.4565 - regression_loss: 1.2213 - classification_loss: 0.2352 190/500 [==========>...................] - ETA: 1:18 - loss: 1.4544 - regression_loss: 1.2194 - classification_loss: 0.2350 191/500 [==========>...................] - ETA: 1:17 - loss: 1.4569 - regression_loss: 1.2216 - classification_loss: 0.2354 192/500 [==========>...................] - ETA: 1:17 - loss: 1.4575 - regression_loss: 1.2221 - classification_loss: 0.2354 193/500 [==========>...................] - ETA: 1:17 - loss: 1.4563 - regression_loss: 1.2208 - classification_loss: 0.2354 194/500 [==========>...................] - ETA: 1:17 - loss: 1.4606 - regression_loss: 1.2237 - classification_loss: 0.2368 195/500 [==========>...................] - ETA: 1:16 - loss: 1.4632 - regression_loss: 1.2262 - classification_loss: 0.2371 196/500 [==========>...................] - ETA: 1:16 - loss: 1.4651 - regression_loss: 1.2278 - classification_loss: 0.2373 197/500 [==========>...................] - ETA: 1:16 - loss: 1.4651 - regression_loss: 1.2278 - classification_loss: 0.2373 198/500 [==========>...................] - ETA: 1:15 - loss: 1.4690 - regression_loss: 1.2311 - classification_loss: 0.2379 199/500 [==========>...................] - ETA: 1:15 - loss: 1.4667 - regression_loss: 1.2291 - classification_loss: 0.2376 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4669 - regression_loss: 1.2294 - classification_loss: 0.2376 201/500 [===========>..................] 
- ETA: 1:15 - loss: 1.4685 - regression_loss: 1.2306 - classification_loss: 0.2379 202/500 [===========>..................] - ETA: 1:14 - loss: 1.4693 - regression_loss: 1.2308 - classification_loss: 0.2385 203/500 [===========>..................] - ETA: 1:14 - loss: 1.4730 - regression_loss: 1.2336 - classification_loss: 0.2394 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4691 - regression_loss: 1.2303 - classification_loss: 0.2388 205/500 [===========>..................] - ETA: 1:14 - loss: 1.4693 - regression_loss: 1.2306 - classification_loss: 0.2387 206/500 [===========>..................] - ETA: 1:13 - loss: 1.4676 - regression_loss: 1.2291 - classification_loss: 0.2385 207/500 [===========>..................] - ETA: 1:13 - loss: 1.4630 - regression_loss: 1.2256 - classification_loss: 0.2374 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4646 - regression_loss: 1.2269 - classification_loss: 0.2377 209/500 [===========>..................] - ETA: 1:13 - loss: 1.4658 - regression_loss: 1.2280 - classification_loss: 0.2378 210/500 [===========>..................] - ETA: 1:12 - loss: 1.4650 - regression_loss: 1.2274 - classification_loss: 0.2375 211/500 [===========>..................] - ETA: 1:12 - loss: 1.4619 - regression_loss: 1.2248 - classification_loss: 0.2370 212/500 [===========>..................] - ETA: 1:12 - loss: 1.4636 - regression_loss: 1.2264 - classification_loss: 0.2372 213/500 [===========>..................] - ETA: 1:12 - loss: 1.4644 - regression_loss: 1.2268 - classification_loss: 0.2376 214/500 [===========>..................] - ETA: 1:11 - loss: 1.4658 - regression_loss: 1.2277 - classification_loss: 0.2381 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4643 - regression_loss: 1.2267 - classification_loss: 0.2376 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4667 - regression_loss: 1.2287 - classification_loss: 0.2380 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.4646 - regression_loss: 1.2269 - classification_loss: 0.2377 218/500 [============>.................] - ETA: 1:10 - loss: 1.4609 - regression_loss: 1.2240 - classification_loss: 0.2369 219/500 [============>.................] - ETA: 1:10 - loss: 1.4626 - regression_loss: 1.2256 - classification_loss: 0.2370 220/500 [============>.................] - ETA: 1:10 - loss: 1.4655 - regression_loss: 1.2279 - classification_loss: 0.2377 221/500 [============>.................] - ETA: 1:10 - loss: 1.4664 - regression_loss: 1.2283 - classification_loss: 0.2380 222/500 [============>.................] - ETA: 1:09 - loss: 1.4670 - regression_loss: 1.2290 - classification_loss: 0.2380 223/500 [============>.................] - ETA: 1:09 - loss: 1.4677 - regression_loss: 1.2296 - classification_loss: 0.2381 224/500 [============>.................] - ETA: 1:09 - loss: 1.4689 - regression_loss: 1.2306 - classification_loss: 0.2383 225/500 [============>.................] - ETA: 1:09 - loss: 1.4682 - regression_loss: 1.2302 - classification_loss: 0.2380 226/500 [============>.................] - ETA: 1:08 - loss: 1.4691 - regression_loss: 1.2312 - classification_loss: 0.2380 227/500 [============>.................] - ETA: 1:08 - loss: 1.4678 - regression_loss: 1.2299 - classification_loss: 0.2379 228/500 [============>.................] - ETA: 1:08 - loss: 1.4672 - regression_loss: 1.2291 - classification_loss: 0.2381 229/500 [============>.................] - ETA: 1:08 - loss: 1.4694 - regression_loss: 1.2311 - classification_loss: 0.2383 230/500 [============>.................] - ETA: 1:07 - loss: 1.4704 - regression_loss: 1.2319 - classification_loss: 0.2385 231/500 [============>.................] - ETA: 1:07 - loss: 1.4721 - regression_loss: 1.2332 - classification_loss: 0.2389 232/500 [============>.................] - ETA: 1:07 - loss: 1.4727 - regression_loss: 1.2340 - classification_loss: 0.2388 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.4748 - regression_loss: 1.2357 - classification_loss: 0.2392 234/500 [=============>................] - ETA: 1:06 - loss: 1.4760 - regression_loss: 1.2366 - classification_loss: 0.2394 235/500 [=============>................] - ETA: 1:06 - loss: 1.4740 - regression_loss: 1.2351 - classification_loss: 0.2389 236/500 [=============>................] - ETA: 1:06 - loss: 1.4753 - regression_loss: 1.2360 - classification_loss: 0.2393 237/500 [=============>................] - ETA: 1:06 - loss: 1.4756 - regression_loss: 1.2362 - classification_loss: 0.2394 238/500 [=============>................] - ETA: 1:05 - loss: 1.4746 - regression_loss: 1.2354 - classification_loss: 0.2392 239/500 [=============>................] - ETA: 1:05 - loss: 1.4709 - regression_loss: 1.2325 - classification_loss: 0.2384 240/500 [=============>................] - ETA: 1:05 - loss: 1.4706 - regression_loss: 1.2321 - classification_loss: 0.2385 241/500 [=============>................] - ETA: 1:05 - loss: 1.4687 - regression_loss: 1.2308 - classification_loss: 0.2378 242/500 [=============>................] - ETA: 1:04 - loss: 1.4686 - regression_loss: 1.2309 - classification_loss: 0.2378 243/500 [=============>................] - ETA: 1:04 - loss: 1.4686 - regression_loss: 1.2308 - classification_loss: 0.2377 244/500 [=============>................] - ETA: 1:04 - loss: 1.4707 - regression_loss: 1.2323 - classification_loss: 0.2384 245/500 [=============>................] - ETA: 1:04 - loss: 1.4692 - regression_loss: 1.2311 - classification_loss: 0.2381 246/500 [=============>................] - ETA: 1:03 - loss: 1.4699 - regression_loss: 1.2315 - classification_loss: 0.2384 247/500 [=============>................] - ETA: 1:03 - loss: 1.4708 - regression_loss: 1.2323 - classification_loss: 0.2385 248/500 [=============>................] - ETA: 1:03 - loss: 1.4728 - regression_loss: 1.2339 - classification_loss: 0.2389 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.4709 - regression_loss: 1.2318 - classification_loss: 0.2391 250/500 [==============>...............] - ETA: 1:02 - loss: 1.4672 - regression_loss: 1.2286 - classification_loss: 0.2386 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4660 - regression_loss: 1.2276 - classification_loss: 0.2384 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4663 - regression_loss: 1.2280 - classification_loss: 0.2383 253/500 [==============>...............] - ETA: 1:02 - loss: 1.4682 - regression_loss: 1.2300 - classification_loss: 0.2382 254/500 [==============>...............] - ETA: 1:01 - loss: 1.4682 - regression_loss: 1.2300 - classification_loss: 0.2383 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4687 - regression_loss: 1.2305 - classification_loss: 0.2382 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4677 - regression_loss: 1.2298 - classification_loss: 0.2378 257/500 [==============>...............] - ETA: 1:01 - loss: 1.4649 - regression_loss: 1.2277 - classification_loss: 0.2373 258/500 [==============>...............] - ETA: 1:00 - loss: 1.4645 - regression_loss: 1.2274 - classification_loss: 0.2370 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4675 - regression_loss: 1.2300 - classification_loss: 0.2376 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4673 - regression_loss: 1.2299 - classification_loss: 0.2374 261/500 [==============>...............] - ETA: 1:00 - loss: 1.4651 - regression_loss: 1.2281 - classification_loss: 0.2370 262/500 [==============>...............] - ETA: 59s - loss: 1.4659 - regression_loss: 1.2284 - classification_loss: 0.2375  263/500 [==============>...............] - ETA: 59s - loss: 1.4651 - regression_loss: 1.2278 - classification_loss: 0.2373 264/500 [==============>...............] - ETA: 59s - loss: 1.4655 - regression_loss: 1.2281 - classification_loss: 0.2374 265/500 [==============>...............] 
- ETA: 59s - loss: 1.4631 - regression_loss: 1.2261 - classification_loss: 0.2369 266/500 [==============>...............] - ETA: 58s - loss: 1.4625 - regression_loss: 1.2256 - classification_loss: 0.2369 267/500 [===============>..............] - ETA: 58s - loss: 1.4656 - regression_loss: 1.2277 - classification_loss: 0.2379 268/500 [===============>..............] - ETA: 58s - loss: 1.4639 - regression_loss: 1.2259 - classification_loss: 0.2379 269/500 [===============>..............] - ETA: 58s - loss: 1.4640 - regression_loss: 1.2261 - classification_loss: 0.2379 270/500 [===============>..............] - ETA: 57s - loss: 1.4624 - regression_loss: 1.2249 - classification_loss: 0.2375 271/500 [===============>..............] - ETA: 57s - loss: 1.4606 - regression_loss: 1.2235 - classification_loss: 0.2372 272/500 [===============>..............] - ETA: 57s - loss: 1.4611 - regression_loss: 1.2238 - classification_loss: 0.2373 273/500 [===============>..............] - ETA: 57s - loss: 1.4621 - regression_loss: 1.2244 - classification_loss: 0.2378 274/500 [===============>..............] - ETA: 56s - loss: 1.4626 - regression_loss: 1.2247 - classification_loss: 0.2379 275/500 [===============>..............] - ETA: 56s - loss: 1.4622 - regression_loss: 1.2244 - classification_loss: 0.2378 276/500 [===============>..............] - ETA: 56s - loss: 1.4621 - regression_loss: 1.2245 - classification_loss: 0.2376 277/500 [===============>..............] - ETA: 56s - loss: 1.4587 - regression_loss: 1.2218 - classification_loss: 0.2369 278/500 [===============>..............] - ETA: 55s - loss: 1.4594 - regression_loss: 1.2228 - classification_loss: 0.2365 279/500 [===============>..............] - ETA: 55s - loss: 1.4601 - regression_loss: 1.2232 - classification_loss: 0.2369 280/500 [===============>..............] - ETA: 55s - loss: 1.4611 - regression_loss: 1.2241 - classification_loss: 0.2370 281/500 [===============>..............] 
- ETA: 55s - loss: 1.4620 - regression_loss: 1.2245 - classification_loss: 0.2376 282/500 [===============>..............] - ETA: 54s - loss: 1.4622 - regression_loss: 1.2247 - classification_loss: 0.2375 283/500 [===============>..............] - ETA: 54s - loss: 1.4630 - regression_loss: 1.2252 - classification_loss: 0.2378 284/500 [================>.............] - ETA: 54s - loss: 1.4635 - regression_loss: 1.2256 - classification_loss: 0.2379 285/500 [================>.............] - ETA: 54s - loss: 1.4675 - regression_loss: 1.2289 - classification_loss: 0.2386 286/500 [================>.............] - ETA: 53s - loss: 1.4662 - regression_loss: 1.2279 - classification_loss: 0.2383 287/500 [================>.............] - ETA: 53s - loss: 1.4658 - regression_loss: 1.2278 - classification_loss: 0.2380 288/500 [================>.............] - ETA: 53s - loss: 1.4652 - regression_loss: 1.2274 - classification_loss: 0.2378 289/500 [================>.............] - ETA: 53s - loss: 1.4655 - regression_loss: 1.2276 - classification_loss: 0.2378 290/500 [================>.............] - ETA: 52s - loss: 1.4650 - regression_loss: 1.2272 - classification_loss: 0.2378 291/500 [================>.............] - ETA: 52s - loss: 1.4645 - regression_loss: 1.2270 - classification_loss: 0.2375 292/500 [================>.............] - ETA: 52s - loss: 1.4638 - regression_loss: 1.2265 - classification_loss: 0.2373 293/500 [================>.............] - ETA: 52s - loss: 1.4632 - regression_loss: 1.2259 - classification_loss: 0.2373 294/500 [================>.............] - ETA: 51s - loss: 1.4621 - regression_loss: 1.2251 - classification_loss: 0.2370 295/500 [================>.............] - ETA: 51s - loss: 1.4617 - regression_loss: 1.2248 - classification_loss: 0.2368 296/500 [================>.............] - ETA: 51s - loss: 1.4624 - regression_loss: 1.2256 - classification_loss: 0.2368 297/500 [================>.............] 
- ETA: 51s - loss: 1.4613 - regression_loss: 1.2246 - classification_loss: 0.2367 298/500 [================>.............] - ETA: 50s - loss: 1.4627 - regression_loss: 1.2258 - classification_loss: 0.2368 299/500 [================>.............] - ETA: 50s - loss: 1.4636 - regression_loss: 1.2264 - classification_loss: 0.2372 300/500 [=================>............] - ETA: 50s - loss: 1.4623 - regression_loss: 1.2255 - classification_loss: 0.2368 301/500 [=================>............] - ETA: 50s - loss: 1.4609 - regression_loss: 1.2246 - classification_loss: 0.2363 302/500 [=================>............] - ETA: 49s - loss: 1.4621 - regression_loss: 1.2256 - classification_loss: 0.2364 303/500 [=================>............] - ETA: 49s - loss: 1.4625 - regression_loss: 1.2262 - classification_loss: 0.2364 304/500 [=================>............] - ETA: 49s - loss: 1.4625 - regression_loss: 1.2261 - classification_loss: 0.2364 305/500 [=================>............] - ETA: 49s - loss: 1.4623 - regression_loss: 1.2260 - classification_loss: 0.2363 306/500 [=================>............] - ETA: 48s - loss: 1.4613 - regression_loss: 1.2252 - classification_loss: 0.2361 307/500 [=================>............] - ETA: 48s - loss: 1.4595 - regression_loss: 1.2239 - classification_loss: 0.2356 308/500 [=================>............] - ETA: 48s - loss: 1.4600 - regression_loss: 1.2245 - classification_loss: 0.2355 309/500 [=================>............] - ETA: 48s - loss: 1.4590 - regression_loss: 1.2237 - classification_loss: 0.2353 310/500 [=================>............] - ETA: 47s - loss: 1.4595 - regression_loss: 1.2242 - classification_loss: 0.2353 311/500 [=================>............] - ETA: 47s - loss: 1.4586 - regression_loss: 1.2234 - classification_loss: 0.2352 312/500 [=================>............] - ETA: 47s - loss: 1.4586 - regression_loss: 1.2233 - classification_loss: 0.2353 313/500 [=================>............] 
- ETA: 47s - loss: 1.4582 - regression_loss: 1.2230 - classification_loss: 0.2353 314/500 [=================>............] - ETA: 46s - loss: 1.4585 - regression_loss: 1.2232 - classification_loss: 0.2353 315/500 [=================>............] - ETA: 46s - loss: 1.4584 - regression_loss: 1.2231 - classification_loss: 0.2352 316/500 [=================>............] - ETA: 46s - loss: 1.4583 - regression_loss: 1.2231 - classification_loss: 0.2352 317/500 [==================>...........] - ETA: 46s - loss: 1.4576 - regression_loss: 1.2226 - classification_loss: 0.2350 318/500 [==================>...........] - ETA: 45s - loss: 1.4570 - regression_loss: 1.2222 - classification_loss: 0.2347 319/500 [==================>...........] - ETA: 45s - loss: 1.4579 - regression_loss: 1.2231 - classification_loss: 0.2348 320/500 [==================>...........] - ETA: 45s - loss: 1.4569 - regression_loss: 1.2223 - classification_loss: 0.2347 321/500 [==================>...........] - ETA: 45s - loss: 1.4556 - regression_loss: 1.2210 - classification_loss: 0.2346 322/500 [==================>...........] - ETA: 44s - loss: 1.4544 - regression_loss: 1.2200 - classification_loss: 0.2344 323/500 [==================>...........] - ETA: 44s - loss: 1.4523 - regression_loss: 1.2184 - classification_loss: 0.2339 324/500 [==================>...........] - ETA: 44s - loss: 1.4530 - regression_loss: 1.2192 - classification_loss: 0.2338 325/500 [==================>...........] - ETA: 44s - loss: 1.4533 - regression_loss: 1.2195 - classification_loss: 0.2339 326/500 [==================>...........] - ETA: 43s - loss: 1.4510 - regression_loss: 1.2176 - classification_loss: 0.2334 327/500 [==================>...........] - ETA: 43s - loss: 1.4525 - regression_loss: 1.2188 - classification_loss: 0.2338 328/500 [==================>...........] - ETA: 43s - loss: 1.4556 - regression_loss: 1.2212 - classification_loss: 0.2345 329/500 [==================>...........] 
- ETA: 43s - loss: 1.4540 - regression_loss: 1.2198 - classification_loss: 0.2342 330/500 [==================>...........] - ETA: 42s - loss: 1.4539 - regression_loss: 1.2197 - classification_loss: 0.2342 331/500 [==================>...........] - ETA: 42s - loss: 1.4541 - regression_loss: 1.2198 - classification_loss: 0.2343 332/500 [==================>...........] - ETA: 42s - loss: 1.4550 - regression_loss: 1.2206 - classification_loss: 0.2344 333/500 [==================>...........] - ETA: 41s - loss: 1.4530 - regression_loss: 1.2187 - classification_loss: 0.2342 334/500 [===================>..........] - ETA: 41s - loss: 1.4532 - regression_loss: 1.2190 - classification_loss: 0.2342 335/500 [===================>..........] - ETA: 41s - loss: 1.4528 - regression_loss: 1.2187 - classification_loss: 0.2342 336/500 [===================>..........] - ETA: 41s - loss: 1.4545 - regression_loss: 1.2200 - classification_loss: 0.2345 337/500 [===================>..........] - ETA: 40s - loss: 1.4538 - regression_loss: 1.2195 - classification_loss: 0.2343 338/500 [===================>..........] - ETA: 40s - loss: 1.4542 - regression_loss: 1.2199 - classification_loss: 0.2344 339/500 [===================>..........] - ETA: 40s - loss: 1.4565 - regression_loss: 1.2217 - classification_loss: 0.2349 340/500 [===================>..........] - ETA: 40s - loss: 1.4549 - regression_loss: 1.2201 - classification_loss: 0.2348 341/500 [===================>..........] - ETA: 39s - loss: 1.4543 - regression_loss: 1.2197 - classification_loss: 0.2346 342/500 [===================>..........] - ETA: 39s - loss: 1.4543 - regression_loss: 1.2197 - classification_loss: 0.2346 343/500 [===================>..........] - ETA: 39s - loss: 1.4543 - regression_loss: 1.2198 - classification_loss: 0.2345 344/500 [===================>..........] - ETA: 39s - loss: 1.4549 - regression_loss: 1.2203 - classification_loss: 0.2345 345/500 [===================>..........] 
[epoch 107: per-batch progress frames for steps 346–489 elided; loss hovering between roughly 1.454 and 1.470] 489/500 [============================>.]
500/500 [==============================] - 126s 251ms/step - loss: 1.4617 - regression_loss: 1.2268 - classification_loss: 0.2349
1172 instances of class plum with average precision: 0.6674
mAP: 0.6674
Epoch 00107: saving model to ./training/snapshots/resnet50_pascal_107.h5
Epoch 108/150
[epoch 108: per-batch progress frames for steps 1–3 elided] 4/500 [..............................]
[epoch 108: per-batch progress frames for steps 4–179 elided; loss rising from about 0.86 in the first few batches and settling around 1.45] 180/500 [=========>....................]
- ETA: 1:21 - loss: 1.4523 - regression_loss: 1.2117 - classification_loss: 0.2405 181/500 [=========>....................] - ETA: 1:20 - loss: 1.4501 - regression_loss: 1.2100 - classification_loss: 0.2401 182/500 [=========>....................] - ETA: 1:20 - loss: 1.4488 - regression_loss: 1.2091 - classification_loss: 0.2397 183/500 [=========>....................] - ETA: 1:20 - loss: 1.4485 - regression_loss: 1.2088 - classification_loss: 0.2397 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4447 - regression_loss: 1.2058 - classification_loss: 0.2389 185/500 [==========>...................] - ETA: 1:19 - loss: 1.4447 - regression_loss: 1.2062 - classification_loss: 0.2386 186/500 [==========>...................] - ETA: 1:19 - loss: 1.4454 - regression_loss: 1.2069 - classification_loss: 0.2384 187/500 [==========>...................] - ETA: 1:19 - loss: 1.4396 - regression_loss: 1.2022 - classification_loss: 0.2374 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4397 - regression_loss: 1.2026 - classification_loss: 0.2370 189/500 [==========>...................] - ETA: 1:18 - loss: 1.4407 - regression_loss: 1.2039 - classification_loss: 0.2368 190/500 [==========>...................] - ETA: 1:18 - loss: 1.4427 - regression_loss: 1.2057 - classification_loss: 0.2370 191/500 [==========>...................] - ETA: 1:18 - loss: 1.4400 - regression_loss: 1.2037 - classification_loss: 0.2362 192/500 [==========>...................] - ETA: 1:17 - loss: 1.4413 - regression_loss: 1.2045 - classification_loss: 0.2368 193/500 [==========>...................] - ETA: 1:17 - loss: 1.4421 - regression_loss: 1.2052 - classification_loss: 0.2370 194/500 [==========>...................] - ETA: 1:17 - loss: 1.4437 - regression_loss: 1.2065 - classification_loss: 0.2372 195/500 [==========>...................] - ETA: 1:17 - loss: 1.4420 - regression_loss: 1.2050 - classification_loss: 0.2370 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.4425 - regression_loss: 1.2057 - classification_loss: 0.2368 197/500 [==========>...................] - ETA: 1:16 - loss: 1.4423 - regression_loss: 1.2058 - classification_loss: 0.2365 198/500 [==========>...................] - ETA: 1:16 - loss: 1.4399 - regression_loss: 1.2042 - classification_loss: 0.2358 199/500 [==========>...................] - ETA: 1:16 - loss: 1.4414 - regression_loss: 1.2052 - classification_loss: 0.2362 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4437 - regression_loss: 1.2073 - classification_loss: 0.2364 201/500 [===========>..................] - ETA: 1:15 - loss: 1.4447 - regression_loss: 1.2081 - classification_loss: 0.2365 202/500 [===========>..................] - ETA: 1:15 - loss: 1.4446 - regression_loss: 1.2074 - classification_loss: 0.2372 203/500 [===========>..................] - ETA: 1:15 - loss: 1.4451 - regression_loss: 1.2080 - classification_loss: 0.2371 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4415 - regression_loss: 1.2052 - classification_loss: 0.2363 205/500 [===========>..................] - ETA: 1:14 - loss: 1.4390 - regression_loss: 1.2025 - classification_loss: 0.2365 206/500 [===========>..................] - ETA: 1:14 - loss: 1.4413 - regression_loss: 1.2044 - classification_loss: 0.2369 207/500 [===========>..................] - ETA: 1:14 - loss: 1.4372 - regression_loss: 1.2011 - classification_loss: 0.2361 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4333 - regression_loss: 1.1980 - classification_loss: 0.2353 209/500 [===========>..................] - ETA: 1:13 - loss: 1.4289 - regression_loss: 1.1942 - classification_loss: 0.2347 210/500 [===========>..................] - ETA: 1:13 - loss: 1.4310 - regression_loss: 1.1960 - classification_loss: 0.2350 211/500 [===========>..................] - ETA: 1:12 - loss: 1.4340 - regression_loss: 1.1983 - classification_loss: 0.2357 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.4332 - regression_loss: 1.1978 - classification_loss: 0.2354 213/500 [===========>..................] - ETA: 1:12 - loss: 1.4322 - regression_loss: 1.1971 - classification_loss: 0.2351 214/500 [===========>..................] - ETA: 1:12 - loss: 1.4327 - regression_loss: 1.1976 - classification_loss: 0.2351 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4371 - regression_loss: 1.2013 - classification_loss: 0.2358 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4385 - regression_loss: 1.2025 - classification_loss: 0.2360 217/500 [============>.................] - ETA: 1:11 - loss: 1.4385 - regression_loss: 1.2028 - classification_loss: 0.2357 218/500 [============>.................] - ETA: 1:11 - loss: 1.4391 - regression_loss: 1.2030 - classification_loss: 0.2360 219/500 [============>.................] - ETA: 1:10 - loss: 1.4389 - regression_loss: 1.2030 - classification_loss: 0.2360 220/500 [============>.................] - ETA: 1:10 - loss: 1.4364 - regression_loss: 1.2010 - classification_loss: 0.2354 221/500 [============>.................] - ETA: 1:10 - loss: 1.4372 - regression_loss: 1.2017 - classification_loss: 0.2355 222/500 [============>.................] - ETA: 1:10 - loss: 1.4380 - regression_loss: 1.2020 - classification_loss: 0.2361 223/500 [============>.................] - ETA: 1:10 - loss: 1.4369 - regression_loss: 1.2013 - classification_loss: 0.2355 224/500 [============>.................] - ETA: 1:09 - loss: 1.4386 - regression_loss: 1.2026 - classification_loss: 0.2359 225/500 [============>.................] - ETA: 1:09 - loss: 1.4389 - regression_loss: 1.2032 - classification_loss: 0.2357 226/500 [============>.................] - ETA: 1:09 - loss: 1.4392 - regression_loss: 1.2036 - classification_loss: 0.2356 227/500 [============>.................] - ETA: 1:08 - loss: 1.4388 - regression_loss: 1.2031 - classification_loss: 0.2357 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.4351 - regression_loss: 1.1999 - classification_loss: 0.2352 229/500 [============>.................] - ETA: 1:08 - loss: 1.4357 - regression_loss: 1.2004 - classification_loss: 0.2353 230/500 [============>.................] - ETA: 1:08 - loss: 1.4371 - regression_loss: 1.2018 - classification_loss: 0.2353 231/500 [============>.................] - ETA: 1:07 - loss: 1.4374 - regression_loss: 1.2020 - classification_loss: 0.2354 232/500 [============>.................] - ETA: 1:07 - loss: 1.4363 - regression_loss: 1.2013 - classification_loss: 0.2349 233/500 [============>.................] - ETA: 1:07 - loss: 1.4354 - regression_loss: 1.2008 - classification_loss: 0.2346 234/500 [=============>................] - ETA: 1:07 - loss: 1.4348 - regression_loss: 1.2005 - classification_loss: 0.2343 235/500 [=============>................] - ETA: 1:06 - loss: 1.4374 - regression_loss: 1.2027 - classification_loss: 0.2347 236/500 [=============>................] - ETA: 1:06 - loss: 1.4347 - regression_loss: 1.2006 - classification_loss: 0.2341 237/500 [=============>................] - ETA: 1:06 - loss: 1.4349 - regression_loss: 1.2010 - classification_loss: 0.2339 238/500 [=============>................] - ETA: 1:06 - loss: 1.4361 - regression_loss: 1.2019 - classification_loss: 0.2342 239/500 [=============>................] - ETA: 1:05 - loss: 1.4373 - regression_loss: 1.2035 - classification_loss: 0.2338 240/500 [=============>................] - ETA: 1:05 - loss: 1.4344 - regression_loss: 1.2010 - classification_loss: 0.2333 241/500 [=============>................] - ETA: 1:05 - loss: 1.4353 - regression_loss: 1.2020 - classification_loss: 0.2333 242/500 [=============>................] - ETA: 1:05 - loss: 1.4351 - regression_loss: 1.2020 - classification_loss: 0.2331 243/500 [=============>................] - ETA: 1:04 - loss: 1.4317 - regression_loss: 1.1992 - classification_loss: 0.2325 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.4288 - regression_loss: 1.1969 - classification_loss: 0.2319 245/500 [=============>................] - ETA: 1:04 - loss: 1.4307 - regression_loss: 1.1982 - classification_loss: 0.2325 246/500 [=============>................] - ETA: 1:04 - loss: 1.4306 - regression_loss: 1.1983 - classification_loss: 0.2323 247/500 [=============>................] - ETA: 1:03 - loss: 1.4309 - regression_loss: 1.1988 - classification_loss: 0.2321 248/500 [=============>................] - ETA: 1:03 - loss: 1.4347 - regression_loss: 1.2019 - classification_loss: 0.2328 249/500 [=============>................] - ETA: 1:03 - loss: 1.4350 - regression_loss: 1.2020 - classification_loss: 0.2330 250/500 [==============>...............] - ETA: 1:03 - loss: 1.4358 - regression_loss: 1.2027 - classification_loss: 0.2331 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4333 - regression_loss: 1.2008 - classification_loss: 0.2325 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4330 - regression_loss: 1.2006 - classification_loss: 0.2324 253/500 [==============>...............] - ETA: 1:02 - loss: 1.4335 - regression_loss: 1.2010 - classification_loss: 0.2326 254/500 [==============>...............] - ETA: 1:02 - loss: 1.4347 - regression_loss: 1.2019 - classification_loss: 0.2327 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4361 - regression_loss: 1.2023 - classification_loss: 0.2338 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4342 - regression_loss: 1.2008 - classification_loss: 0.2334 257/500 [==============>...............] - ETA: 1:01 - loss: 1.4360 - regression_loss: 1.2021 - classification_loss: 0.2338 258/500 [==============>...............] - ETA: 1:01 - loss: 1.4374 - regression_loss: 1.2032 - classification_loss: 0.2342 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4373 - regression_loss: 1.2032 - classification_loss: 0.2341 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.4372 - regression_loss: 1.2031 - classification_loss: 0.2341 261/500 [==============>...............] - ETA: 1:00 - loss: 1.4340 - regression_loss: 1.2004 - classification_loss: 0.2336 262/500 [==============>...............] - ETA: 1:00 - loss: 1.4343 - regression_loss: 1.2007 - classification_loss: 0.2336 263/500 [==============>...............] - ETA: 59s - loss: 1.4365 - regression_loss: 1.2024 - classification_loss: 0.2341  264/500 [==============>...............] - ETA: 59s - loss: 1.4376 - regression_loss: 1.2034 - classification_loss: 0.2342 265/500 [==============>...............] - ETA: 59s - loss: 1.4393 - regression_loss: 1.2048 - classification_loss: 0.2345 266/500 [==============>...............] - ETA: 59s - loss: 1.4389 - regression_loss: 1.2045 - classification_loss: 0.2344 267/500 [===============>..............] - ETA: 58s - loss: 1.4411 - regression_loss: 1.2064 - classification_loss: 0.2347 268/500 [===============>..............] - ETA: 58s - loss: 1.4392 - regression_loss: 1.2047 - classification_loss: 0.2345 269/500 [===============>..............] - ETA: 58s - loss: 1.4379 - regression_loss: 1.2035 - classification_loss: 0.2343 270/500 [===============>..............] - ETA: 58s - loss: 1.4383 - regression_loss: 1.2040 - classification_loss: 0.2344 271/500 [===============>..............] - ETA: 57s - loss: 1.4364 - regression_loss: 1.2023 - classification_loss: 0.2340 272/500 [===============>..............] - ETA: 57s - loss: 1.4372 - regression_loss: 1.2034 - classification_loss: 0.2338 273/500 [===============>..............] - ETA: 57s - loss: 1.4387 - regression_loss: 1.2046 - classification_loss: 0.2341 274/500 [===============>..............] - ETA: 57s - loss: 1.4359 - regression_loss: 1.2023 - classification_loss: 0.2336 275/500 [===============>..............] - ETA: 56s - loss: 1.4356 - regression_loss: 1.2019 - classification_loss: 0.2337 276/500 [===============>..............] 
- ETA: 56s - loss: 1.4360 - regression_loss: 1.2023 - classification_loss: 0.2337 277/500 [===============>..............] - ETA: 56s - loss: 1.4361 - regression_loss: 1.2026 - classification_loss: 0.2336 278/500 [===============>..............] - ETA: 56s - loss: 1.4336 - regression_loss: 1.2005 - classification_loss: 0.2331 279/500 [===============>..............] - ETA: 55s - loss: 1.4331 - regression_loss: 1.2001 - classification_loss: 0.2330 280/500 [===============>..............] - ETA: 55s - loss: 1.4373 - regression_loss: 1.2036 - classification_loss: 0.2337 281/500 [===============>..............] - ETA: 55s - loss: 1.4399 - regression_loss: 1.2053 - classification_loss: 0.2346 282/500 [===============>..............] - ETA: 55s - loss: 1.4410 - regression_loss: 1.2062 - classification_loss: 0.2348 283/500 [===============>..............] - ETA: 54s - loss: 1.4423 - regression_loss: 1.2073 - classification_loss: 0.2350 284/500 [================>.............] - ETA: 54s - loss: 1.4419 - regression_loss: 1.2069 - classification_loss: 0.2350 285/500 [================>.............] - ETA: 54s - loss: 1.4426 - regression_loss: 1.2073 - classification_loss: 0.2353 286/500 [================>.............] - ETA: 54s - loss: 1.4416 - regression_loss: 1.2066 - classification_loss: 0.2351 287/500 [================>.............] - ETA: 53s - loss: 1.4418 - regression_loss: 1.2066 - classification_loss: 0.2352 288/500 [================>.............] - ETA: 53s - loss: 1.4391 - regression_loss: 1.2045 - classification_loss: 0.2346 289/500 [================>.............] - ETA: 53s - loss: 1.4403 - regression_loss: 1.2055 - classification_loss: 0.2348 290/500 [================>.............] - ETA: 53s - loss: 1.4413 - regression_loss: 1.2064 - classification_loss: 0.2349 291/500 [================>.............] - ETA: 52s - loss: 1.4432 - regression_loss: 1.2075 - classification_loss: 0.2356 292/500 [================>.............] 
- ETA: 52s - loss: 1.4441 - regression_loss: 1.2084 - classification_loss: 0.2357 293/500 [================>.............] - ETA: 52s - loss: 1.4443 - regression_loss: 1.2087 - classification_loss: 0.2355 294/500 [================>.............] - ETA: 51s - loss: 1.4450 - regression_loss: 1.2094 - classification_loss: 0.2356 295/500 [================>.............] - ETA: 51s - loss: 1.4442 - regression_loss: 1.2090 - classification_loss: 0.2352 296/500 [================>.............] - ETA: 51s - loss: 1.4433 - regression_loss: 1.2083 - classification_loss: 0.2350 297/500 [================>.............] - ETA: 51s - loss: 1.4416 - regression_loss: 1.2068 - classification_loss: 0.2348 298/500 [================>.............] - ETA: 50s - loss: 1.4437 - regression_loss: 1.2083 - classification_loss: 0.2353 299/500 [================>.............] - ETA: 50s - loss: 1.4465 - regression_loss: 1.2103 - classification_loss: 0.2361 300/500 [=================>............] - ETA: 50s - loss: 1.4467 - regression_loss: 1.2106 - classification_loss: 0.2360 301/500 [=================>............] - ETA: 50s - loss: 1.4444 - regression_loss: 1.2088 - classification_loss: 0.2356 302/500 [=================>............] - ETA: 49s - loss: 1.4449 - regression_loss: 1.2095 - classification_loss: 0.2354 303/500 [=================>............] - ETA: 49s - loss: 1.4449 - regression_loss: 1.2093 - classification_loss: 0.2356 304/500 [=================>............] - ETA: 49s - loss: 1.4459 - regression_loss: 1.2104 - classification_loss: 0.2355 305/500 [=================>............] - ETA: 49s - loss: 1.4480 - regression_loss: 1.2123 - classification_loss: 0.2357 306/500 [=================>............] - ETA: 48s - loss: 1.4472 - regression_loss: 1.2116 - classification_loss: 0.2356 307/500 [=================>............] - ETA: 48s - loss: 1.4466 - regression_loss: 1.2114 - classification_loss: 0.2352 308/500 [=================>............] 
- ETA: 48s - loss: 1.4461 - regression_loss: 1.2109 - classification_loss: 0.2353 309/500 [=================>............] - ETA: 48s - loss: 1.4463 - regression_loss: 1.2111 - classification_loss: 0.2352 310/500 [=================>............] - ETA: 47s - loss: 1.4441 - regression_loss: 1.2087 - classification_loss: 0.2354 311/500 [=================>............] - ETA: 47s - loss: 1.4433 - regression_loss: 1.2081 - classification_loss: 0.2352 312/500 [=================>............] - ETA: 47s - loss: 1.4432 - regression_loss: 1.2080 - classification_loss: 0.2352 313/500 [=================>............] - ETA: 47s - loss: 1.4423 - regression_loss: 1.2076 - classification_loss: 0.2347 314/500 [=================>............] - ETA: 46s - loss: 1.4427 - regression_loss: 1.2081 - classification_loss: 0.2346 315/500 [=================>............] - ETA: 46s - loss: 1.4416 - regression_loss: 1.2070 - classification_loss: 0.2347 316/500 [=================>............] - ETA: 46s - loss: 1.4412 - regression_loss: 1.2067 - classification_loss: 0.2345 317/500 [==================>...........] - ETA: 46s - loss: 1.4406 - regression_loss: 1.2059 - classification_loss: 0.2347 318/500 [==================>...........] - ETA: 45s - loss: 1.4398 - regression_loss: 1.2052 - classification_loss: 0.2346 319/500 [==================>...........] - ETA: 45s - loss: 1.4400 - regression_loss: 1.2053 - classification_loss: 0.2347 320/500 [==================>...........] - ETA: 45s - loss: 1.4417 - regression_loss: 1.2065 - classification_loss: 0.2353 321/500 [==================>...........] - ETA: 45s - loss: 1.4424 - regression_loss: 1.2071 - classification_loss: 0.2353 322/500 [==================>...........] - ETA: 44s - loss: 1.4425 - regression_loss: 1.2069 - classification_loss: 0.2356 323/500 [==================>...........] - ETA: 44s - loss: 1.4442 - regression_loss: 1.2083 - classification_loss: 0.2359 324/500 [==================>...........] 
- ETA: 44s - loss: 1.4440 - regression_loss: 1.2082 - classification_loss: 0.2358 325/500 [==================>...........] - ETA: 44s - loss: 1.4434 - regression_loss: 1.2074 - classification_loss: 0.2360 326/500 [==================>...........] - ETA: 43s - loss: 1.4439 - regression_loss: 1.2079 - classification_loss: 0.2361 327/500 [==================>...........] - ETA: 43s - loss: 1.4426 - regression_loss: 1.2068 - classification_loss: 0.2358 328/500 [==================>...........] - ETA: 43s - loss: 1.4434 - regression_loss: 1.2076 - classification_loss: 0.2358 329/500 [==================>...........] - ETA: 43s - loss: 1.4449 - regression_loss: 1.2089 - classification_loss: 0.2360 330/500 [==================>...........] - ETA: 42s - loss: 1.4460 - regression_loss: 1.2097 - classification_loss: 0.2363 331/500 [==================>...........] - ETA: 42s - loss: 1.4468 - regression_loss: 1.2104 - classification_loss: 0.2365 332/500 [==================>...........] - ETA: 42s - loss: 1.4490 - regression_loss: 1.2122 - classification_loss: 0.2368 333/500 [==================>...........] - ETA: 42s - loss: 1.4485 - regression_loss: 1.2118 - classification_loss: 0.2367 334/500 [===================>..........] - ETA: 41s - loss: 1.4505 - regression_loss: 1.2131 - classification_loss: 0.2374 335/500 [===================>..........] - ETA: 41s - loss: 1.4502 - regression_loss: 1.2129 - classification_loss: 0.2373 336/500 [===================>..........] - ETA: 41s - loss: 1.4510 - regression_loss: 1.2137 - classification_loss: 0.2373 337/500 [===================>..........] - ETA: 41s - loss: 1.4493 - regression_loss: 1.2123 - classification_loss: 0.2370 338/500 [===================>..........] - ETA: 40s - loss: 1.4495 - regression_loss: 1.2123 - classification_loss: 0.2372 339/500 [===================>..........] - ETA: 40s - loss: 1.4498 - regression_loss: 1.2126 - classification_loss: 0.2371 340/500 [===================>..........] 
- ETA: 40s - loss: 1.4500 - regression_loss: 1.2129 - classification_loss: 0.2371 341/500 [===================>..........] - ETA: 40s - loss: 1.4504 - regression_loss: 1.2135 - classification_loss: 0.2369 342/500 [===================>..........] - ETA: 39s - loss: 1.4506 - regression_loss: 1.2137 - classification_loss: 0.2369 343/500 [===================>..........] - ETA: 39s - loss: 1.4510 - regression_loss: 1.2138 - classification_loss: 0.2372 344/500 [===================>..........] - ETA: 39s - loss: 1.4479 - regression_loss: 1.2113 - classification_loss: 0.2366 345/500 [===================>..........] - ETA: 39s - loss: 1.4449 - regression_loss: 1.2088 - classification_loss: 0.2361 346/500 [===================>..........] - ETA: 38s - loss: 1.4442 - regression_loss: 1.2086 - classification_loss: 0.2356 347/500 [===================>..........] - ETA: 38s - loss: 1.4431 - regression_loss: 1.2078 - classification_loss: 0.2353 348/500 [===================>..........] - ETA: 38s - loss: 1.4450 - regression_loss: 1.2093 - classification_loss: 0.2357 349/500 [===================>..........] - ETA: 38s - loss: 1.4444 - regression_loss: 1.2089 - classification_loss: 0.2355 350/500 [====================>.........] - ETA: 37s - loss: 1.4451 - regression_loss: 1.2095 - classification_loss: 0.2356 351/500 [====================>.........] - ETA: 37s - loss: 1.4455 - regression_loss: 1.2099 - classification_loss: 0.2356 352/500 [====================>.........] - ETA: 37s - loss: 1.4451 - regression_loss: 1.2097 - classification_loss: 0.2354 353/500 [====================>.........] - ETA: 37s - loss: 1.4456 - regression_loss: 1.2102 - classification_loss: 0.2354 354/500 [====================>.........] - ETA: 36s - loss: 1.4439 - regression_loss: 1.2088 - classification_loss: 0.2351 355/500 [====================>.........] - ETA: 36s - loss: 1.4460 - regression_loss: 1.2097 - classification_loss: 0.2363 356/500 [====================>.........] 
- ETA: 36s - loss: 1.4473 - regression_loss: 1.2108 - classification_loss: 0.2365 357/500 [====================>.........] - ETA: 36s - loss: 1.4463 - regression_loss: 1.2101 - classification_loss: 0.2362 358/500 [====================>.........] - ETA: 35s - loss: 1.4465 - regression_loss: 1.2104 - classification_loss: 0.2361 359/500 [====================>.........] - ETA: 35s - loss: 1.4464 - regression_loss: 1.2105 - classification_loss: 0.2359 360/500 [====================>.........] - ETA: 35s - loss: 1.4449 - regression_loss: 1.2093 - classification_loss: 0.2357 361/500 [====================>.........] - ETA: 35s - loss: 1.4429 - regression_loss: 1.2075 - classification_loss: 0.2354 362/500 [====================>.........] - ETA: 34s - loss: 1.4432 - regression_loss: 1.2078 - classification_loss: 0.2355 363/500 [====================>.........] - ETA: 34s - loss: 1.4421 - regression_loss: 1.2069 - classification_loss: 0.2352 364/500 [====================>.........] - ETA: 34s - loss: 1.4406 - regression_loss: 1.2057 - classification_loss: 0.2349 365/500 [====================>.........] - ETA: 34s - loss: 1.4412 - regression_loss: 1.2064 - classification_loss: 0.2349 366/500 [====================>.........] - ETA: 33s - loss: 1.4412 - regression_loss: 1.2064 - classification_loss: 0.2347 367/500 [=====================>........] - ETA: 33s - loss: 1.4416 - regression_loss: 1.2070 - classification_loss: 0.2346 368/500 [=====================>........] - ETA: 33s - loss: 1.4389 - regression_loss: 1.2048 - classification_loss: 0.2342 369/500 [=====================>........] - ETA: 33s - loss: 1.4396 - regression_loss: 1.2057 - classification_loss: 0.2340 370/500 [=====================>........] - ETA: 32s - loss: 1.4386 - regression_loss: 1.2047 - classification_loss: 0.2339 371/500 [=====================>........] - ETA: 32s - loss: 1.4378 - regression_loss: 1.2040 - classification_loss: 0.2337 372/500 [=====================>........] 
- ETA: 32s - loss: 1.4355 - regression_loss: 1.2022 - classification_loss: 0.2333 373/500 [=====================>........] - ETA: 32s - loss: 1.4372 - regression_loss: 1.2033 - classification_loss: 0.2339 374/500 [=====================>........] - ETA: 31s - loss: 1.4351 - regression_loss: 1.2015 - classification_loss: 0.2336 375/500 [=====================>........] - ETA: 31s - loss: 1.4354 - regression_loss: 1.2019 - classification_loss: 0.2336 376/500 [=====================>........] - ETA: 31s - loss: 1.4362 - regression_loss: 1.2024 - classification_loss: 0.2338 377/500 [=====================>........] - ETA: 31s - loss: 1.4376 - regression_loss: 1.2033 - classification_loss: 0.2343 378/500 [=====================>........] - ETA: 30s - loss: 1.4383 - regression_loss: 1.2039 - classification_loss: 0.2344 379/500 [=====================>........] - ETA: 30s - loss: 1.4375 - regression_loss: 1.2034 - classification_loss: 0.2342 380/500 [=====================>........] - ETA: 30s - loss: 1.4378 - regression_loss: 1.2037 - classification_loss: 0.2341 381/500 [=====================>........] - ETA: 29s - loss: 1.4389 - regression_loss: 1.2045 - classification_loss: 0.2343 382/500 [=====================>........] - ETA: 29s - loss: 1.4411 - regression_loss: 1.2055 - classification_loss: 0.2356 383/500 [=====================>........] - ETA: 29s - loss: 1.4425 - regression_loss: 1.2067 - classification_loss: 0.2358 384/500 [======================>.......] - ETA: 29s - loss: 1.4405 - regression_loss: 1.2052 - classification_loss: 0.2354 385/500 [======================>.......] - ETA: 28s - loss: 1.4412 - regression_loss: 1.2057 - classification_loss: 0.2354 386/500 [======================>.......] - ETA: 28s - loss: 1.4414 - regression_loss: 1.2059 - classification_loss: 0.2355 387/500 [======================>.......] - ETA: 28s - loss: 1.4454 - regression_loss: 1.2096 - classification_loss: 0.2359 388/500 [======================>.......] 
[per-step progress output for epoch 108, steps 389-499, elided; loss ~1.44, regression_loss ~1.21, classification_loss ~0.235]
500/500 [==============================] - 126s 252ms/step - loss: 1.4423 - regression_loss: 1.2085 - classification_loss: 0.2339
1172 instances of class plum with average precision: 0.6749
mAP: 0.6749
Epoch 00108: saving model to ./training/snapshots/resnet50_pascal_108.h5
Epoch 109/150
[per-step progress output for epoch 109, steps 1-222, elided; loss fluctuating between ~1.37 and ~1.87 over the first steps, settling near ~1.43-1.47]
- ETA: 1:09 - loss: 1.4338 - regression_loss: 1.2071 - classification_loss: 0.2267 223/500 [============>.................] - ETA: 1:09 - loss: 1.4359 - regression_loss: 1.2089 - classification_loss: 0.2270 224/500 [============>.................] - ETA: 1:08 - loss: 1.4379 - regression_loss: 1.2105 - classification_loss: 0.2274 225/500 [============>.................] - ETA: 1:08 - loss: 1.4400 - regression_loss: 1.2120 - classification_loss: 0.2280 226/500 [============>.................] - ETA: 1:08 - loss: 1.4405 - regression_loss: 1.2125 - classification_loss: 0.2280 227/500 [============>.................] - ETA: 1:08 - loss: 1.4375 - regression_loss: 1.2099 - classification_loss: 0.2276 228/500 [============>.................] - ETA: 1:07 - loss: 1.4410 - regression_loss: 1.2129 - classification_loss: 0.2281 229/500 [============>.................] - ETA: 1:07 - loss: 1.4412 - regression_loss: 1.2132 - classification_loss: 0.2280 230/500 [============>.................] - ETA: 1:07 - loss: 1.4440 - regression_loss: 1.2150 - classification_loss: 0.2290 231/500 [============>.................] - ETA: 1:07 - loss: 1.4455 - regression_loss: 1.2161 - classification_loss: 0.2294 232/500 [============>.................] - ETA: 1:06 - loss: 1.4469 - regression_loss: 1.2177 - classification_loss: 0.2292 233/500 [============>.................] - ETA: 1:06 - loss: 1.4463 - regression_loss: 1.2170 - classification_loss: 0.2292 234/500 [=============>................] - ETA: 1:06 - loss: 1.4424 - regression_loss: 1.2138 - classification_loss: 0.2286 235/500 [=============>................] - ETA: 1:06 - loss: 1.4433 - regression_loss: 1.2146 - classification_loss: 0.2288 236/500 [=============>................] - ETA: 1:05 - loss: 1.4392 - regression_loss: 1.2111 - classification_loss: 0.2280 237/500 [=============>................] - ETA: 1:05 - loss: 1.4398 - regression_loss: 1.2117 - classification_loss: 0.2281 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.4410 - regression_loss: 1.2127 - classification_loss: 0.2283 239/500 [=============>................] - ETA: 1:05 - loss: 1.4435 - regression_loss: 1.2145 - classification_loss: 0.2291 240/500 [=============>................] - ETA: 1:05 - loss: 1.4407 - regression_loss: 1.2121 - classification_loss: 0.2286 241/500 [=============>................] - ETA: 1:04 - loss: 1.4424 - regression_loss: 1.2134 - classification_loss: 0.2290 242/500 [=============>................] - ETA: 1:04 - loss: 1.4439 - regression_loss: 1.2147 - classification_loss: 0.2293 243/500 [=============>................] - ETA: 1:04 - loss: 1.4420 - regression_loss: 1.2132 - classification_loss: 0.2288 244/500 [=============>................] - ETA: 1:04 - loss: 1.4417 - regression_loss: 1.2131 - classification_loss: 0.2286 245/500 [=============>................] - ETA: 1:03 - loss: 1.4398 - regression_loss: 1.2118 - classification_loss: 0.2280 246/500 [=============>................] - ETA: 1:03 - loss: 1.4406 - regression_loss: 1.2125 - classification_loss: 0.2281 247/500 [=============>................] - ETA: 1:03 - loss: 1.4418 - regression_loss: 1.2135 - classification_loss: 0.2283 248/500 [=============>................] - ETA: 1:03 - loss: 1.4397 - regression_loss: 1.2119 - classification_loss: 0.2279 249/500 [=============>................] - ETA: 1:02 - loss: 1.4412 - regression_loss: 1.2132 - classification_loss: 0.2280 250/500 [==============>...............] - ETA: 1:02 - loss: 1.4387 - regression_loss: 1.2110 - classification_loss: 0.2277 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4383 - regression_loss: 1.2105 - classification_loss: 0.2278 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4350 - regression_loss: 1.2079 - classification_loss: 0.2271 253/500 [==============>...............] - ETA: 1:01 - loss: 1.4352 - regression_loss: 1.2080 - classification_loss: 0.2272 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.4317 - regression_loss: 1.2052 - classification_loss: 0.2265 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4317 - regression_loss: 1.2053 - classification_loss: 0.2263 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4292 - regression_loss: 1.2029 - classification_loss: 0.2263 257/500 [==============>...............] - ETA: 1:00 - loss: 1.4303 - regression_loss: 1.2038 - classification_loss: 0.2266 258/500 [==============>...............] - ETA: 1:00 - loss: 1.4303 - regression_loss: 1.2037 - classification_loss: 0.2266 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4303 - regression_loss: 1.2038 - classification_loss: 0.2265 260/500 [==============>...............] - ETA: 1:00 - loss: 1.4313 - regression_loss: 1.2049 - classification_loss: 0.2265 261/500 [==============>...............] - ETA: 59s - loss: 1.4325 - regression_loss: 1.2059 - classification_loss: 0.2266  262/500 [==============>...............] - ETA: 59s - loss: 1.4331 - regression_loss: 1.2062 - classification_loss: 0.2269 263/500 [==============>...............] - ETA: 59s - loss: 1.4338 - regression_loss: 1.2069 - classification_loss: 0.2269 264/500 [==============>...............] - ETA: 59s - loss: 1.4349 - regression_loss: 1.2080 - classification_loss: 0.2270 265/500 [==============>...............] - ETA: 58s - loss: 1.4350 - regression_loss: 1.2080 - classification_loss: 0.2271 266/500 [==============>...............] - ETA: 58s - loss: 1.4352 - regression_loss: 1.2080 - classification_loss: 0.2272 267/500 [===============>..............] - ETA: 58s - loss: 1.4367 - regression_loss: 1.2093 - classification_loss: 0.2274 268/500 [===============>..............] - ETA: 58s - loss: 1.4378 - regression_loss: 1.2103 - classification_loss: 0.2275 269/500 [===============>..............] - ETA: 57s - loss: 1.4379 - regression_loss: 1.2104 - classification_loss: 0.2275 270/500 [===============>..............] 
- ETA: 57s - loss: 1.4378 - regression_loss: 1.2103 - classification_loss: 0.2276 271/500 [===============>..............] - ETA: 57s - loss: 1.4363 - regression_loss: 1.2089 - classification_loss: 0.2274 272/500 [===============>..............] - ETA: 57s - loss: 1.4375 - regression_loss: 1.2099 - classification_loss: 0.2276 273/500 [===============>..............] - ETA: 56s - loss: 1.4371 - regression_loss: 1.2091 - classification_loss: 0.2281 274/500 [===============>..............] - ETA: 56s - loss: 1.4359 - regression_loss: 1.2084 - classification_loss: 0.2275 275/500 [===============>..............] - ETA: 56s - loss: 1.4379 - regression_loss: 1.2102 - classification_loss: 0.2277 276/500 [===============>..............] - ETA: 56s - loss: 1.4388 - regression_loss: 1.2110 - classification_loss: 0.2278 277/500 [===============>..............] - ETA: 55s - loss: 1.4396 - regression_loss: 1.2114 - classification_loss: 0.2282 278/500 [===============>..............] - ETA: 55s - loss: 1.4392 - regression_loss: 1.2111 - classification_loss: 0.2281 279/500 [===============>..............] - ETA: 55s - loss: 1.4415 - regression_loss: 1.2129 - classification_loss: 0.2286 280/500 [===============>..............] - ETA: 55s - loss: 1.4409 - regression_loss: 1.2124 - classification_loss: 0.2285 281/500 [===============>..............] - ETA: 54s - loss: 1.4389 - regression_loss: 1.2108 - classification_loss: 0.2281 282/500 [===============>..............] - ETA: 54s - loss: 1.4371 - regression_loss: 1.2093 - classification_loss: 0.2278 283/500 [===============>..............] - ETA: 54s - loss: 1.4379 - regression_loss: 1.2101 - classification_loss: 0.2278 284/500 [================>.............] - ETA: 54s - loss: 1.4381 - regression_loss: 1.2103 - classification_loss: 0.2278 285/500 [================>.............] - ETA: 53s - loss: 1.4387 - regression_loss: 1.2108 - classification_loss: 0.2279 286/500 [================>.............] 
- ETA: 53s - loss: 1.4411 - regression_loss: 1.2126 - classification_loss: 0.2284 287/500 [================>.............] - ETA: 53s - loss: 1.4406 - regression_loss: 1.2122 - classification_loss: 0.2284 288/500 [================>.............] - ETA: 53s - loss: 1.4412 - regression_loss: 1.2127 - classification_loss: 0.2285 289/500 [================>.............] - ETA: 52s - loss: 1.4389 - regression_loss: 1.2109 - classification_loss: 0.2280 290/500 [================>.............] - ETA: 52s - loss: 1.4382 - regression_loss: 1.2099 - classification_loss: 0.2283 291/500 [================>.............] - ETA: 52s - loss: 1.4382 - regression_loss: 1.2100 - classification_loss: 0.2282 292/500 [================>.............] - ETA: 52s - loss: 1.4390 - regression_loss: 1.2107 - classification_loss: 0.2284 293/500 [================>.............] - ETA: 51s - loss: 1.4386 - regression_loss: 1.2105 - classification_loss: 0.2282 294/500 [================>.............] - ETA: 51s - loss: 1.4390 - regression_loss: 1.2106 - classification_loss: 0.2284 295/500 [================>.............] - ETA: 51s - loss: 1.4374 - regression_loss: 1.2092 - classification_loss: 0.2281 296/500 [================>.............] - ETA: 51s - loss: 1.4382 - regression_loss: 1.2099 - classification_loss: 0.2283 297/500 [================>.............] - ETA: 50s - loss: 1.4379 - regression_loss: 1.2097 - classification_loss: 0.2282 298/500 [================>.............] - ETA: 50s - loss: 1.4358 - regression_loss: 1.2079 - classification_loss: 0.2279 299/500 [================>.............] - ETA: 50s - loss: 1.4367 - regression_loss: 1.2087 - classification_loss: 0.2280 300/500 [=================>............] - ETA: 50s - loss: 1.4366 - regression_loss: 1.2085 - classification_loss: 0.2281 301/500 [=================>............] - ETA: 49s - loss: 1.4360 - regression_loss: 1.2080 - classification_loss: 0.2280 302/500 [=================>............] 
- ETA: 49s - loss: 1.4361 - regression_loss: 1.2083 - classification_loss: 0.2278 303/500 [=================>............] - ETA: 49s - loss: 1.4369 - regression_loss: 1.2089 - classification_loss: 0.2280 304/500 [=================>............] - ETA: 49s - loss: 1.4348 - regression_loss: 1.2072 - classification_loss: 0.2276 305/500 [=================>............] - ETA: 48s - loss: 1.4357 - regression_loss: 1.2083 - classification_loss: 0.2274 306/500 [=================>............] - ETA: 48s - loss: 1.4375 - regression_loss: 1.2094 - classification_loss: 0.2281 307/500 [=================>............] - ETA: 48s - loss: 1.4395 - regression_loss: 1.2110 - classification_loss: 0.2285 308/500 [=================>............] - ETA: 48s - loss: 1.4395 - regression_loss: 1.2112 - classification_loss: 0.2283 309/500 [=================>............] - ETA: 47s - loss: 1.4421 - regression_loss: 1.2129 - classification_loss: 0.2292 310/500 [=================>............] - ETA: 47s - loss: 1.4439 - regression_loss: 1.2143 - classification_loss: 0.2296 311/500 [=================>............] - ETA: 47s - loss: 1.4458 - regression_loss: 1.2158 - classification_loss: 0.2300 312/500 [=================>............] - ETA: 47s - loss: 1.4462 - regression_loss: 1.2162 - classification_loss: 0.2300 313/500 [=================>............] - ETA: 46s - loss: 1.4469 - regression_loss: 1.2169 - classification_loss: 0.2300 314/500 [=================>............] - ETA: 46s - loss: 1.4462 - regression_loss: 1.2163 - classification_loss: 0.2299 315/500 [=================>............] - ETA: 46s - loss: 1.4474 - regression_loss: 1.2175 - classification_loss: 0.2300 316/500 [=================>............] - ETA: 46s - loss: 1.4471 - regression_loss: 1.2173 - classification_loss: 0.2298 317/500 [==================>...........] - ETA: 45s - loss: 1.4457 - regression_loss: 1.2160 - classification_loss: 0.2296 318/500 [==================>...........] 
- ETA: 45s - loss: 1.4455 - regression_loss: 1.2159 - classification_loss: 0.2295 319/500 [==================>...........] - ETA: 45s - loss: 1.4454 - regression_loss: 1.2160 - classification_loss: 0.2294 320/500 [==================>...........] - ETA: 45s - loss: 1.4473 - regression_loss: 1.2173 - classification_loss: 0.2300 321/500 [==================>...........] - ETA: 44s - loss: 1.4468 - regression_loss: 1.2167 - classification_loss: 0.2301 322/500 [==================>...........] - ETA: 44s - loss: 1.4462 - regression_loss: 1.2156 - classification_loss: 0.2306 323/500 [==================>...........] - ETA: 44s - loss: 1.4445 - regression_loss: 1.2142 - classification_loss: 0.2303 324/500 [==================>...........] - ETA: 44s - loss: 1.4441 - regression_loss: 1.2138 - classification_loss: 0.2302 325/500 [==================>...........] - ETA: 43s - loss: 1.4416 - regression_loss: 1.2117 - classification_loss: 0.2298 326/500 [==================>...........] - ETA: 43s - loss: 1.4417 - regression_loss: 1.2118 - classification_loss: 0.2299 327/500 [==================>...........] - ETA: 43s - loss: 1.4420 - regression_loss: 1.2120 - classification_loss: 0.2299 328/500 [==================>...........] - ETA: 43s - loss: 1.4421 - regression_loss: 1.2122 - classification_loss: 0.2299 329/500 [==================>...........] - ETA: 42s - loss: 1.4407 - regression_loss: 1.2110 - classification_loss: 0.2296 330/500 [==================>...........] - ETA: 42s - loss: 1.4406 - regression_loss: 1.2110 - classification_loss: 0.2296 331/500 [==================>...........] - ETA: 42s - loss: 1.4390 - regression_loss: 1.2098 - classification_loss: 0.2292 332/500 [==================>...........] - ETA: 42s - loss: 1.4387 - regression_loss: 1.2093 - classification_loss: 0.2294 333/500 [==================>...........] - ETA: 41s - loss: 1.4386 - regression_loss: 1.2092 - classification_loss: 0.2294 334/500 [===================>..........] 
- ETA: 41s - loss: 1.4380 - regression_loss: 1.2087 - classification_loss: 0.2293 335/500 [===================>..........] - ETA: 41s - loss: 1.4362 - regression_loss: 1.2074 - classification_loss: 0.2288 336/500 [===================>..........] - ETA: 41s - loss: 1.4368 - regression_loss: 1.2080 - classification_loss: 0.2287 337/500 [===================>..........] - ETA: 40s - loss: 1.4369 - regression_loss: 1.2081 - classification_loss: 0.2288 338/500 [===================>..........] - ETA: 40s - loss: 1.4382 - regression_loss: 1.2092 - classification_loss: 0.2290 339/500 [===================>..........] - ETA: 40s - loss: 1.4390 - regression_loss: 1.2099 - classification_loss: 0.2291 340/500 [===================>..........] - ETA: 40s - loss: 1.4410 - regression_loss: 1.2112 - classification_loss: 0.2298 341/500 [===================>..........] - ETA: 39s - loss: 1.4420 - regression_loss: 1.2121 - classification_loss: 0.2299 342/500 [===================>..........] - ETA: 39s - loss: 1.4431 - regression_loss: 1.2130 - classification_loss: 0.2301 343/500 [===================>..........] - ETA: 39s - loss: 1.4443 - regression_loss: 1.2136 - classification_loss: 0.2307 344/500 [===================>..........] - ETA: 39s - loss: 1.4422 - regression_loss: 1.2119 - classification_loss: 0.2303 345/500 [===================>..........] - ETA: 38s - loss: 1.4450 - regression_loss: 1.2146 - classification_loss: 0.2305 346/500 [===================>..........] - ETA: 38s - loss: 1.4447 - regression_loss: 1.2143 - classification_loss: 0.2304 347/500 [===================>..........] - ETA: 38s - loss: 1.4435 - regression_loss: 1.2134 - classification_loss: 0.2301 348/500 [===================>..........] - ETA: 38s - loss: 1.4439 - regression_loss: 1.2136 - classification_loss: 0.2303 349/500 [===================>..........] - ETA: 37s - loss: 1.4445 - regression_loss: 1.2140 - classification_loss: 0.2305 350/500 [====================>.........] 
- ETA: 37s - loss: 1.4433 - regression_loss: 1.2132 - classification_loss: 0.2301 351/500 [====================>.........] - ETA: 37s - loss: 1.4447 - regression_loss: 1.2144 - classification_loss: 0.2303 352/500 [====================>.........] - ETA: 37s - loss: 1.4451 - regression_loss: 1.2146 - classification_loss: 0.2306 353/500 [====================>.........] - ETA: 36s - loss: 1.4452 - regression_loss: 1.2146 - classification_loss: 0.2306 354/500 [====================>.........] - ETA: 36s - loss: 1.4483 - regression_loss: 1.2173 - classification_loss: 0.2310 355/500 [====================>.........] - ETA: 36s - loss: 1.4472 - regression_loss: 1.2164 - classification_loss: 0.2308 356/500 [====================>.........] - ETA: 36s - loss: 1.4477 - regression_loss: 1.2168 - classification_loss: 0.2309 357/500 [====================>.........] - ETA: 35s - loss: 1.4469 - regression_loss: 1.2162 - classification_loss: 0.2307 358/500 [====================>.........] - ETA: 35s - loss: 1.4454 - regression_loss: 1.2151 - classification_loss: 0.2303 359/500 [====================>.........] - ETA: 35s - loss: 1.4473 - regression_loss: 1.2166 - classification_loss: 0.2307 360/500 [====================>.........] - ETA: 35s - loss: 1.4471 - regression_loss: 1.2164 - classification_loss: 0.2307 361/500 [====================>.........] - ETA: 34s - loss: 1.4465 - regression_loss: 1.2159 - classification_loss: 0.2306 362/500 [====================>.........] - ETA: 34s - loss: 1.4468 - regression_loss: 1.2161 - classification_loss: 0.2307 363/500 [====================>.........] - ETA: 34s - loss: 1.4480 - regression_loss: 1.2171 - classification_loss: 0.2308 364/500 [====================>.........] - ETA: 34s - loss: 1.4483 - regression_loss: 1.2174 - classification_loss: 0.2310 365/500 [====================>.........] - ETA: 33s - loss: 1.4489 - regression_loss: 1.2178 - classification_loss: 0.2311 366/500 [====================>.........] 
- ETA: 33s - loss: 1.4471 - regression_loss: 1.2163 - classification_loss: 0.2308 367/500 [=====================>........] - ETA: 33s - loss: 1.4464 - regression_loss: 1.2160 - classification_loss: 0.2304 368/500 [=====================>........] - ETA: 33s - loss: 1.4441 - regression_loss: 1.2142 - classification_loss: 0.2300 369/500 [=====================>........] - ETA: 32s - loss: 1.4454 - regression_loss: 1.2152 - classification_loss: 0.2302 370/500 [=====================>........] - ETA: 32s - loss: 1.4452 - regression_loss: 1.2152 - classification_loss: 0.2300 371/500 [=====================>........] - ETA: 32s - loss: 1.4447 - regression_loss: 1.2149 - classification_loss: 0.2298 372/500 [=====================>........] - ETA: 32s - loss: 1.4458 - regression_loss: 1.2157 - classification_loss: 0.2301 373/500 [=====================>........] - ETA: 31s - loss: 1.4464 - regression_loss: 1.2163 - classification_loss: 0.2301 374/500 [=====================>........] - ETA: 31s - loss: 1.4455 - regression_loss: 1.2155 - classification_loss: 0.2300 375/500 [=====================>........] - ETA: 31s - loss: 1.4457 - regression_loss: 1.2159 - classification_loss: 0.2298 376/500 [=====================>........] - ETA: 31s - loss: 1.4436 - regression_loss: 1.2142 - classification_loss: 0.2294 377/500 [=====================>........] - ETA: 30s - loss: 1.4434 - regression_loss: 1.2138 - classification_loss: 0.2297 378/500 [=====================>........] - ETA: 30s - loss: 1.4425 - regression_loss: 1.2129 - classification_loss: 0.2297 379/500 [=====================>........] - ETA: 30s - loss: 1.4423 - regression_loss: 1.2129 - classification_loss: 0.2294 380/500 [=====================>........] - ETA: 30s - loss: 1.4429 - regression_loss: 1.2135 - classification_loss: 0.2294 381/500 [=====================>........] - ETA: 29s - loss: 1.4425 - regression_loss: 1.2133 - classification_loss: 0.2292 382/500 [=====================>........] 
- ETA: 29s - loss: 1.4411 - regression_loss: 1.2121 - classification_loss: 0.2290 383/500 [=====================>........] - ETA: 29s - loss: 1.4402 - regression_loss: 1.2111 - classification_loss: 0.2291 384/500 [======================>.......] - ETA: 29s - loss: 1.4398 - regression_loss: 1.2108 - classification_loss: 0.2290 385/500 [======================>.......] - ETA: 28s - loss: 1.4400 - regression_loss: 1.2108 - classification_loss: 0.2292 386/500 [======================>.......] - ETA: 28s - loss: 1.4399 - regression_loss: 1.2107 - classification_loss: 0.2292 387/500 [======================>.......] - ETA: 28s - loss: 1.4388 - regression_loss: 1.2098 - classification_loss: 0.2290 388/500 [======================>.......] - ETA: 28s - loss: 1.4382 - regression_loss: 1.2093 - classification_loss: 0.2289 389/500 [======================>.......] - ETA: 27s - loss: 1.4377 - regression_loss: 1.2088 - classification_loss: 0.2289 390/500 [======================>.......] - ETA: 27s - loss: 1.4383 - regression_loss: 1.2093 - classification_loss: 0.2290 391/500 [======================>.......] - ETA: 27s - loss: 1.4359 - regression_loss: 1.2074 - classification_loss: 0.2286 392/500 [======================>.......] - ETA: 27s - loss: 1.4372 - regression_loss: 1.2085 - classification_loss: 0.2287 393/500 [======================>.......] - ETA: 26s - loss: 1.4361 - regression_loss: 1.2075 - classification_loss: 0.2286 394/500 [======================>.......] - ETA: 26s - loss: 1.4364 - regression_loss: 1.2079 - classification_loss: 0.2285 395/500 [======================>.......] - ETA: 26s - loss: 1.4378 - regression_loss: 1.2089 - classification_loss: 0.2288 396/500 [======================>.......] - ETA: 26s - loss: 1.4388 - regression_loss: 1.2099 - classification_loss: 0.2289 397/500 [======================>.......] - ETA: 25s - loss: 1.4397 - regression_loss: 1.2105 - classification_loss: 0.2292 398/500 [======================>.......] 
- ETA: 25s - loss: 1.4375 - regression_loss: 1.2088 - classification_loss: 0.2288 399/500 [======================>.......] - ETA: 25s - loss: 1.4362 - regression_loss: 1.2076 - classification_loss: 0.2286 400/500 [=======================>......] - ETA: 25s - loss: 1.4364 - regression_loss: 1.2077 - classification_loss: 0.2287 401/500 [=======================>......] - ETA: 24s - loss: 1.4362 - regression_loss: 1.2077 - classification_loss: 0.2285 402/500 [=======================>......] - ETA: 24s - loss: 1.4356 - regression_loss: 1.2072 - classification_loss: 0.2284 403/500 [=======================>......] - ETA: 24s - loss: 1.4364 - regression_loss: 1.2078 - classification_loss: 0.2285 404/500 [=======================>......] - ETA: 24s - loss: 1.4353 - regression_loss: 1.2071 - classification_loss: 0.2282 405/500 [=======================>......] - ETA: 23s - loss: 1.4355 - regression_loss: 1.2072 - classification_loss: 0.2282 406/500 [=======================>......] - ETA: 23s - loss: 1.4340 - regression_loss: 1.2061 - classification_loss: 0.2279 407/500 [=======================>......] - ETA: 23s - loss: 1.4346 - regression_loss: 1.2066 - classification_loss: 0.2280 408/500 [=======================>......] - ETA: 23s - loss: 1.4334 - regression_loss: 1.2055 - classification_loss: 0.2279 409/500 [=======================>......] - ETA: 22s - loss: 1.4316 - regression_loss: 1.2041 - classification_loss: 0.2275 410/500 [=======================>......] - ETA: 22s - loss: 1.4314 - regression_loss: 1.2038 - classification_loss: 0.2276 411/500 [=======================>......] - ETA: 22s - loss: 1.4301 - regression_loss: 1.2028 - classification_loss: 0.2273 412/500 [=======================>......] - ETA: 22s - loss: 1.4299 - regression_loss: 1.2026 - classification_loss: 0.2273 413/500 [=======================>......] - ETA: 21s - loss: 1.4307 - regression_loss: 1.2033 - classification_loss: 0.2274 414/500 [=======================>......] 
- ETA: 21s - loss: 1.4325 - regression_loss: 1.2047 - classification_loss: 0.2278 415/500 [=======================>......] - ETA: 21s - loss: 1.4339 - regression_loss: 1.2056 - classification_loss: 0.2284 416/500 [=======================>......] - ETA: 21s - loss: 1.4338 - regression_loss: 1.2056 - classification_loss: 0.2282 417/500 [========================>.....] - ETA: 20s - loss: 1.4339 - regression_loss: 1.2057 - classification_loss: 0.2281 418/500 [========================>.....] - ETA: 20s - loss: 1.4336 - regression_loss: 1.2055 - classification_loss: 0.2281 419/500 [========================>.....] - ETA: 20s - loss: 1.4323 - regression_loss: 1.2045 - classification_loss: 0.2278 420/500 [========================>.....] - ETA: 20s - loss: 1.4338 - regression_loss: 1.2056 - classification_loss: 0.2282 421/500 [========================>.....] - ETA: 19s - loss: 1.4336 - regression_loss: 1.2056 - classification_loss: 0.2280 422/500 [========================>.....] - ETA: 19s - loss: 1.4347 - regression_loss: 1.2064 - classification_loss: 0.2283 423/500 [========================>.....] - ETA: 19s - loss: 1.4334 - regression_loss: 1.2052 - classification_loss: 0.2282 424/500 [========================>.....] - ETA: 19s - loss: 1.4346 - regression_loss: 1.2061 - classification_loss: 0.2286 425/500 [========================>.....] - ETA: 18s - loss: 1.4380 - regression_loss: 1.2088 - classification_loss: 0.2292 426/500 [========================>.....] - ETA: 18s - loss: 1.4375 - regression_loss: 1.2084 - classification_loss: 0.2291 427/500 [========================>.....] - ETA: 18s - loss: 1.4373 - regression_loss: 1.2084 - classification_loss: 0.2289 428/500 [========================>.....] - ETA: 18s - loss: 1.4370 - regression_loss: 1.2081 - classification_loss: 0.2289 429/500 [========================>.....] - ETA: 17s - loss: 1.4380 - regression_loss: 1.2089 - classification_loss: 0.2290 430/500 [========================>.....] 
500/500 [==============================] - 125s 251ms/step - loss: 1.4440 - regression_loss: 1.2134 - classification_loss: 0.2305
1172 instances of class plum with average precision: 0.6612
mAP: 0.6612
Epoch 00109: saving model to ./training/snapshots/resnet50_pascal_109.h5
Epoch 110/150
265/500 [==============>...............]
- ETA: 58s - loss: 1.4229 - regression_loss: 1.1958 - classification_loss: 0.2270 266/500 [==============>...............] - ETA: 58s - loss: 1.4236 - regression_loss: 1.1965 - classification_loss: 0.2271 267/500 [===============>..............] - ETA: 58s - loss: 1.4242 - regression_loss: 1.1968 - classification_loss: 0.2274 268/500 [===============>..............] - ETA: 58s - loss: 1.4219 - regression_loss: 1.1949 - classification_loss: 0.2270 269/500 [===============>..............] - ETA: 57s - loss: 1.4203 - regression_loss: 1.1938 - classification_loss: 0.2265 270/500 [===============>..............] - ETA: 57s - loss: 1.4211 - regression_loss: 1.1941 - classification_loss: 0.2270 271/500 [===============>..............] - ETA: 57s - loss: 1.4219 - regression_loss: 1.1948 - classification_loss: 0.2271 272/500 [===============>..............] - ETA: 57s - loss: 1.4221 - regression_loss: 1.1951 - classification_loss: 0.2270 273/500 [===============>..............] - ETA: 56s - loss: 1.4229 - regression_loss: 1.1963 - classification_loss: 0.2265 274/500 [===============>..............] - ETA: 56s - loss: 1.4246 - regression_loss: 1.1977 - classification_loss: 0.2269 275/500 [===============>..............] - ETA: 56s - loss: 1.4261 - regression_loss: 1.1991 - classification_loss: 0.2271 276/500 [===============>..............] - ETA: 56s - loss: 1.4265 - regression_loss: 1.1993 - classification_loss: 0.2272 277/500 [===============>..............] - ETA: 55s - loss: 1.4270 - regression_loss: 1.1997 - classification_loss: 0.2273 278/500 [===============>..............] - ETA: 55s - loss: 1.4261 - regression_loss: 1.1992 - classification_loss: 0.2270 279/500 [===============>..............] - ETA: 55s - loss: 1.4269 - regression_loss: 1.2000 - classification_loss: 0.2269 280/500 [===============>..............] - ETA: 55s - loss: 1.4272 - regression_loss: 1.1998 - classification_loss: 0.2274 281/500 [===============>..............] 
- ETA: 54s - loss: 1.4248 - regression_loss: 1.1977 - classification_loss: 0.2270 282/500 [===============>..............] - ETA: 54s - loss: 1.4258 - regression_loss: 1.1986 - classification_loss: 0.2271 283/500 [===============>..............] - ETA: 54s - loss: 1.4264 - regression_loss: 1.1990 - classification_loss: 0.2273 284/500 [================>.............] - ETA: 54s - loss: 1.4274 - regression_loss: 1.1997 - classification_loss: 0.2276 285/500 [================>.............] - ETA: 53s - loss: 1.4276 - regression_loss: 1.2000 - classification_loss: 0.2276 286/500 [================>.............] - ETA: 53s - loss: 1.4283 - regression_loss: 1.2005 - classification_loss: 0.2277 287/500 [================>.............] - ETA: 53s - loss: 1.4293 - regression_loss: 1.2012 - classification_loss: 0.2282 288/500 [================>.............] - ETA: 53s - loss: 1.4298 - regression_loss: 1.2015 - classification_loss: 0.2283 289/500 [================>.............] - ETA: 52s - loss: 1.4290 - regression_loss: 1.2010 - classification_loss: 0.2280 290/500 [================>.............] - ETA: 52s - loss: 1.4284 - regression_loss: 1.2002 - classification_loss: 0.2283 291/500 [================>.............] - ETA: 52s - loss: 1.4307 - regression_loss: 1.2018 - classification_loss: 0.2290 292/500 [================>.............] - ETA: 52s - loss: 1.4316 - regression_loss: 1.2027 - classification_loss: 0.2290 293/500 [================>.............] - ETA: 51s - loss: 1.4314 - regression_loss: 1.2024 - classification_loss: 0.2290 294/500 [================>.............] - ETA: 51s - loss: 1.4292 - regression_loss: 1.2006 - classification_loss: 0.2286 295/500 [================>.............] - ETA: 51s - loss: 1.4268 - regression_loss: 1.1984 - classification_loss: 0.2284 296/500 [================>.............] - ETA: 51s - loss: 1.4258 - regression_loss: 1.1973 - classification_loss: 0.2285 297/500 [================>.............] 
- ETA: 50s - loss: 1.4262 - regression_loss: 1.1976 - classification_loss: 0.2286 298/500 [================>.............] - ETA: 50s - loss: 1.4254 - regression_loss: 1.1969 - classification_loss: 0.2284 299/500 [================>.............] - ETA: 50s - loss: 1.4277 - regression_loss: 1.1989 - classification_loss: 0.2288 300/500 [=================>............] - ETA: 50s - loss: 1.4279 - regression_loss: 1.1991 - classification_loss: 0.2288 301/500 [=================>............] - ETA: 49s - loss: 1.4296 - regression_loss: 1.2004 - classification_loss: 0.2292 302/500 [=================>............] - ETA: 49s - loss: 1.4308 - regression_loss: 1.2016 - classification_loss: 0.2292 303/500 [=================>............] - ETA: 49s - loss: 1.4303 - regression_loss: 1.2012 - classification_loss: 0.2291 304/500 [=================>............] - ETA: 49s - loss: 1.4279 - regression_loss: 1.1994 - classification_loss: 0.2286 305/500 [=================>............] - ETA: 48s - loss: 1.4275 - regression_loss: 1.1988 - classification_loss: 0.2286 306/500 [=================>............] - ETA: 48s - loss: 1.4270 - regression_loss: 1.1986 - classification_loss: 0.2284 307/500 [=================>............] - ETA: 48s - loss: 1.4273 - regression_loss: 1.1989 - classification_loss: 0.2284 308/500 [=================>............] - ETA: 48s - loss: 1.4276 - regression_loss: 1.1991 - classification_loss: 0.2284 309/500 [=================>............] - ETA: 47s - loss: 1.4259 - regression_loss: 1.1976 - classification_loss: 0.2282 310/500 [=================>............] - ETA: 47s - loss: 1.4271 - regression_loss: 1.1987 - classification_loss: 0.2284 311/500 [=================>............] - ETA: 47s - loss: 1.4270 - regression_loss: 1.1989 - classification_loss: 0.2281 312/500 [=================>............] - ETA: 47s - loss: 1.4246 - regression_loss: 1.1970 - classification_loss: 0.2276 313/500 [=================>............] 
- ETA: 46s - loss: 1.4260 - regression_loss: 1.1981 - classification_loss: 0.2279 314/500 [=================>............] - ETA: 46s - loss: 1.4257 - regression_loss: 1.1980 - classification_loss: 0.2277 315/500 [=================>............] - ETA: 46s - loss: 1.4234 - regression_loss: 1.1962 - classification_loss: 0.2272 316/500 [=================>............] - ETA: 46s - loss: 1.4238 - regression_loss: 1.1963 - classification_loss: 0.2275 317/500 [==================>...........] - ETA: 45s - loss: 1.4236 - regression_loss: 1.1963 - classification_loss: 0.2273 318/500 [==================>...........] - ETA: 45s - loss: 1.4213 - regression_loss: 1.1945 - classification_loss: 0.2269 319/500 [==================>...........] - ETA: 45s - loss: 1.4222 - regression_loss: 1.1952 - classification_loss: 0.2271 320/500 [==================>...........] - ETA: 45s - loss: 1.4235 - regression_loss: 1.1961 - classification_loss: 0.2274 321/500 [==================>...........] - ETA: 44s - loss: 1.4250 - regression_loss: 1.1973 - classification_loss: 0.2277 322/500 [==================>...........] - ETA: 44s - loss: 1.4264 - regression_loss: 1.1985 - classification_loss: 0.2279 323/500 [==================>...........] - ETA: 44s - loss: 1.4276 - regression_loss: 1.1994 - classification_loss: 0.2282 324/500 [==================>...........] - ETA: 44s - loss: 1.4295 - regression_loss: 1.2012 - classification_loss: 0.2283 325/500 [==================>...........] - ETA: 43s - loss: 1.4294 - regression_loss: 1.2012 - classification_loss: 0.2282 326/500 [==================>...........] - ETA: 43s - loss: 1.4306 - regression_loss: 1.2023 - classification_loss: 0.2283 327/500 [==================>...........] - ETA: 43s - loss: 1.4309 - regression_loss: 1.2027 - classification_loss: 0.2282 328/500 [==================>...........] - ETA: 43s - loss: 1.4320 - regression_loss: 1.2036 - classification_loss: 0.2284 329/500 [==================>...........] 
- ETA: 42s - loss: 1.4330 - regression_loss: 1.2045 - classification_loss: 0.2285 330/500 [==================>...........] - ETA: 42s - loss: 1.4339 - regression_loss: 1.2053 - classification_loss: 0.2286 331/500 [==================>...........] - ETA: 42s - loss: 1.4337 - regression_loss: 1.2052 - classification_loss: 0.2285 332/500 [==================>...........] - ETA: 42s - loss: 1.4340 - regression_loss: 1.2054 - classification_loss: 0.2285 333/500 [==================>...........] - ETA: 41s - loss: 1.4322 - regression_loss: 1.2041 - classification_loss: 0.2281 334/500 [===================>..........] - ETA: 41s - loss: 1.4321 - regression_loss: 1.2041 - classification_loss: 0.2280 335/500 [===================>..........] - ETA: 41s - loss: 1.4297 - regression_loss: 1.2022 - classification_loss: 0.2275 336/500 [===================>..........] - ETA: 41s - loss: 1.4298 - regression_loss: 1.2024 - classification_loss: 0.2274 337/500 [===================>..........] - ETA: 40s - loss: 1.4308 - regression_loss: 1.2034 - classification_loss: 0.2274 338/500 [===================>..........] - ETA: 40s - loss: 1.4282 - regression_loss: 1.2013 - classification_loss: 0.2268 339/500 [===================>..........] - ETA: 40s - loss: 1.4292 - regression_loss: 1.2023 - classification_loss: 0.2270 340/500 [===================>..........] - ETA: 40s - loss: 1.4314 - regression_loss: 1.2042 - classification_loss: 0.2272 341/500 [===================>..........] - ETA: 39s - loss: 1.4321 - regression_loss: 1.2047 - classification_loss: 0.2274 342/500 [===================>..........] - ETA: 39s - loss: 1.4310 - regression_loss: 1.2039 - classification_loss: 0.2271 343/500 [===================>..........] - ETA: 39s - loss: 1.4298 - regression_loss: 1.2030 - classification_loss: 0.2269 344/500 [===================>..........] - ETA: 39s - loss: 1.4299 - regression_loss: 1.2030 - classification_loss: 0.2269 345/500 [===================>..........] 
- ETA: 38s - loss: 1.4291 - regression_loss: 1.2025 - classification_loss: 0.2265 346/500 [===================>..........] - ETA: 38s - loss: 1.4312 - regression_loss: 1.2038 - classification_loss: 0.2273 347/500 [===================>..........] - ETA: 38s - loss: 1.4290 - regression_loss: 1.2017 - classification_loss: 0.2273 348/500 [===================>..........] - ETA: 38s - loss: 1.4293 - regression_loss: 1.2021 - classification_loss: 0.2273 349/500 [===================>..........] - ETA: 37s - loss: 1.4299 - regression_loss: 1.2025 - classification_loss: 0.2273 350/500 [====================>.........] - ETA: 37s - loss: 1.4291 - regression_loss: 1.2020 - classification_loss: 0.2271 351/500 [====================>.........] - ETA: 37s - loss: 1.4292 - regression_loss: 1.2021 - classification_loss: 0.2270 352/500 [====================>.........] - ETA: 37s - loss: 1.4293 - regression_loss: 1.2023 - classification_loss: 0.2270 353/500 [====================>.........] - ETA: 36s - loss: 1.4274 - regression_loss: 1.2006 - classification_loss: 0.2268 354/500 [====================>.........] - ETA: 36s - loss: 1.4274 - regression_loss: 1.2008 - classification_loss: 0.2266 355/500 [====================>.........] - ETA: 36s - loss: 1.4286 - regression_loss: 1.2018 - classification_loss: 0.2268 356/500 [====================>.........] - ETA: 36s - loss: 1.4268 - regression_loss: 1.2004 - classification_loss: 0.2263 357/500 [====================>.........] - ETA: 35s - loss: 1.4270 - regression_loss: 1.2010 - classification_loss: 0.2260 358/500 [====================>.........] - ETA: 35s - loss: 1.4272 - regression_loss: 1.2010 - classification_loss: 0.2261 359/500 [====================>.........] - ETA: 35s - loss: 1.4272 - regression_loss: 1.2011 - classification_loss: 0.2261 360/500 [====================>.........] - ETA: 35s - loss: 1.4306 - regression_loss: 1.2035 - classification_loss: 0.2271 361/500 [====================>.........] 
- ETA: 34s - loss: 1.4294 - regression_loss: 1.2027 - classification_loss: 0.2267 362/500 [====================>.........] - ETA: 34s - loss: 1.4309 - regression_loss: 1.2045 - classification_loss: 0.2264 363/500 [====================>.........] - ETA: 34s - loss: 1.4291 - regression_loss: 1.2030 - classification_loss: 0.2261 364/500 [====================>.........] - ETA: 34s - loss: 1.4270 - regression_loss: 1.2012 - classification_loss: 0.2257 365/500 [====================>.........] - ETA: 33s - loss: 1.4250 - regression_loss: 1.1996 - classification_loss: 0.2254 366/500 [====================>.........] - ETA: 33s - loss: 1.4238 - regression_loss: 1.1987 - classification_loss: 0.2251 367/500 [=====================>........] - ETA: 33s - loss: 1.4242 - regression_loss: 1.1990 - classification_loss: 0.2252 368/500 [=====================>........] - ETA: 33s - loss: 1.4240 - regression_loss: 1.1992 - classification_loss: 0.2248 369/500 [=====================>........] - ETA: 32s - loss: 1.4219 - regression_loss: 1.1976 - classification_loss: 0.2244 370/500 [=====================>........] - ETA: 32s - loss: 1.4226 - regression_loss: 1.1984 - classification_loss: 0.2242 371/500 [=====================>........] - ETA: 32s - loss: 1.4251 - regression_loss: 1.2003 - classification_loss: 0.2248 372/500 [=====================>........] - ETA: 32s - loss: 1.4250 - regression_loss: 1.2002 - classification_loss: 0.2248 373/500 [=====================>........] - ETA: 31s - loss: 1.4260 - regression_loss: 1.2008 - classification_loss: 0.2252 374/500 [=====================>........] - ETA: 31s - loss: 1.4233 - regression_loss: 1.1986 - classification_loss: 0.2248 375/500 [=====================>........] - ETA: 31s - loss: 1.4217 - regression_loss: 1.1973 - classification_loss: 0.2244 376/500 [=====================>........] - ETA: 31s - loss: 1.4221 - regression_loss: 1.1975 - classification_loss: 0.2246 377/500 [=====================>........] 
- ETA: 30s - loss: 1.4223 - regression_loss: 1.1976 - classification_loss: 0.2247 378/500 [=====================>........] - ETA: 30s - loss: 1.4224 - regression_loss: 1.1976 - classification_loss: 0.2248 379/500 [=====================>........] - ETA: 30s - loss: 1.4227 - regression_loss: 1.1979 - classification_loss: 0.2248 380/500 [=====================>........] - ETA: 30s - loss: 1.4235 - regression_loss: 1.1984 - classification_loss: 0.2251 381/500 [=====================>........] - ETA: 29s - loss: 1.4237 - regression_loss: 1.1987 - classification_loss: 0.2250 382/500 [=====================>........] - ETA: 29s - loss: 1.4233 - regression_loss: 1.1983 - classification_loss: 0.2250 383/500 [=====================>........] - ETA: 29s - loss: 1.4244 - regression_loss: 1.1990 - classification_loss: 0.2255 384/500 [======================>.......] - ETA: 29s - loss: 1.4248 - regression_loss: 1.1991 - classification_loss: 0.2257 385/500 [======================>.......] - ETA: 28s - loss: 1.4248 - regression_loss: 1.1992 - classification_loss: 0.2256 386/500 [======================>.......] - ETA: 28s - loss: 1.4247 - regression_loss: 1.1988 - classification_loss: 0.2259 387/500 [======================>.......] - ETA: 28s - loss: 1.4239 - regression_loss: 1.1982 - classification_loss: 0.2257 388/500 [======================>.......] - ETA: 28s - loss: 1.4244 - regression_loss: 1.1986 - classification_loss: 0.2258 389/500 [======================>.......] - ETA: 27s - loss: 1.4223 - regression_loss: 1.1968 - classification_loss: 0.2254 390/500 [======================>.......] - ETA: 27s - loss: 1.4224 - regression_loss: 1.1969 - classification_loss: 0.2255 391/500 [======================>.......] - ETA: 27s - loss: 1.4226 - regression_loss: 1.1970 - classification_loss: 0.2255 392/500 [======================>.......] - ETA: 27s - loss: 1.4240 - regression_loss: 1.1980 - classification_loss: 0.2260 393/500 [======================>.......] 
- ETA: 26s - loss: 1.4244 - regression_loss: 1.1982 - classification_loss: 0.2262 394/500 [======================>.......] - ETA: 26s - loss: 1.4240 - regression_loss: 1.1979 - classification_loss: 0.2261 395/500 [======================>.......] - ETA: 26s - loss: 1.4247 - regression_loss: 1.1983 - classification_loss: 0.2264 396/500 [======================>.......] - ETA: 26s - loss: 1.4245 - regression_loss: 1.1983 - classification_loss: 0.2263 397/500 [======================>.......] - ETA: 25s - loss: 1.4252 - regression_loss: 1.1988 - classification_loss: 0.2265 398/500 [======================>.......] - ETA: 25s - loss: 1.4267 - regression_loss: 1.1999 - classification_loss: 0.2268 399/500 [======================>.......] - ETA: 25s - loss: 1.4260 - regression_loss: 1.1992 - classification_loss: 0.2268 400/500 [=======================>......] - ETA: 25s - loss: 1.4264 - regression_loss: 1.1995 - classification_loss: 0.2269 401/500 [=======================>......] - ETA: 24s - loss: 1.4257 - regression_loss: 1.1990 - classification_loss: 0.2267 402/500 [=======================>......] - ETA: 24s - loss: 1.4267 - regression_loss: 1.2000 - classification_loss: 0.2267 403/500 [=======================>......] - ETA: 24s - loss: 1.4283 - regression_loss: 1.2014 - classification_loss: 0.2269 404/500 [=======================>......] - ETA: 24s - loss: 1.4276 - regression_loss: 1.2007 - classification_loss: 0.2269 405/500 [=======================>......] - ETA: 23s - loss: 1.4286 - regression_loss: 1.2015 - classification_loss: 0.2271 406/500 [=======================>......] - ETA: 23s - loss: 1.4297 - regression_loss: 1.2024 - classification_loss: 0.2273 407/500 [=======================>......] - ETA: 23s - loss: 1.4296 - regression_loss: 1.2020 - classification_loss: 0.2275 408/500 [=======================>......] - ETA: 23s - loss: 1.4299 - regression_loss: 1.2023 - classification_loss: 0.2276 409/500 [=======================>......] 
- ETA: 22s - loss: 1.4301 - regression_loss: 1.2025 - classification_loss: 0.2276 410/500 [=======================>......] - ETA: 22s - loss: 1.4304 - regression_loss: 1.2028 - classification_loss: 0.2276 411/500 [=======================>......] - ETA: 22s - loss: 1.4288 - regression_loss: 1.2014 - classification_loss: 0.2274 412/500 [=======================>......] - ETA: 22s - loss: 1.4296 - regression_loss: 1.2014 - classification_loss: 0.2282 413/500 [=======================>......] - ETA: 21s - loss: 1.4302 - regression_loss: 1.2020 - classification_loss: 0.2282 414/500 [=======================>......] - ETA: 21s - loss: 1.4326 - regression_loss: 1.2039 - classification_loss: 0.2287 415/500 [=======================>......] - ETA: 21s - loss: 1.4327 - regression_loss: 1.2041 - classification_loss: 0.2286 416/500 [=======================>......] - ETA: 21s - loss: 1.4325 - regression_loss: 1.2039 - classification_loss: 0.2286 417/500 [========================>.....] - ETA: 20s - loss: 1.4337 - regression_loss: 1.2048 - classification_loss: 0.2288 418/500 [========================>.....] - ETA: 20s - loss: 1.4340 - regression_loss: 1.2051 - classification_loss: 0.2289 419/500 [========================>.....] - ETA: 20s - loss: 1.4323 - regression_loss: 1.2038 - classification_loss: 0.2285 420/500 [========================>.....] - ETA: 20s - loss: 1.4322 - regression_loss: 1.2038 - classification_loss: 0.2284 421/500 [========================>.....] - ETA: 19s - loss: 1.4322 - regression_loss: 1.2039 - classification_loss: 0.2283 422/500 [========================>.....] - ETA: 19s - loss: 1.4315 - regression_loss: 1.2032 - classification_loss: 0.2282 423/500 [========================>.....] - ETA: 19s - loss: 1.4306 - regression_loss: 1.2026 - classification_loss: 0.2281 424/500 [========================>.....] - ETA: 19s - loss: 1.4283 - regression_loss: 1.2007 - classification_loss: 0.2276 425/500 [========================>.....] 
- ETA: 18s - loss: 1.4289 - regression_loss: 1.2013 - classification_loss: 0.2276 426/500 [========================>.....] - ETA: 18s - loss: 1.4294 - regression_loss: 1.2017 - classification_loss: 0.2277 427/500 [========================>.....] - ETA: 18s - loss: 1.4293 - regression_loss: 1.2015 - classification_loss: 0.2278 428/500 [========================>.....] - ETA: 18s - loss: 1.4284 - regression_loss: 1.2007 - classification_loss: 0.2277 429/500 [========================>.....] - ETA: 17s - loss: 1.4266 - regression_loss: 1.1992 - classification_loss: 0.2274 430/500 [========================>.....] - ETA: 17s - loss: 1.4269 - regression_loss: 1.1995 - classification_loss: 0.2274 431/500 [========================>.....] - ETA: 17s - loss: 1.4265 - regression_loss: 1.1992 - classification_loss: 0.2273 432/500 [========================>.....] - ETA: 17s - loss: 1.4262 - regression_loss: 1.1989 - classification_loss: 0.2274 433/500 [========================>.....] - ETA: 16s - loss: 1.4257 - regression_loss: 1.1984 - classification_loss: 0.2273 434/500 [=========================>....] - ETA: 16s - loss: 1.4256 - regression_loss: 1.1983 - classification_loss: 0.2273 435/500 [=========================>....] - ETA: 16s - loss: 1.4257 - regression_loss: 1.1984 - classification_loss: 0.2273 436/500 [=========================>....] - ETA: 16s - loss: 1.4251 - regression_loss: 1.1981 - classification_loss: 0.2270 437/500 [=========================>....] - ETA: 15s - loss: 1.4252 - regression_loss: 1.1982 - classification_loss: 0.2270 438/500 [=========================>....] - ETA: 15s - loss: 1.4255 - regression_loss: 1.1985 - classification_loss: 0.2270 439/500 [=========================>....] - ETA: 15s - loss: 1.4249 - regression_loss: 1.1979 - classification_loss: 0.2271 440/500 [=========================>....] - ETA: 15s - loss: 1.4250 - regression_loss: 1.1980 - classification_loss: 0.2270 441/500 [=========================>....] 
- ETA: 14s - loss: 1.4251 - regression_loss: 1.1981 - classification_loss: 0.2270 442/500 [=========================>....] - ETA: 14s - loss: 1.4260 - regression_loss: 1.1989 - classification_loss: 0.2271 443/500 [=========================>....] - ETA: 14s - loss: 1.4275 - regression_loss: 1.2001 - classification_loss: 0.2273 444/500 [=========================>....] - ETA: 14s - loss: 1.4284 - regression_loss: 1.2008 - classification_loss: 0.2275 445/500 [=========================>....] - ETA: 13s - loss: 1.4291 - regression_loss: 1.2015 - classification_loss: 0.2276 446/500 [=========================>....] - ETA: 13s - loss: 1.4297 - regression_loss: 1.2020 - classification_loss: 0.2277 447/500 [=========================>....] - ETA: 13s - loss: 1.4294 - regression_loss: 1.2019 - classification_loss: 0.2275 448/500 [=========================>....] - ETA: 13s - loss: 1.4295 - regression_loss: 1.2021 - classification_loss: 0.2274 449/500 [=========================>....] - ETA: 12s - loss: 1.4288 - regression_loss: 1.2015 - classification_loss: 0.2273 450/500 [==========================>...] - ETA: 12s - loss: 1.4270 - regression_loss: 1.2001 - classification_loss: 0.2269 451/500 [==========================>...] - ETA: 12s - loss: 1.4282 - regression_loss: 1.2011 - classification_loss: 0.2271 452/500 [==========================>...] - ETA: 12s - loss: 1.4297 - regression_loss: 1.2023 - classification_loss: 0.2274 453/500 [==========================>...] - ETA: 11s - loss: 1.4295 - regression_loss: 1.2021 - classification_loss: 0.2274 454/500 [==========================>...] - ETA: 11s - loss: 1.4287 - regression_loss: 1.2013 - classification_loss: 0.2274 455/500 [==========================>...] - ETA: 11s - loss: 1.4289 - regression_loss: 1.2015 - classification_loss: 0.2274 456/500 [==========================>...] - ETA: 11s - loss: 1.4300 - regression_loss: 1.2027 - classification_loss: 0.2273 457/500 [==========================>...] 
- ETA: 10s - loss: 1.4289 - regression_loss: 1.2017 - classification_loss: 0.2272 458/500 [==========================>...] - ETA: 10s - loss: 1.4292 - regression_loss: 1.2021 - classification_loss: 0.2271 459/500 [==========================>...] - ETA: 10s - loss: 1.4288 - regression_loss: 1.2017 - classification_loss: 0.2271 460/500 [==========================>...] - ETA: 10s - loss: 1.4282 - regression_loss: 1.2012 - classification_loss: 0.2269 461/500 [==========================>...] - ETA: 9s - loss: 1.4282 - regression_loss: 1.2014 - classification_loss: 0.2268  462/500 [==========================>...] - ETA: 9s - loss: 1.4291 - regression_loss: 1.2023 - classification_loss: 0.2269 463/500 [==========================>...] - ETA: 9s - loss: 1.4281 - regression_loss: 1.2014 - classification_loss: 0.2267 464/500 [==========================>...] - ETA: 9s - loss: 1.4263 - regression_loss: 1.1999 - classification_loss: 0.2263 465/500 [==========================>...] - ETA: 8s - loss: 1.4273 - regression_loss: 1.2008 - classification_loss: 0.2265 466/500 [==========================>...] - ETA: 8s - loss: 1.4264 - regression_loss: 1.2001 - classification_loss: 0.2263 467/500 [===========================>..] - ETA: 8s - loss: 1.4265 - regression_loss: 1.2002 - classification_loss: 0.2262 468/500 [===========================>..] - ETA: 8s - loss: 1.4263 - regression_loss: 1.2002 - classification_loss: 0.2261 469/500 [===========================>..] - ETA: 7s - loss: 1.4254 - regression_loss: 1.1994 - classification_loss: 0.2260 470/500 [===========================>..] - ETA: 7s - loss: 1.4245 - regression_loss: 1.1986 - classification_loss: 0.2259 471/500 [===========================>..] - ETA: 7s - loss: 1.4247 - regression_loss: 1.1989 - classification_loss: 0.2258 472/500 [===========================>..] - ETA: 7s - loss: 1.4246 - regression_loss: 1.1988 - classification_loss: 0.2258 473/500 [===========================>..] 
- ETA: 6s - loss: 1.4258 - regression_loss: 1.1998 - classification_loss: 0.2260 474/500 [===========================>..] - ETA: 6s - loss: 1.4256 - regression_loss: 1.1996 - classification_loss: 0.2261 475/500 [===========================>..] - ETA: 6s - loss: 1.4257 - regression_loss: 1.1995 - classification_loss: 0.2261 476/500 [===========================>..] - ETA: 6s - loss: 1.4265 - regression_loss: 1.2002 - classification_loss: 0.2263 477/500 [===========================>..] - ETA: 5s - loss: 1.4269 - regression_loss: 1.2006 - classification_loss: 0.2263 478/500 [===========================>..] - ETA: 5s - loss: 1.4262 - regression_loss: 1.1998 - classification_loss: 0.2264 479/500 [===========================>..] - ETA: 5s - loss: 1.4251 - regression_loss: 1.1989 - classification_loss: 0.2262 480/500 [===========================>..] - ETA: 5s - loss: 1.4263 - regression_loss: 1.1998 - classification_loss: 0.2264 481/500 [===========================>..] - ETA: 4s - loss: 1.4260 - regression_loss: 1.1996 - classification_loss: 0.2264 482/500 [===========================>..] - ETA: 4s - loss: 1.4261 - regression_loss: 1.1996 - classification_loss: 0.2265 483/500 [===========================>..] - ETA: 4s - loss: 1.4261 - regression_loss: 1.1995 - classification_loss: 0.2266 484/500 [============================>.] - ETA: 4s - loss: 1.4276 - regression_loss: 1.2006 - classification_loss: 0.2269 485/500 [============================>.] - ETA: 3s - loss: 1.4281 - regression_loss: 1.2011 - classification_loss: 0.2270 486/500 [============================>.] - ETA: 3s - loss: 1.4283 - regression_loss: 1.2013 - classification_loss: 0.2270 487/500 [============================>.] - ETA: 3s - loss: 1.4292 - regression_loss: 1.2021 - classification_loss: 0.2271 488/500 [============================>.] - ETA: 3s - loss: 1.4293 - regression_loss: 1.2022 - classification_loss: 0.2270 489/500 [============================>.] 
- ETA: 2s - loss: 1.4293 - regression_loss: 1.2023 - classification_loss: 0.2271 490/500 [============================>.] - ETA: 2s - loss: 1.4298 - regression_loss: 1.2026 - classification_loss: 0.2272 491/500 [============================>.] - ETA: 2s - loss: 1.4302 - regression_loss: 1.2031 - classification_loss: 0.2271 492/500 [============================>.] - ETA: 2s - loss: 1.4315 - regression_loss: 1.2042 - classification_loss: 0.2273 493/500 [============================>.] - ETA: 1s - loss: 1.4300 - regression_loss: 1.2029 - classification_loss: 0.2271 494/500 [============================>.] - ETA: 1s - loss: 1.4289 - regression_loss: 1.2021 - classification_loss: 0.2269 495/500 [============================>.] - ETA: 1s - loss: 1.4296 - regression_loss: 1.2026 - classification_loss: 0.2270 496/500 [============================>.] - ETA: 1s - loss: 1.4302 - regression_loss: 1.2029 - classification_loss: 0.2273 497/500 [============================>.] - ETA: 0s - loss: 1.4312 - regression_loss: 1.2037 - classification_loss: 0.2275 498/500 [============================>.] - ETA: 0s - loss: 1.4304 - regression_loss: 1.2030 - classification_loss: 0.2274 499/500 [============================>.] - ETA: 0s - loss: 1.4299 - regression_loss: 1.2025 - classification_loss: 0.2274 500/500 [==============================] - 125s 251ms/step - loss: 1.4303 - regression_loss: 1.2028 - classification_loss: 0.2274 1172 instances of class plum with average precision: 0.6725 mAP: 0.6725 Epoch 00110: saving model to ./training/snapshots/resnet50_pascal_110.h5 Epoch 111/150 1/500 [..............................] - ETA: 1:58 - loss: 1.7810 - regression_loss: 1.5439 - classification_loss: 0.2371 2/500 [..............................] - ETA: 2:00 - loss: 1.4559 - regression_loss: 1.2866 - classification_loss: 0.1693 3/500 [..............................] - ETA: 2:00 - loss: 1.7075 - regression_loss: 1.4433 - classification_loss: 0.2642 4/500 [..............................] 
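Note that in these progress lines the `loss` column is simply the sum of the `regression_loss` and `classification_loss` columns (keras-retinanet trains both subnets jointly and reports their combined objective). A minimal sanity check on the epoch 110 end-of-epoch values, allowing for the 4-decimal rounding of the progress bar:

```python
import math

# Values copied from the epoch 110 summary line above.
loss = 1.4303
regression_loss = 1.2028
classification_loss = 0.2274

# The components sum to the total, up to the display rounding (1e-3 tolerance).
assert math.isclose(regression_loss + classification_loss, loss, abs_tol=1e-3)
```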
- ETA: 2:01 - loss: 1.6408 - regression_loss: 1.3728 - classification_loss: 0.2680 5/500 [..............................] - ETA: 2:01 - loss: 1.5797 - regression_loss: 1.3149 - classification_loss: 0.2648 6/500 [..............................] - ETA: 2:02 - loss: 1.4484 - regression_loss: 1.2053 - classification_loss: 0.2432 7/500 [..............................] - ETA: 2:02 - loss: 1.4420 - regression_loss: 1.1993 - classification_loss: 0.2426 8/500 [..............................] - ETA: 2:02 - loss: 1.3909 - regression_loss: 1.1566 - classification_loss: 0.2343 9/500 [..............................] - ETA: 2:02 - loss: 1.4214 - regression_loss: 1.1735 - classification_loss: 0.2479 10/500 [..............................] - ETA: 2:02 - loss: 1.4550 - regression_loss: 1.1982 - classification_loss: 0.2568 11/500 [..............................] - ETA: 2:01 - loss: 1.3888 - regression_loss: 1.1423 - classification_loss: 0.2464 12/500 [..............................] - ETA: 2:01 - loss: 1.3932 - regression_loss: 1.1481 - classification_loss: 0.2450 13/500 [..............................] - ETA: 2:01 - loss: 1.3752 - regression_loss: 1.1326 - classification_loss: 0.2426 14/500 [..............................] - ETA: 2:00 - loss: 1.4480 - regression_loss: 1.1875 - classification_loss: 0.2605 15/500 [..............................] - ETA: 2:00 - loss: 1.3964 - regression_loss: 1.1481 - classification_loss: 0.2483 16/500 [..............................] - ETA: 2:00 - loss: 1.4005 - regression_loss: 1.1553 - classification_loss: 0.2453 17/500 [>.............................] - ETA: 2:00 - loss: 1.4205 - regression_loss: 1.1735 - classification_loss: 0.2470 18/500 [>.............................] - ETA: 1:59 - loss: 1.4327 - regression_loss: 1.1852 - classification_loss: 0.2475 19/500 [>.............................] - ETA: 1:59 - loss: 1.4344 - regression_loss: 1.1906 - classification_loss: 0.2439 20/500 [>.............................] 
- ETA: 1:59 - loss: 1.4303 - regression_loss: 1.1905 - classification_loss: 0.2398 21/500 [>.............................] - ETA: 1:59 - loss: 1.4445 - regression_loss: 1.2035 - classification_loss: 0.2411 22/500 [>.............................] - ETA: 1:58 - loss: 1.4013 - regression_loss: 1.1688 - classification_loss: 0.2326 23/500 [>.............................] - ETA: 1:58 - loss: 1.4237 - regression_loss: 1.1847 - classification_loss: 0.2390 24/500 [>.............................] - ETA: 1:58 - loss: 1.4252 - regression_loss: 1.1862 - classification_loss: 0.2389 25/500 [>.............................] - ETA: 1:58 - loss: 1.4336 - regression_loss: 1.1922 - classification_loss: 0.2414 26/500 [>.............................] - ETA: 1:57 - loss: 1.4214 - regression_loss: 1.1859 - classification_loss: 0.2356 27/500 [>.............................] - ETA: 1:57 - loss: 1.4270 - regression_loss: 1.1917 - classification_loss: 0.2353 28/500 [>.............................] - ETA: 1:57 - loss: 1.4199 - regression_loss: 1.1873 - classification_loss: 0.2326 29/500 [>.............................] - ETA: 1:57 - loss: 1.4086 - regression_loss: 1.1768 - classification_loss: 0.2318 30/500 [>.............................] - ETA: 1:57 - loss: 1.4198 - regression_loss: 1.1852 - classification_loss: 0.2346 31/500 [>.............................] - ETA: 1:57 - loss: 1.4272 - regression_loss: 1.1927 - classification_loss: 0.2345 32/500 [>.............................] - ETA: 1:57 - loss: 1.4329 - regression_loss: 1.1984 - classification_loss: 0.2346 33/500 [>.............................] - ETA: 1:56 - loss: 1.4279 - regression_loss: 1.1955 - classification_loss: 0.2324 34/500 [=>............................] - ETA: 1:56 - loss: 1.4383 - regression_loss: 1.2050 - classification_loss: 0.2332 35/500 [=>............................] - ETA: 1:56 - loss: 1.4563 - regression_loss: 1.2188 - classification_loss: 0.2375 36/500 [=>............................] 
- ETA: 1:56 - loss: 1.4516 - regression_loss: 1.2150 - classification_loss: 0.2366 37/500 [=>............................] - ETA: 1:55 - loss: 1.4444 - regression_loss: 1.2108 - classification_loss: 0.2336 38/500 [=>............................] - ETA: 1:55 - loss: 1.4498 - regression_loss: 1.2142 - classification_loss: 0.2356 39/500 [=>............................] - ETA: 1:55 - loss: 1.4585 - regression_loss: 1.2217 - classification_loss: 0.2368 40/500 [=>............................] - ETA: 1:55 - loss: 1.4461 - regression_loss: 1.2124 - classification_loss: 0.2337 41/500 [=>............................] - ETA: 1:55 - loss: 1.4510 - regression_loss: 1.2174 - classification_loss: 0.2336 42/500 [=>............................] - ETA: 1:54 - loss: 1.4367 - regression_loss: 1.2045 - classification_loss: 0.2323 43/500 [=>............................] - ETA: 1:54 - loss: 1.4383 - regression_loss: 1.2053 - classification_loss: 0.2330 44/500 [=>............................] - ETA: 1:54 - loss: 1.4237 - regression_loss: 1.1943 - classification_loss: 0.2294 45/500 [=>............................] - ETA: 1:54 - loss: 1.4225 - regression_loss: 1.1934 - classification_loss: 0.2291 46/500 [=>............................] - ETA: 1:53 - loss: 1.4193 - regression_loss: 1.1914 - classification_loss: 0.2278 47/500 [=>............................] - ETA: 1:53 - loss: 1.4345 - regression_loss: 1.2036 - classification_loss: 0.2310 48/500 [=>............................] - ETA: 1:53 - loss: 1.4422 - regression_loss: 1.2103 - classification_loss: 0.2319 49/500 [=>............................] - ETA: 1:53 - loss: 1.4359 - regression_loss: 1.2064 - classification_loss: 0.2295 50/500 [==>...........................] - ETA: 1:52 - loss: 1.4299 - regression_loss: 1.1997 - classification_loss: 0.2302 51/500 [==>...........................] - ETA: 1:52 - loss: 1.4328 - regression_loss: 1.2003 - classification_loss: 0.2325 52/500 [==>...........................] 
- ETA: 1:52 - loss: 1.4334 - regression_loss: 1.2014 - classification_loss: 0.2320 53/500 [==>...........................] - ETA: 1:52 - loss: 1.4416 - regression_loss: 1.2084 - classification_loss: 0.2332 54/500 [==>...........................] - ETA: 1:51 - loss: 1.4477 - regression_loss: 1.2133 - classification_loss: 0.2345 55/500 [==>...........................] - ETA: 1:51 - loss: 1.4575 - regression_loss: 1.2216 - classification_loss: 0.2359 56/500 [==>...........................] - ETA: 1:51 - loss: 1.4649 - regression_loss: 1.2284 - classification_loss: 0.2365 57/500 [==>...........................] - ETA: 1:51 - loss: 1.4694 - regression_loss: 1.2298 - classification_loss: 0.2396 58/500 [==>...........................] - ETA: 1:50 - loss: 1.4699 - regression_loss: 1.2302 - classification_loss: 0.2397 59/500 [==>...........................] - ETA: 1:50 - loss: 1.4719 - regression_loss: 1.2318 - classification_loss: 0.2401 60/500 [==>...........................] - ETA: 1:50 - loss: 1.4744 - regression_loss: 1.2340 - classification_loss: 0.2404 61/500 [==>...........................] - ETA: 1:50 - loss: 1.4779 - regression_loss: 1.2370 - classification_loss: 0.2409 62/500 [==>...........................] - ETA: 1:49 - loss: 1.4641 - regression_loss: 1.2261 - classification_loss: 0.2380 63/500 [==>...........................] - ETA: 1:49 - loss: 1.4613 - regression_loss: 1.2238 - classification_loss: 0.2375 64/500 [==>...........................] - ETA: 1:49 - loss: 1.4656 - regression_loss: 1.2280 - classification_loss: 0.2375 65/500 [==>...........................] - ETA: 1:49 - loss: 1.4663 - regression_loss: 1.2285 - classification_loss: 0.2378 66/500 [==>...........................] - ETA: 1:48 - loss: 1.4604 - regression_loss: 1.2232 - classification_loss: 0.2373 67/500 [===>..........................] - ETA: 1:48 - loss: 1.4593 - regression_loss: 1.2225 - classification_loss: 0.2368 68/500 [===>..........................] 
- ETA: 1:48 - loss: 1.4612 - regression_loss: 1.2249 - classification_loss: 0.2363 69/500 [===>..........................] - ETA: 1:48 - loss: 1.4588 - regression_loss: 1.2232 - classification_loss: 0.2356 70/500 [===>..........................] - ETA: 1:47 - loss: 1.4636 - regression_loss: 1.2257 - classification_loss: 0.2379 71/500 [===>..........................] - ETA: 1:47 - loss: 1.4682 - regression_loss: 1.2281 - classification_loss: 0.2401 72/500 [===>..........................] - ETA: 1:47 - loss: 1.4732 - regression_loss: 1.2307 - classification_loss: 0.2425 73/500 [===>..........................] - ETA: 1:47 - loss: 1.4742 - regression_loss: 1.2317 - classification_loss: 0.2425 74/500 [===>..........................] - ETA: 1:46 - loss: 1.4797 - regression_loss: 1.2356 - classification_loss: 0.2441 75/500 [===>..........................] - ETA: 1:46 - loss: 1.4732 - regression_loss: 1.2303 - classification_loss: 0.2428 76/500 [===>..........................] - ETA: 1:46 - loss: 1.4679 - regression_loss: 1.2270 - classification_loss: 0.2409 77/500 [===>..........................] - ETA: 1:45 - loss: 1.4555 - regression_loss: 1.2167 - classification_loss: 0.2388 78/500 [===>..........................] - ETA: 1:45 - loss: 1.4584 - regression_loss: 1.2190 - classification_loss: 0.2394 79/500 [===>..........................] - ETA: 1:45 - loss: 1.4637 - regression_loss: 1.2236 - classification_loss: 0.2401 80/500 [===>..........................] - ETA: 1:45 - loss: 1.4525 - regression_loss: 1.2146 - classification_loss: 0.2379 81/500 [===>..........................] - ETA: 1:45 - loss: 1.4407 - regression_loss: 1.2049 - classification_loss: 0.2358 82/500 [===>..........................] - ETA: 1:45 - loss: 1.4281 - regression_loss: 1.1947 - classification_loss: 0.2334 83/500 [===>..........................] - ETA: 1:44 - loss: 1.4237 - regression_loss: 1.1913 - classification_loss: 0.2324 84/500 [====>.........................] 
- ETA: 1:44 - loss: 1.4204 - regression_loss: 1.1894 - classification_loss: 0.2310 85/500 [====>.........................] - ETA: 1:44 - loss: 1.4214 - regression_loss: 1.1905 - classification_loss: 0.2309 86/500 [====>.........................] - ETA: 1:43 - loss: 1.4263 - regression_loss: 1.1947 - classification_loss: 0.2316 87/500 [====>.........................] - ETA: 1:43 - loss: 1.4158 - regression_loss: 1.1861 - classification_loss: 0.2297 88/500 [====>.........................] - ETA: 1:43 - loss: 1.4168 - regression_loss: 1.1868 - classification_loss: 0.2300 89/500 [====>.........................] - ETA: 1:42 - loss: 1.4125 - regression_loss: 1.1835 - classification_loss: 0.2290 90/500 [====>.........................] - ETA: 1:42 - loss: 1.4117 - regression_loss: 1.1836 - classification_loss: 0.2281 91/500 [====>.........................] - ETA: 1:41 - loss: 1.4174 - regression_loss: 1.1897 - classification_loss: 0.2277 92/500 [====>.........................] - ETA: 1:41 - loss: 1.4061 - regression_loss: 1.1805 - classification_loss: 0.2256 93/500 [====>.........................] - ETA: 1:41 - loss: 1.4108 - regression_loss: 1.1843 - classification_loss: 0.2265 94/500 [====>.........................] - ETA: 1:41 - loss: 1.4062 - regression_loss: 1.1815 - classification_loss: 0.2247 95/500 [====>.........................] - ETA: 1:40 - loss: 1.4118 - regression_loss: 1.1866 - classification_loss: 0.2253 96/500 [====>.........................] - ETA: 1:40 - loss: 1.4125 - regression_loss: 1.1874 - classification_loss: 0.2250 97/500 [====>.........................] - ETA: 1:40 - loss: 1.4169 - regression_loss: 1.1902 - classification_loss: 0.2267 98/500 [====>.........................] - ETA: 1:40 - loss: 1.4172 - regression_loss: 1.1909 - classification_loss: 0.2264 99/500 [====>.........................] - ETA: 1:39 - loss: 1.4156 - regression_loss: 1.1898 - classification_loss: 0.2259 100/500 [=====>........................] 
- ETA: 1:39 - loss: 1.4162 - regression_loss: 1.1905 - classification_loss: 0.2257 101/500 [=====>........................] - ETA: 1:39 - loss: 1.4179 - regression_loss: 1.1920 - classification_loss: 0.2259 102/500 [=====>........................] - ETA: 1:39 - loss: 1.4177 - regression_loss: 1.1927 - classification_loss: 0.2250 103/500 [=====>........................] - ETA: 1:38 - loss: 1.4220 - regression_loss: 1.1956 - classification_loss: 0.2264 104/500 [=====>........................] - ETA: 1:38 - loss: 1.4229 - regression_loss: 1.1965 - classification_loss: 0.2264 105/500 [=====>........................] - ETA: 1:38 - loss: 1.4249 - regression_loss: 1.1983 - classification_loss: 0.2266 106/500 [=====>........................] - ETA: 1:38 - loss: 1.4188 - regression_loss: 1.1933 - classification_loss: 0.2254 107/500 [=====>........................] - ETA: 1:37 - loss: 1.4181 - regression_loss: 1.1919 - classification_loss: 0.2261 108/500 [=====>........................] - ETA: 1:37 - loss: 1.4171 - regression_loss: 1.1916 - classification_loss: 0.2255 109/500 [=====>........................] - ETA: 1:37 - loss: 1.4177 - regression_loss: 1.1917 - classification_loss: 0.2260 110/500 [=====>........................] - ETA: 1:37 - loss: 1.4194 - regression_loss: 1.1906 - classification_loss: 0.2287 111/500 [=====>........................] - ETA: 1:36 - loss: 1.4114 - regression_loss: 1.1841 - classification_loss: 0.2273 112/500 [=====>........................] - ETA: 1:36 - loss: 1.4117 - regression_loss: 1.1844 - classification_loss: 0.2273 113/500 [=====>........................] - ETA: 1:36 - loss: 1.4155 - regression_loss: 1.1872 - classification_loss: 0.2284 114/500 [=====>........................] - ETA: 1:36 - loss: 1.4096 - regression_loss: 1.1826 - classification_loss: 0.2270 115/500 [=====>........................] - ETA: 1:35 - loss: 1.4101 - regression_loss: 1.1833 - classification_loss: 0.2268 116/500 [=====>........................] 
- ETA: 1:35 - loss: 1.4097 - regression_loss: 1.1824 - classification_loss: 0.2273 117/500 [======>.......................] - ETA: 1:35 - loss: 1.4066 - regression_loss: 1.1801 - classification_loss: 0.2265 118/500 [======>.......................] - ETA: 1:35 - loss: 1.4058 - regression_loss: 1.1794 - classification_loss: 0.2264 119/500 [======>.......................] - ETA: 1:35 - loss: 1.4059 - regression_loss: 1.1799 - classification_loss: 0.2260 120/500 [======>.......................] - ETA: 1:34 - loss: 1.4054 - regression_loss: 1.1801 - classification_loss: 0.2252 121/500 [======>.......................] - ETA: 1:34 - loss: 1.3982 - regression_loss: 1.1743 - classification_loss: 0.2240 122/500 [======>.......................] - ETA: 1:34 - loss: 1.3937 - regression_loss: 1.1710 - classification_loss: 0.2227 123/500 [======>.......................] - ETA: 1:34 - loss: 1.3967 - regression_loss: 1.1731 - classification_loss: 0.2236 124/500 [======>.......................] - ETA: 1:33 - loss: 1.3957 - regression_loss: 1.1722 - classification_loss: 0.2235 125/500 [======>.......................] - ETA: 1:33 - loss: 1.3944 - regression_loss: 1.1708 - classification_loss: 0.2236 126/500 [======>.......................] - ETA: 1:33 - loss: 1.3992 - regression_loss: 1.1746 - classification_loss: 0.2247 127/500 [======>.......................] - ETA: 1:33 - loss: 1.4044 - regression_loss: 1.1787 - classification_loss: 0.2257 128/500 [======>.......................] - ETA: 1:32 - loss: 1.4041 - regression_loss: 1.1780 - classification_loss: 0.2261 129/500 [======>.......................] - ETA: 1:32 - loss: 1.4051 - regression_loss: 1.1787 - classification_loss: 0.2264 130/500 [======>.......................] - ETA: 1:32 - loss: 1.4061 - regression_loss: 1.1794 - classification_loss: 0.2267 131/500 [======>.......................] - ETA: 1:32 - loss: 1.4034 - regression_loss: 1.1756 - classification_loss: 0.2277 132/500 [======>.......................] 
- ETA: 1:31 - loss: 1.4070 - regression_loss: 1.1781 - classification_loss: 0.2289 133/500 [======>.......................] - ETA: 1:31 - loss: 1.4094 - regression_loss: 1.1804 - classification_loss: 0.2290 134/500 [=======>......................] - ETA: 1:31 - loss: 1.4073 - regression_loss: 1.1790 - classification_loss: 0.2284 135/500 [=======>......................] - ETA: 1:31 - loss: 1.4029 - regression_loss: 1.1756 - classification_loss: 0.2273 136/500 [=======>......................] - ETA: 1:30 - loss: 1.4022 - regression_loss: 1.1752 - classification_loss: 0.2270 137/500 [=======>......................] - ETA: 1:30 - loss: 1.4047 - regression_loss: 1.1778 - classification_loss: 0.2269 138/500 [=======>......................] - ETA: 1:30 - loss: 1.4058 - regression_loss: 1.1787 - classification_loss: 0.2271 139/500 [=======>......................] - ETA: 1:30 - loss: 1.4122 - regression_loss: 1.1843 - classification_loss: 0.2279 140/500 [=======>......................] - ETA: 1:29 - loss: 1.4147 - regression_loss: 1.1864 - classification_loss: 0.2284 141/500 [=======>......................] - ETA: 1:29 - loss: 1.4173 - regression_loss: 1.1892 - classification_loss: 0.2281 142/500 [=======>......................] - ETA: 1:29 - loss: 1.4220 - regression_loss: 1.1916 - classification_loss: 0.2304 143/500 [=======>......................] - ETA: 1:29 - loss: 1.4174 - regression_loss: 1.1882 - classification_loss: 0.2292 144/500 [=======>......................] - ETA: 1:28 - loss: 1.4203 - regression_loss: 1.1907 - classification_loss: 0.2295 145/500 [=======>......................] - ETA: 1:28 - loss: 1.4140 - regression_loss: 1.1857 - classification_loss: 0.2283 146/500 [=======>......................] - ETA: 1:28 - loss: 1.4153 - regression_loss: 1.1865 - classification_loss: 0.2288 147/500 [=======>......................] - ETA: 1:28 - loss: 1.4188 - regression_loss: 1.1893 - classification_loss: 0.2295 148/500 [=======>......................] 
- ETA: 1:27 - loss: 1.4166 - regression_loss: 1.1872 - classification_loss: 0.2294 149/500 [=======>......................] - ETA: 1:27 - loss: 1.4167 - regression_loss: 1.1873 - classification_loss: 0.2294 150/500 [========>.....................] - ETA: 1:27 - loss: 1.4176 - regression_loss: 1.1879 - classification_loss: 0.2296 151/500 [========>.....................] - ETA: 1:27 - loss: 1.4139 - regression_loss: 1.1849 - classification_loss: 0.2290 152/500 [========>.....................] - ETA: 1:26 - loss: 1.4128 - regression_loss: 1.1842 - classification_loss: 0.2285 153/500 [========>.....................] - ETA: 1:26 - loss: 1.4118 - regression_loss: 1.1833 - classification_loss: 0.2285 154/500 [========>.....................] - ETA: 1:26 - loss: 1.4156 - regression_loss: 1.1864 - classification_loss: 0.2292 155/500 [========>.....................] - ETA: 1:26 - loss: 1.4164 - regression_loss: 1.1872 - classification_loss: 0.2293 156/500 [========>.....................] - ETA: 1:25 - loss: 1.4143 - regression_loss: 1.1848 - classification_loss: 0.2295 157/500 [========>.....................] - ETA: 1:25 - loss: 1.4159 - regression_loss: 1.1862 - classification_loss: 0.2297 158/500 [========>.....................] - ETA: 1:25 - loss: 1.4175 - regression_loss: 1.1875 - classification_loss: 0.2300 159/500 [========>.....................] - ETA: 1:25 - loss: 1.4187 - regression_loss: 1.1889 - classification_loss: 0.2299 160/500 [========>.....................] - ETA: 1:24 - loss: 1.4168 - regression_loss: 1.1867 - classification_loss: 0.2301 161/500 [========>.....................] - ETA: 1:24 - loss: 1.4179 - regression_loss: 1.1874 - classification_loss: 0.2305 162/500 [========>.....................] - ETA: 1:24 - loss: 1.4174 - regression_loss: 1.1873 - classification_loss: 0.2301 163/500 [========>.....................] - ETA: 1:24 - loss: 1.4173 - regression_loss: 1.1871 - classification_loss: 0.2301 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.4233 - regression_loss: 1.1921 - classification_loss: 0.2312 165/500 [========>.....................] - ETA: 1:23 - loss: 1.4224 - regression_loss: 1.1911 - classification_loss: 0.2313 166/500 [========>.....................] - ETA: 1:23 - loss: 1.4314 - regression_loss: 1.1977 - classification_loss: 0.2337 167/500 [=========>....................] - ETA: 1:23 - loss: 1.4327 - regression_loss: 1.1990 - classification_loss: 0.2337 168/500 [=========>....................] - ETA: 1:22 - loss: 1.4332 - regression_loss: 1.1996 - classification_loss: 0.2336 169/500 [=========>....................] - ETA: 1:22 - loss: 1.4283 - regression_loss: 1.1958 - classification_loss: 0.2325 170/500 [=========>....................] - ETA: 1:22 - loss: 1.4270 - regression_loss: 1.1945 - classification_loss: 0.2324 171/500 [=========>....................] - ETA: 1:22 - loss: 1.4267 - regression_loss: 1.1936 - classification_loss: 0.2331 172/500 [=========>....................] - ETA: 1:22 - loss: 1.4249 - regression_loss: 1.1924 - classification_loss: 0.2325 173/500 [=========>....................] - ETA: 1:21 - loss: 1.4246 - regression_loss: 1.1922 - classification_loss: 0.2324 174/500 [=========>....................] - ETA: 1:21 - loss: 1.4237 - regression_loss: 1.1918 - classification_loss: 0.2319 175/500 [=========>....................] - ETA: 1:21 - loss: 1.4233 - regression_loss: 1.1917 - classification_loss: 0.2316 176/500 [=========>....................] - ETA: 1:21 - loss: 1.4239 - regression_loss: 1.1922 - classification_loss: 0.2316 177/500 [=========>....................] - ETA: 1:20 - loss: 1.4243 - regression_loss: 1.1927 - classification_loss: 0.2316 178/500 [=========>....................] - ETA: 1:20 - loss: 1.4252 - regression_loss: 1.1936 - classification_loss: 0.2316 179/500 [=========>....................] - ETA: 1:20 - loss: 1.4254 - regression_loss: 1.1938 - classification_loss: 0.2316 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.4264 - regression_loss: 1.1947 - classification_loss: 0.2317 181/500 [=========>....................] - ETA: 1:19 - loss: 1.4292 - regression_loss: 1.1975 - classification_loss: 0.2317 182/500 [=========>....................] - ETA: 1:19 - loss: 1.4260 - regression_loss: 1.1943 - classification_loss: 0.2317 183/500 [=========>....................] - ETA: 1:19 - loss: 1.4267 - regression_loss: 1.1952 - classification_loss: 0.2315 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4269 - regression_loss: 1.1945 - classification_loss: 0.2323 185/500 [==========>...................] - ETA: 1:18 - loss: 1.4233 - regression_loss: 1.1918 - classification_loss: 0.2315 186/500 [==========>...................] - ETA: 1:18 - loss: 1.4279 - regression_loss: 1.1952 - classification_loss: 0.2326 187/500 [==========>...................] - ETA: 1:18 - loss: 1.4292 - regression_loss: 1.1965 - classification_loss: 0.2328 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4326 - regression_loss: 1.1990 - classification_loss: 0.2336 189/500 [==========>...................] - ETA: 1:17 - loss: 1.4337 - regression_loss: 1.2001 - classification_loss: 0.2336 190/500 [==========>...................] - ETA: 1:17 - loss: 1.4337 - regression_loss: 1.2003 - classification_loss: 0.2334 191/500 [==========>...................] - ETA: 1:17 - loss: 1.4327 - regression_loss: 1.1992 - classification_loss: 0.2335 192/500 [==========>...................] - ETA: 1:17 - loss: 1.4318 - regression_loss: 1.1983 - classification_loss: 0.2335 193/500 [==========>...................] - ETA: 1:16 - loss: 1.4301 - regression_loss: 1.1966 - classification_loss: 0.2335 194/500 [==========>...................] - ETA: 1:16 - loss: 1.4315 - regression_loss: 1.1977 - classification_loss: 0.2339 195/500 [==========>...................] - ETA: 1:16 - loss: 1.4285 - regression_loss: 1.1953 - classification_loss: 0.2332 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.4255 - regression_loss: 1.1930 - classification_loss: 0.2325 197/500 [==========>...................] - ETA: 1:15 - loss: 1.4221 - regression_loss: 1.1904 - classification_loss: 0.2316 198/500 [==========>...................] - ETA: 1:15 - loss: 1.4253 - regression_loss: 1.1933 - classification_loss: 0.2320 199/500 [==========>...................] - ETA: 1:15 - loss: 1.4251 - regression_loss: 1.1930 - classification_loss: 0.2322 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4272 - regression_loss: 1.1947 - classification_loss: 0.2325 201/500 [===========>..................] - ETA: 1:14 - loss: 1.4244 - regression_loss: 1.1927 - classification_loss: 0.2317 202/500 [===========>..................] - ETA: 1:14 - loss: 1.4262 - regression_loss: 1.1934 - classification_loss: 0.2327 203/500 [===========>..................] - ETA: 1:14 - loss: 1.4350 - regression_loss: 1.1976 - classification_loss: 0.2374 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4334 - regression_loss: 1.1964 - classification_loss: 0.2370 205/500 [===========>..................] - ETA: 1:13 - loss: 1.4340 - regression_loss: 1.1968 - classification_loss: 0.2372 206/500 [===========>..................] - ETA: 1:13 - loss: 1.4317 - regression_loss: 1.1949 - classification_loss: 0.2367 207/500 [===========>..................] - ETA: 1:13 - loss: 1.4329 - regression_loss: 1.1960 - classification_loss: 0.2369 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4337 - regression_loss: 1.1965 - classification_loss: 0.2373 209/500 [===========>..................] - ETA: 1:12 - loss: 1.4316 - regression_loss: 1.1949 - classification_loss: 0.2366 210/500 [===========>..................] - ETA: 1:12 - loss: 1.4322 - regression_loss: 1.1954 - classification_loss: 0.2368 211/500 [===========>..................] - ETA: 1:12 - loss: 1.4316 - regression_loss: 1.1949 - classification_loss: 0.2366 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.4334 - regression_loss: 1.1959 - classification_loss: 0.2375 213/500 [===========>..................] - ETA: 1:11 - loss: 1.4337 - regression_loss: 1.1963 - classification_loss: 0.2373 214/500 [===========>..................] - ETA: 1:11 - loss: 1.4327 - regression_loss: 1.1956 - classification_loss: 0.2371 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4347 - regression_loss: 1.1973 - classification_loss: 0.2374 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4336 - regression_loss: 1.1964 - classification_loss: 0.2373 217/500 [============>.................] - ETA: 1:10 - loss: 1.4310 - regression_loss: 1.1941 - classification_loss: 0.2369 218/500 [============>.................] - ETA: 1:10 - loss: 1.4282 - regression_loss: 1.1918 - classification_loss: 0.2364 219/500 [============>.................] - ETA: 1:10 - loss: 1.4289 - regression_loss: 1.1925 - classification_loss: 0.2364 220/500 [============>.................] - ETA: 1:10 - loss: 1.4252 - regression_loss: 1.1893 - classification_loss: 0.2359 221/500 [============>.................] - ETA: 1:09 - loss: 1.4238 - regression_loss: 1.1881 - classification_loss: 0.2357 222/500 [============>.................] - ETA: 1:09 - loss: 1.4225 - regression_loss: 1.1872 - classification_loss: 0.2353 223/500 [============>.................] - ETA: 1:09 - loss: 1.4231 - regression_loss: 1.1879 - classification_loss: 0.2352 224/500 [============>.................] - ETA: 1:09 - loss: 1.4200 - regression_loss: 1.1853 - classification_loss: 0.2347 225/500 [============>.................] - ETA: 1:08 - loss: 1.4207 - regression_loss: 1.1856 - classification_loss: 0.2351 226/500 [============>.................] - ETA: 1:08 - loss: 1.4205 - regression_loss: 1.1857 - classification_loss: 0.2349 227/500 [============>.................] - ETA: 1:08 - loss: 1.4197 - regression_loss: 1.1850 - classification_loss: 0.2346 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.4207 - regression_loss: 1.1860 - classification_loss: 0.2347 229/500 [============>.................] - ETA: 1:07 - loss: 1.4191 - regression_loss: 1.1846 - classification_loss: 0.2345 230/500 [============>.................] - ETA: 1:07 - loss: 1.4214 - regression_loss: 1.1863 - classification_loss: 0.2351 231/500 [============>.................] - ETA: 1:07 - loss: 1.4223 - regression_loss: 1.1867 - classification_loss: 0.2355 232/500 [============>.................] - ETA: 1:07 - loss: 1.4246 - regression_loss: 1.1888 - classification_loss: 0.2358 233/500 [============>.................] - ETA: 1:06 - loss: 1.4283 - regression_loss: 1.1913 - classification_loss: 0.2370 234/500 [=============>................] - ETA: 1:06 - loss: 1.4297 - regression_loss: 1.1923 - classification_loss: 0.2374 235/500 [=============>................] - ETA: 1:06 - loss: 1.4302 - regression_loss: 1.1925 - classification_loss: 0.2377 236/500 [=============>................] - ETA: 1:06 - loss: 1.4283 - regression_loss: 1.1909 - classification_loss: 0.2374 237/500 [=============>................] - ETA: 1:05 - loss: 1.4273 - regression_loss: 1.1902 - classification_loss: 0.2371 238/500 [=============>................] - ETA: 1:05 - loss: 1.4276 - regression_loss: 1.1908 - classification_loss: 0.2367 239/500 [=============>................] - ETA: 1:05 - loss: 1.4276 - regression_loss: 1.1909 - classification_loss: 0.2367 240/500 [=============>................] - ETA: 1:05 - loss: 1.4298 - regression_loss: 1.1926 - classification_loss: 0.2371 241/500 [=============>................] - ETA: 1:04 - loss: 1.4293 - regression_loss: 1.1923 - classification_loss: 0.2370 242/500 [=============>................] - ETA: 1:04 - loss: 1.4296 - regression_loss: 1.1926 - classification_loss: 0.2370 243/500 [=============>................] - ETA: 1:04 - loss: 1.4295 - regression_loss: 1.1928 - classification_loss: 0.2367 244/500 [=============>................] 
500/500 [==============================] - 125s 250ms/step - loss: 1.4288 - regression_loss: 1.1927 - classification_loss: 0.2361
1172 instances of class plum with average precision: 0.6629
mAP: 0.6629
Epoch 00111: saving model to ./training/snapshots/resnet50_pascal_111.h5
Epoch 112/150
 77/500 [===>..........................] - ETA: 1:46 - loss: 1.3642 - regression_loss: 1.1474 - classification_loss: 0.2168
- ETA: 1:46 - loss: 1.3711 - regression_loss: 1.1531 - classification_loss: 0.2181 79/500 [===>..........................] - ETA: 1:46 - loss: 1.3750 - regression_loss: 1.1569 - classification_loss: 0.2181 80/500 [===>..........................] - ETA: 1:45 - loss: 1.3750 - regression_loss: 1.1574 - classification_loss: 0.2176 81/500 [===>..........................] - ETA: 1:45 - loss: 1.3808 - regression_loss: 1.1619 - classification_loss: 0.2189 82/500 [===>..........................] - ETA: 1:45 - loss: 1.3814 - regression_loss: 1.1622 - classification_loss: 0.2193 83/500 [===>..........................] - ETA: 1:45 - loss: 1.3763 - regression_loss: 1.1580 - classification_loss: 0.2183 84/500 [====>.........................] - ETA: 1:44 - loss: 1.3754 - regression_loss: 1.1577 - classification_loss: 0.2177 85/500 [====>.........................] - ETA: 1:44 - loss: 1.3772 - regression_loss: 1.1584 - classification_loss: 0.2188 86/500 [====>.........................] - ETA: 1:44 - loss: 1.3823 - regression_loss: 1.1619 - classification_loss: 0.2205 87/500 [====>.........................] - ETA: 1:44 - loss: 1.3810 - regression_loss: 1.1608 - classification_loss: 0.2201 88/500 [====>.........................] - ETA: 1:43 - loss: 1.3820 - regression_loss: 1.1623 - classification_loss: 0.2197 89/500 [====>.........................] - ETA: 1:43 - loss: 1.3850 - regression_loss: 1.1648 - classification_loss: 0.2202 90/500 [====>.........................] - ETA: 1:43 - loss: 1.3779 - regression_loss: 1.1589 - classification_loss: 0.2190 91/500 [====>.........................] - ETA: 1:42 - loss: 1.3783 - regression_loss: 1.1596 - classification_loss: 0.2187 92/500 [====>.........................] - ETA: 1:42 - loss: 1.3770 - regression_loss: 1.1583 - classification_loss: 0.2187 93/500 [====>.........................] - ETA: 1:42 - loss: 1.3767 - regression_loss: 1.1581 - classification_loss: 0.2185 94/500 [====>.........................] 
- ETA: 1:42 - loss: 1.3765 - regression_loss: 1.1581 - classification_loss: 0.2184 95/500 [====>.........................] - ETA: 1:42 - loss: 1.3923 - regression_loss: 1.1654 - classification_loss: 0.2269 96/500 [====>.........................] - ETA: 1:41 - loss: 1.3840 - regression_loss: 1.1583 - classification_loss: 0.2257 97/500 [====>.........................] - ETA: 1:41 - loss: 1.3837 - regression_loss: 1.1577 - classification_loss: 0.2260 98/500 [====>.........................] - ETA: 1:41 - loss: 1.3856 - regression_loss: 1.1591 - classification_loss: 0.2265 99/500 [====>.........................] - ETA: 1:40 - loss: 1.3870 - regression_loss: 1.1592 - classification_loss: 0.2278 100/500 [=====>........................] - ETA: 1:40 - loss: 1.3890 - regression_loss: 1.1616 - classification_loss: 0.2274 101/500 [=====>........................] - ETA: 1:40 - loss: 1.3848 - regression_loss: 1.1585 - classification_loss: 0.2263 102/500 [=====>........................] - ETA: 1:40 - loss: 1.3820 - regression_loss: 1.1566 - classification_loss: 0.2254 103/500 [=====>........................] - ETA: 1:40 - loss: 1.3783 - regression_loss: 1.1530 - classification_loss: 0.2252 104/500 [=====>........................] - ETA: 1:39 - loss: 1.3795 - regression_loss: 1.1540 - classification_loss: 0.2255 105/500 [=====>........................] - ETA: 1:39 - loss: 1.3814 - regression_loss: 1.1567 - classification_loss: 0.2248 106/500 [=====>........................] - ETA: 1:39 - loss: 1.3862 - regression_loss: 1.1605 - classification_loss: 0.2257 107/500 [=====>........................] - ETA: 1:39 - loss: 1.3861 - regression_loss: 1.1606 - classification_loss: 0.2255 108/500 [=====>........................] - ETA: 1:38 - loss: 1.3882 - regression_loss: 1.1629 - classification_loss: 0.2253 109/500 [=====>........................] - ETA: 1:38 - loss: 1.3957 - regression_loss: 1.1687 - classification_loss: 0.2270 110/500 [=====>........................] 
- ETA: 1:38 - loss: 1.3983 - regression_loss: 1.1710 - classification_loss: 0.2273 111/500 [=====>........................] - ETA: 1:38 - loss: 1.3929 - regression_loss: 1.1669 - classification_loss: 0.2259 112/500 [=====>........................] - ETA: 1:37 - loss: 1.3928 - regression_loss: 1.1671 - classification_loss: 0.2257 113/500 [=====>........................] - ETA: 1:37 - loss: 1.3967 - regression_loss: 1.1703 - classification_loss: 0.2263 114/500 [=====>........................] - ETA: 1:37 - loss: 1.3957 - regression_loss: 1.1700 - classification_loss: 0.2257 115/500 [=====>........................] - ETA: 1:37 - loss: 1.3920 - regression_loss: 1.1670 - classification_loss: 0.2250 116/500 [=====>........................] - ETA: 1:36 - loss: 1.3920 - regression_loss: 1.1669 - classification_loss: 0.2251 117/500 [======>.......................] - ETA: 1:36 - loss: 1.3863 - regression_loss: 1.1627 - classification_loss: 0.2236 118/500 [======>.......................] - ETA: 1:35 - loss: 1.3830 - regression_loss: 1.1601 - classification_loss: 0.2230 119/500 [======>.......................] - ETA: 1:35 - loss: 1.3778 - regression_loss: 1.1560 - classification_loss: 0.2219 120/500 [======>.......................] - ETA: 1:35 - loss: 1.3789 - regression_loss: 1.1569 - classification_loss: 0.2220 121/500 [======>.......................] - ETA: 1:35 - loss: 1.3812 - regression_loss: 1.1588 - classification_loss: 0.2224 122/500 [======>.......................] - ETA: 1:34 - loss: 1.3791 - regression_loss: 1.1570 - classification_loss: 0.2221 123/500 [======>.......................] - ETA: 1:34 - loss: 1.3737 - regression_loss: 1.1523 - classification_loss: 0.2214 124/500 [======>.......................] - ETA: 1:34 - loss: 1.3779 - regression_loss: 1.1560 - classification_loss: 0.2219 125/500 [======>.......................] - ETA: 1:34 - loss: 1.3770 - regression_loss: 1.1554 - classification_loss: 0.2217 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.3710 - regression_loss: 1.1505 - classification_loss: 0.2205 127/500 [======>.......................] - ETA: 1:33 - loss: 1.3737 - regression_loss: 1.1529 - classification_loss: 0.2208 128/500 [======>.......................] - ETA: 1:33 - loss: 1.3728 - regression_loss: 1.1517 - classification_loss: 0.2212 129/500 [======>.......................] - ETA: 1:33 - loss: 1.3686 - regression_loss: 1.1486 - classification_loss: 0.2201 130/500 [======>.......................] - ETA: 1:32 - loss: 1.3677 - regression_loss: 1.1472 - classification_loss: 0.2205 131/500 [======>.......................] - ETA: 1:32 - loss: 1.3658 - regression_loss: 1.1459 - classification_loss: 0.2199 132/500 [======>.......................] - ETA: 1:32 - loss: 1.3699 - regression_loss: 1.1494 - classification_loss: 0.2205 133/500 [======>.......................] - ETA: 1:32 - loss: 1.3724 - regression_loss: 1.1517 - classification_loss: 0.2207 134/500 [=======>......................] - ETA: 1:31 - loss: 1.3745 - regression_loss: 1.1538 - classification_loss: 0.2207 135/500 [=======>......................] - ETA: 1:31 - loss: 1.3779 - regression_loss: 1.1565 - classification_loss: 0.2214 136/500 [=======>......................] - ETA: 1:31 - loss: 1.3818 - regression_loss: 1.1600 - classification_loss: 0.2219 137/500 [=======>......................] - ETA: 1:30 - loss: 1.3785 - regression_loss: 1.1572 - classification_loss: 0.2212 138/500 [=======>......................] - ETA: 1:30 - loss: 1.3798 - regression_loss: 1.1578 - classification_loss: 0.2221 139/500 [=======>......................] - ETA: 1:30 - loss: 1.3802 - regression_loss: 1.1583 - classification_loss: 0.2220 140/500 [=======>......................] - ETA: 1:30 - loss: 1.3818 - regression_loss: 1.1596 - classification_loss: 0.2222 141/500 [=======>......................] - ETA: 1:29 - loss: 1.3837 - regression_loss: 1.1615 - classification_loss: 0.2222 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.3887 - regression_loss: 1.1656 - classification_loss: 0.2231 143/500 [=======>......................] - ETA: 1:29 - loss: 1.3877 - regression_loss: 1.1650 - classification_loss: 0.2228 144/500 [=======>......................] - ETA: 1:29 - loss: 1.3850 - regression_loss: 1.1616 - classification_loss: 0.2234 145/500 [=======>......................] - ETA: 1:28 - loss: 1.3838 - regression_loss: 1.1604 - classification_loss: 0.2234 146/500 [=======>......................] - ETA: 1:28 - loss: 1.3862 - regression_loss: 1.1624 - classification_loss: 0.2238 147/500 [=======>......................] - ETA: 1:28 - loss: 1.3856 - regression_loss: 1.1619 - classification_loss: 0.2238 148/500 [=======>......................] - ETA: 1:28 - loss: 1.3832 - regression_loss: 1.1597 - classification_loss: 0.2235 149/500 [=======>......................] - ETA: 1:27 - loss: 1.3823 - regression_loss: 1.1592 - classification_loss: 0.2231 150/500 [========>.....................] - ETA: 1:27 - loss: 1.3832 - regression_loss: 1.1603 - classification_loss: 0.2229 151/500 [========>.....................] - ETA: 1:27 - loss: 1.3824 - regression_loss: 1.1595 - classification_loss: 0.2229 152/500 [========>.....................] - ETA: 1:27 - loss: 1.3850 - regression_loss: 1.1613 - classification_loss: 0.2237 153/500 [========>.....................] - ETA: 1:26 - loss: 1.3862 - regression_loss: 1.1627 - classification_loss: 0.2235 154/500 [========>.....................] - ETA: 1:26 - loss: 1.3852 - regression_loss: 1.1621 - classification_loss: 0.2231 155/500 [========>.....................] - ETA: 1:26 - loss: 1.3843 - regression_loss: 1.1614 - classification_loss: 0.2229 156/500 [========>.....................] - ETA: 1:26 - loss: 1.3855 - regression_loss: 1.1631 - classification_loss: 0.2224 157/500 [========>.....................] - ETA: 1:25 - loss: 1.3832 - regression_loss: 1.1615 - classification_loss: 0.2217 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.3870 - regression_loss: 1.1645 - classification_loss: 0.2224 159/500 [========>.....................] - ETA: 1:25 - loss: 1.3832 - regression_loss: 1.1615 - classification_loss: 0.2217 160/500 [========>.....................] - ETA: 1:25 - loss: 1.3808 - regression_loss: 1.1595 - classification_loss: 0.2213 161/500 [========>.....................] - ETA: 1:24 - loss: 1.3821 - regression_loss: 1.1602 - classification_loss: 0.2219 162/500 [========>.....................] - ETA: 1:24 - loss: 1.3790 - regression_loss: 1.1572 - classification_loss: 0.2218 163/500 [========>.....................] - ETA: 1:24 - loss: 1.3809 - regression_loss: 1.1591 - classification_loss: 0.2218 164/500 [========>.....................] - ETA: 1:24 - loss: 1.3827 - regression_loss: 1.1603 - classification_loss: 0.2224 165/500 [========>.....................] - ETA: 1:23 - loss: 1.3833 - regression_loss: 1.1608 - classification_loss: 0.2225 166/500 [========>.....................] - ETA: 1:23 - loss: 1.3866 - regression_loss: 1.1637 - classification_loss: 0.2229 167/500 [=========>....................] - ETA: 1:23 - loss: 1.3858 - regression_loss: 1.1631 - classification_loss: 0.2227 168/500 [=========>....................] - ETA: 1:23 - loss: 1.3924 - regression_loss: 1.1683 - classification_loss: 0.2241 169/500 [=========>....................] - ETA: 1:22 - loss: 1.3932 - regression_loss: 1.1692 - classification_loss: 0.2240 170/500 [=========>....................] - ETA: 1:22 - loss: 1.3929 - regression_loss: 1.1689 - classification_loss: 0.2239 171/500 [=========>....................] - ETA: 1:22 - loss: 1.3900 - regression_loss: 1.1664 - classification_loss: 0.2236 172/500 [=========>....................] - ETA: 1:22 - loss: 1.3880 - regression_loss: 1.1648 - classification_loss: 0.2232 173/500 [=========>....................] - ETA: 1:21 - loss: 1.3902 - regression_loss: 1.1670 - classification_loss: 0.2232 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.3940 - regression_loss: 1.1695 - classification_loss: 0.2245 175/500 [=========>....................] - ETA: 1:21 - loss: 1.3939 - regression_loss: 1.1698 - classification_loss: 0.2241 176/500 [=========>....................] - ETA: 1:21 - loss: 1.3940 - regression_loss: 1.1699 - classification_loss: 0.2241 177/500 [=========>....................] - ETA: 1:20 - loss: 1.3930 - regression_loss: 1.1691 - classification_loss: 0.2240 178/500 [=========>....................] - ETA: 1:20 - loss: 1.3959 - regression_loss: 1.1714 - classification_loss: 0.2245 179/500 [=========>....................] - ETA: 1:20 - loss: 1.3973 - regression_loss: 1.1719 - classification_loss: 0.2254 180/500 [=========>....................] - ETA: 1:20 - loss: 1.3982 - regression_loss: 1.1727 - classification_loss: 0.2256 181/500 [=========>....................] - ETA: 1:19 - loss: 1.3953 - regression_loss: 1.1706 - classification_loss: 0.2247 182/500 [=========>....................] - ETA: 1:19 - loss: 1.3899 - regression_loss: 1.1662 - classification_loss: 0.2237 183/500 [=========>....................] - ETA: 1:19 - loss: 1.3913 - regression_loss: 1.1668 - classification_loss: 0.2244 184/500 [==========>...................] - ETA: 1:19 - loss: 1.3864 - regression_loss: 1.1625 - classification_loss: 0.2239 185/500 [==========>...................] - ETA: 1:18 - loss: 1.3864 - regression_loss: 1.1624 - classification_loss: 0.2240 186/500 [==========>...................] - ETA: 1:18 - loss: 1.3886 - regression_loss: 1.1641 - classification_loss: 0.2245 187/500 [==========>...................] - ETA: 1:18 - loss: 1.3893 - regression_loss: 1.1651 - classification_loss: 0.2242 188/500 [==========>...................] - ETA: 1:18 - loss: 1.3897 - regression_loss: 1.1653 - classification_loss: 0.2244 189/500 [==========>...................] - ETA: 1:17 - loss: 1.3915 - regression_loss: 1.1666 - classification_loss: 0.2249 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.3939 - regression_loss: 1.1687 - classification_loss: 0.2252 191/500 [==========>...................] - ETA: 1:17 - loss: 1.3882 - regression_loss: 1.1639 - classification_loss: 0.2243 192/500 [==========>...................] - ETA: 1:17 - loss: 1.3856 - regression_loss: 1.1618 - classification_loss: 0.2238 193/500 [==========>...................] - ETA: 1:16 - loss: 1.3877 - regression_loss: 1.1634 - classification_loss: 0.2243 194/500 [==========>...................] - ETA: 1:16 - loss: 1.3884 - regression_loss: 1.1641 - classification_loss: 0.2243 195/500 [==========>...................] - ETA: 1:16 - loss: 1.3884 - regression_loss: 1.1645 - classification_loss: 0.2239 196/500 [==========>...................] - ETA: 1:16 - loss: 1.3907 - regression_loss: 1.1666 - classification_loss: 0.2242 197/500 [==========>...................] - ETA: 1:15 - loss: 1.3924 - regression_loss: 1.1677 - classification_loss: 0.2247 198/500 [==========>...................] - ETA: 1:15 - loss: 1.3886 - regression_loss: 1.1646 - classification_loss: 0.2240 199/500 [==========>...................] - ETA: 1:15 - loss: 1.3863 - regression_loss: 1.1627 - classification_loss: 0.2236 200/500 [===========>..................] - ETA: 1:15 - loss: 1.3857 - regression_loss: 1.1622 - classification_loss: 0.2235 201/500 [===========>..................] - ETA: 1:14 - loss: 1.3867 - regression_loss: 1.1632 - classification_loss: 0.2236 202/500 [===========>..................] - ETA: 1:14 - loss: 1.3858 - regression_loss: 1.1625 - classification_loss: 0.2233 203/500 [===========>..................] - ETA: 1:14 - loss: 1.3864 - regression_loss: 1.1631 - classification_loss: 0.2233 204/500 [===========>..................] - ETA: 1:14 - loss: 1.3866 - regression_loss: 1.1633 - classification_loss: 0.2233 205/500 [===========>..................] - ETA: 1:13 - loss: 1.3879 - regression_loss: 1.1646 - classification_loss: 0.2233 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.3908 - regression_loss: 1.1669 - classification_loss: 0.2239 207/500 [===========>..................] - ETA: 1:13 - loss: 1.3932 - regression_loss: 1.1686 - classification_loss: 0.2247 208/500 [===========>..................] - ETA: 1:13 - loss: 1.3959 - regression_loss: 1.1703 - classification_loss: 0.2256 209/500 [===========>..................] - ETA: 1:12 - loss: 1.3985 - regression_loss: 1.1724 - classification_loss: 0.2261 210/500 [===========>..................] - ETA: 1:12 - loss: 1.3973 - regression_loss: 1.1715 - classification_loss: 0.2258 211/500 [===========>..................] - ETA: 1:12 - loss: 1.3981 - regression_loss: 1.1723 - classification_loss: 0.2258 212/500 [===========>..................] - ETA: 1:12 - loss: 1.3981 - regression_loss: 1.1724 - classification_loss: 0.2258 213/500 [===========>..................] - ETA: 1:12 - loss: 1.3982 - regression_loss: 1.1728 - classification_loss: 0.2255 214/500 [===========>..................] - ETA: 1:11 - loss: 1.3972 - regression_loss: 1.1720 - classification_loss: 0.2251 215/500 [===========>..................] - ETA: 1:11 - loss: 1.3977 - regression_loss: 1.1724 - classification_loss: 0.2253 216/500 [===========>..................] - ETA: 1:11 - loss: 1.3986 - regression_loss: 1.1726 - classification_loss: 0.2260 217/500 [============>.................] - ETA: 1:11 - loss: 1.3995 - regression_loss: 1.1728 - classification_loss: 0.2266 218/500 [============>.................] - ETA: 1:10 - loss: 1.3996 - regression_loss: 1.1732 - classification_loss: 0.2264 219/500 [============>.................] - ETA: 1:10 - loss: 1.4006 - regression_loss: 1.1738 - classification_loss: 0.2268 220/500 [============>.................] - ETA: 1:10 - loss: 1.4011 - regression_loss: 1.1741 - classification_loss: 0.2270 221/500 [============>.................] - ETA: 1:09 - loss: 1.4034 - regression_loss: 1.1760 - classification_loss: 0.2274 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.4029 - regression_loss: 1.1760 - classification_loss: 0.2269 223/500 [============>.................] - ETA: 1:09 - loss: 1.4026 - regression_loss: 1.1756 - classification_loss: 0.2270 224/500 [============>.................] - ETA: 1:09 - loss: 1.4033 - regression_loss: 1.1764 - classification_loss: 0.2270 225/500 [============>.................] - ETA: 1:08 - loss: 1.4008 - regression_loss: 1.1743 - classification_loss: 0.2264 226/500 [============>.................] - ETA: 1:08 - loss: 1.4001 - regression_loss: 1.1736 - classification_loss: 0.2266 227/500 [============>.................] - ETA: 1:08 - loss: 1.3969 - regression_loss: 1.1709 - classification_loss: 0.2260 228/500 [============>.................] - ETA: 1:08 - loss: 1.3963 - regression_loss: 1.1697 - classification_loss: 0.2265 229/500 [============>.................] - ETA: 1:07 - loss: 1.3932 - regression_loss: 1.1673 - classification_loss: 0.2259 230/500 [============>.................] - ETA: 1:07 - loss: 1.3947 - regression_loss: 1.1684 - classification_loss: 0.2263 231/500 [============>.................] - ETA: 1:07 - loss: 1.3938 - regression_loss: 1.1677 - classification_loss: 0.2261 232/500 [============>.................] - ETA: 1:07 - loss: 1.3951 - regression_loss: 1.1687 - classification_loss: 0.2264 233/500 [============>.................] - ETA: 1:06 - loss: 1.3959 - regression_loss: 1.1693 - classification_loss: 0.2266 234/500 [=============>................] - ETA: 1:06 - loss: 1.3977 - regression_loss: 1.1709 - classification_loss: 0.2268 235/500 [=============>................] - ETA: 1:06 - loss: 1.3956 - regression_loss: 1.1688 - classification_loss: 0.2268 236/500 [=============>................] - ETA: 1:06 - loss: 1.3962 - regression_loss: 1.1693 - classification_loss: 0.2269 237/500 [=============>................] - ETA: 1:05 - loss: 1.3943 - regression_loss: 1.1679 - classification_loss: 0.2264 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.3944 - regression_loss: 1.1677 - classification_loss: 0.2267 239/500 [=============>................] - ETA: 1:05 - loss: 1.3924 - regression_loss: 1.1661 - classification_loss: 0.2263 240/500 [=============>................] - ETA: 1:05 - loss: 1.3889 - regression_loss: 1.1632 - classification_loss: 0.2257 241/500 [=============>................] - ETA: 1:04 - loss: 1.3886 - regression_loss: 1.1623 - classification_loss: 0.2263 242/500 [=============>................] - ETA: 1:04 - loss: 1.3904 - regression_loss: 1.1640 - classification_loss: 0.2265 243/500 [=============>................] - ETA: 1:04 - loss: 1.3896 - regression_loss: 1.1632 - classification_loss: 0.2264 244/500 [=============>................] - ETA: 1:04 - loss: 1.3908 - regression_loss: 1.1639 - classification_loss: 0.2269 245/500 [=============>................] - ETA: 1:03 - loss: 1.3910 - regression_loss: 1.1641 - classification_loss: 0.2269 246/500 [=============>................] - ETA: 1:03 - loss: 1.3913 - regression_loss: 1.1647 - classification_loss: 0.2267 247/500 [=============>................] - ETA: 1:03 - loss: 1.3948 - regression_loss: 1.1676 - classification_loss: 0.2273 248/500 [=============>................] - ETA: 1:03 - loss: 1.3957 - regression_loss: 1.1684 - classification_loss: 0.2273 249/500 [=============>................] - ETA: 1:02 - loss: 1.3978 - regression_loss: 1.1704 - classification_loss: 0.2274 250/500 [==============>...............] - ETA: 1:02 - loss: 1.3981 - regression_loss: 1.1708 - classification_loss: 0.2272 251/500 [==============>...............] - ETA: 1:02 - loss: 1.3984 - regression_loss: 1.1710 - classification_loss: 0.2274 252/500 [==============>...............] - ETA: 1:02 - loss: 1.3988 - regression_loss: 1.1712 - classification_loss: 0.2275 253/500 [==============>...............] - ETA: 1:01 - loss: 1.3996 - regression_loss: 1.1720 - classification_loss: 0.2276 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.3974 - regression_loss: 1.1701 - classification_loss: 0.2273 255/500 [==============>...............] - ETA: 1:01 - loss: 1.3969 - regression_loss: 1.1698 - classification_loss: 0.2271 256/500 [==============>...............] - ETA: 1:01 - loss: 1.3943 - regression_loss: 1.1678 - classification_loss: 0.2265 257/500 [==============>...............] - ETA: 1:00 - loss: 1.3946 - regression_loss: 1.1680 - classification_loss: 0.2265 258/500 [==============>...............] - ETA: 1:00 - loss: 1.3966 - regression_loss: 1.1693 - classification_loss: 0.2272 259/500 [==============>...............] - ETA: 1:00 - loss: 1.3967 - regression_loss: 1.1698 - classification_loss: 0.2269 260/500 [==============>...............] - ETA: 1:00 - loss: 1.3979 - regression_loss: 1.1707 - classification_loss: 0.2272 261/500 [==============>...............] - ETA: 59s - loss: 1.3973 - regression_loss: 1.1702 - classification_loss: 0.2271  262/500 [==============>...............] - ETA: 59s - loss: 1.3977 - regression_loss: 1.1713 - classification_loss: 0.2264 263/500 [==============>...............] - ETA: 59s - loss: 1.3958 - regression_loss: 1.1697 - classification_loss: 0.2261 264/500 [==============>...............] - ETA: 59s - loss: 1.3974 - regression_loss: 1.1712 - classification_loss: 0.2262 265/500 [==============>...............] - ETA: 59s - loss: 1.3990 - regression_loss: 1.1726 - classification_loss: 0.2265 266/500 [==============>...............] - ETA: 58s - loss: 1.4005 - regression_loss: 1.1738 - classification_loss: 0.2267 267/500 [===============>..............] - ETA: 58s - loss: 1.4012 - regression_loss: 1.1741 - classification_loss: 0.2271 268/500 [===============>..............] - ETA: 58s - loss: 1.3983 - regression_loss: 1.1717 - classification_loss: 0.2266 269/500 [===============>..............] - ETA: 58s - loss: 1.3964 - regression_loss: 1.1702 - classification_loss: 0.2262 270/500 [===============>..............] 
- ETA: 57s - loss: 1.4003 - regression_loss: 1.1735 - classification_loss: 0.2268 271/500 [===============>..............] - ETA: 57s - loss: 1.4012 - regression_loss: 1.1742 - classification_loss: 0.2269 272/500 [===============>..............] - ETA: 57s - loss: 1.4000 - regression_loss: 1.1735 - classification_loss: 0.2265 273/500 [===============>..............] - ETA: 57s - loss: 1.3978 - regression_loss: 1.1717 - classification_loss: 0.2261 274/500 [===============>..............] - ETA: 56s - loss: 1.3998 - regression_loss: 1.1734 - classification_loss: 0.2265 275/500 [===============>..............] - ETA: 56s - loss: 1.3974 - regression_loss: 1.1716 - classification_loss: 0.2258 276/500 [===============>..............] - ETA: 56s - loss: 1.3968 - regression_loss: 1.1711 - classification_loss: 0.2257 277/500 [===============>..............] - ETA: 56s - loss: 1.3958 - regression_loss: 1.1704 - classification_loss: 0.2253 278/500 [===============>..............] - ETA: 55s - loss: 1.3948 - regression_loss: 1.1695 - classification_loss: 0.2252 279/500 [===============>..............] - ETA: 55s - loss: 1.3962 - regression_loss: 1.1706 - classification_loss: 0.2256 280/500 [===============>..............] - ETA: 55s - loss: 1.3953 - regression_loss: 1.1697 - classification_loss: 0.2256 281/500 [===============>..............] - ETA: 55s - loss: 1.3967 - regression_loss: 1.1708 - classification_loss: 0.2259 282/500 [===============>..............] - ETA: 54s - loss: 1.3983 - regression_loss: 1.1721 - classification_loss: 0.2263 283/500 [===============>..............] - ETA: 54s - loss: 1.3976 - regression_loss: 1.1712 - classification_loss: 0.2263 284/500 [================>.............] - ETA: 54s - loss: 1.3988 - regression_loss: 1.1722 - classification_loss: 0.2266 285/500 [================>.............] - ETA: 54s - loss: 1.3967 - regression_loss: 1.1705 - classification_loss: 0.2262 286/500 [================>.............] 
[Epoch 112/150: per-step progress output for steps 287-499 omitted; running loss stayed near 1.39-1.40 throughout]
500/500 [==============================] - 126s 251ms/step - loss: 1.3976 - regression_loss: 1.1726 - classification_loss: 0.2250
1172 instances of class plum with average precision: 0.6717
mAP: 0.6717
Epoch 00112: saving model to ./training/snapshots/resnet50_pascal_112.h5
Epoch 113/150
[Epoch 113/150: per-step progress output for steps 1-121 omitted; epoch still in progress at the end of this excerpt]
- ETA: 1:35 - loss: 1.3446 - regression_loss: 1.1310 - classification_loss: 0.2136 122/500 [======>.......................] - ETA: 1:35 - loss: 1.3490 - regression_loss: 1.1346 - classification_loss: 0.2143 123/500 [======>.......................] - ETA: 1:34 - loss: 1.3495 - regression_loss: 1.1352 - classification_loss: 0.2143 124/500 [======>.......................] - ETA: 1:34 - loss: 1.3502 - regression_loss: 1.1362 - classification_loss: 0.2140 125/500 [======>.......................] - ETA: 1:34 - loss: 1.3507 - regression_loss: 1.1354 - classification_loss: 0.2153 126/500 [======>.......................] - ETA: 1:34 - loss: 1.3510 - regression_loss: 1.1356 - classification_loss: 0.2154 127/500 [======>.......................] - ETA: 1:33 - loss: 1.3526 - regression_loss: 1.1372 - classification_loss: 0.2153 128/500 [======>.......................] - ETA: 1:33 - loss: 1.3577 - regression_loss: 1.1415 - classification_loss: 0.2162 129/500 [======>.......................] - ETA: 1:33 - loss: 1.3597 - regression_loss: 1.1431 - classification_loss: 0.2166 130/500 [======>.......................] - ETA: 1:33 - loss: 1.3610 - regression_loss: 1.1441 - classification_loss: 0.2170 131/500 [======>.......................] - ETA: 1:32 - loss: 1.3625 - regression_loss: 1.1451 - classification_loss: 0.2174 132/500 [======>.......................] - ETA: 1:32 - loss: 1.3636 - regression_loss: 1.1456 - classification_loss: 0.2179 133/500 [======>.......................] - ETA: 1:32 - loss: 1.3653 - regression_loss: 1.1470 - classification_loss: 0.2183 134/500 [=======>......................] - ETA: 1:32 - loss: 1.3690 - regression_loss: 1.1510 - classification_loss: 0.2181 135/500 [=======>......................] - ETA: 1:31 - loss: 1.3701 - regression_loss: 1.1518 - classification_loss: 0.2183 136/500 [=======>......................] - ETA: 1:31 - loss: 1.3695 - regression_loss: 1.1507 - classification_loss: 0.2189 137/500 [=======>......................] 
- ETA: 1:31 - loss: 1.3732 - regression_loss: 1.1538 - classification_loss: 0.2194 138/500 [=======>......................] - ETA: 1:31 - loss: 1.3677 - regression_loss: 1.1493 - classification_loss: 0.2184 139/500 [=======>......................] - ETA: 1:30 - loss: 1.3691 - regression_loss: 1.1500 - classification_loss: 0.2192 140/500 [=======>......................] - ETA: 1:30 - loss: 1.3700 - regression_loss: 1.1508 - classification_loss: 0.2191 141/500 [=======>......................] - ETA: 1:30 - loss: 1.3714 - regression_loss: 1.1524 - classification_loss: 0.2190 142/500 [=======>......................] - ETA: 1:30 - loss: 1.3720 - regression_loss: 1.1529 - classification_loss: 0.2191 143/500 [=======>......................] - ETA: 1:29 - loss: 1.3745 - regression_loss: 1.1548 - classification_loss: 0.2197 144/500 [=======>......................] - ETA: 1:29 - loss: 1.3723 - regression_loss: 1.1533 - classification_loss: 0.2191 145/500 [=======>......................] - ETA: 1:29 - loss: 1.3742 - regression_loss: 1.1548 - classification_loss: 0.2194 146/500 [=======>......................] - ETA: 1:28 - loss: 1.3766 - regression_loss: 1.1562 - classification_loss: 0.2204 147/500 [=======>......................] - ETA: 1:28 - loss: 1.3780 - regression_loss: 1.1575 - classification_loss: 0.2205 148/500 [=======>......................] - ETA: 1:28 - loss: 1.3742 - regression_loss: 1.1544 - classification_loss: 0.2198 149/500 [=======>......................] - ETA: 1:27 - loss: 1.3735 - regression_loss: 1.1540 - classification_loss: 0.2194 150/500 [========>.....................] - ETA: 1:27 - loss: 1.3690 - regression_loss: 1.1502 - classification_loss: 0.2188 151/500 [========>.....................] - ETA: 1:27 - loss: 1.3682 - regression_loss: 1.1498 - classification_loss: 0.2183 152/500 [========>.....................] - ETA: 1:27 - loss: 1.3630 - regression_loss: 1.1459 - classification_loss: 0.2171 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.3601 - regression_loss: 1.1435 - classification_loss: 0.2166 154/500 [========>.....................] - ETA: 1:26 - loss: 1.3600 - regression_loss: 1.1437 - classification_loss: 0.2162 155/500 [========>.....................] - ETA: 1:26 - loss: 1.3576 - regression_loss: 1.1420 - classification_loss: 0.2156 156/500 [========>.....................] - ETA: 1:26 - loss: 1.3581 - regression_loss: 1.1420 - classification_loss: 0.2160 157/500 [========>.....................] - ETA: 1:25 - loss: 1.3589 - regression_loss: 1.1437 - classification_loss: 0.2151 158/500 [========>.....................] - ETA: 1:25 - loss: 1.3624 - regression_loss: 1.1461 - classification_loss: 0.2163 159/500 [========>.....................] - ETA: 1:25 - loss: 1.3641 - regression_loss: 1.1477 - classification_loss: 0.2165 160/500 [========>.....................] - ETA: 1:25 - loss: 1.3585 - regression_loss: 1.1431 - classification_loss: 0.2153 161/500 [========>.....................] - ETA: 1:24 - loss: 1.3588 - regression_loss: 1.1436 - classification_loss: 0.2152 162/500 [========>.....................] - ETA: 1:24 - loss: 1.3587 - regression_loss: 1.1438 - classification_loss: 0.2150 163/500 [========>.....................] - ETA: 1:24 - loss: 1.3634 - regression_loss: 1.1479 - classification_loss: 0.2155 164/500 [========>.....................] - ETA: 1:24 - loss: 1.3656 - regression_loss: 1.1495 - classification_loss: 0.2161 165/500 [========>.....................] - ETA: 1:23 - loss: 1.3673 - regression_loss: 1.1507 - classification_loss: 0.2166 166/500 [========>.....................] - ETA: 1:23 - loss: 1.3639 - regression_loss: 1.1480 - classification_loss: 0.2159 167/500 [=========>....................] - ETA: 1:23 - loss: 1.3627 - regression_loss: 1.1472 - classification_loss: 0.2155 168/500 [=========>....................] - ETA: 1:23 - loss: 1.3640 - regression_loss: 1.1484 - classification_loss: 0.2155 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.3609 - regression_loss: 1.1457 - classification_loss: 0.2153 170/500 [=========>....................] - ETA: 1:22 - loss: 1.3666 - regression_loss: 1.1504 - classification_loss: 0.2161 171/500 [=========>....................] - ETA: 1:22 - loss: 1.3691 - regression_loss: 1.1525 - classification_loss: 0.2166 172/500 [=========>....................] - ETA: 1:22 - loss: 1.3710 - regression_loss: 1.1530 - classification_loss: 0.2179 173/500 [=========>....................] - ETA: 1:21 - loss: 1.3712 - regression_loss: 1.1528 - classification_loss: 0.2184 174/500 [=========>....................] - ETA: 1:21 - loss: 1.3723 - regression_loss: 1.1539 - classification_loss: 0.2183 175/500 [=========>....................] - ETA: 1:21 - loss: 1.3728 - regression_loss: 1.1541 - classification_loss: 0.2188 176/500 [=========>....................] - ETA: 1:21 - loss: 1.3727 - regression_loss: 1.1539 - classification_loss: 0.2188 177/500 [=========>....................] - ETA: 1:20 - loss: 1.3726 - regression_loss: 1.1540 - classification_loss: 0.2186 178/500 [=========>....................] - ETA: 1:20 - loss: 1.3742 - regression_loss: 1.1557 - classification_loss: 0.2185 179/500 [=========>....................] - ETA: 1:20 - loss: 1.3724 - regression_loss: 1.1544 - classification_loss: 0.2180 180/500 [=========>....................] - ETA: 1:20 - loss: 1.3678 - regression_loss: 1.1508 - classification_loss: 0.2170 181/500 [=========>....................] - ETA: 1:19 - loss: 1.3686 - regression_loss: 1.1514 - classification_loss: 0.2172 182/500 [=========>....................] - ETA: 1:19 - loss: 1.3644 - regression_loss: 1.1478 - classification_loss: 0.2167 183/500 [=========>....................] - ETA: 1:19 - loss: 1.3665 - regression_loss: 1.1482 - classification_loss: 0.2184 184/500 [==========>...................] - ETA: 1:19 - loss: 1.3707 - regression_loss: 1.1514 - classification_loss: 0.2193 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.3744 - regression_loss: 1.1541 - classification_loss: 0.2203 186/500 [==========>...................] - ETA: 1:18 - loss: 1.3738 - regression_loss: 1.1538 - classification_loss: 0.2200 187/500 [==========>...................] - ETA: 1:18 - loss: 1.3777 - regression_loss: 1.1565 - classification_loss: 0.2212 188/500 [==========>...................] - ETA: 1:18 - loss: 1.3765 - regression_loss: 1.1557 - classification_loss: 0.2208 189/500 [==========>...................] - ETA: 1:17 - loss: 1.3754 - regression_loss: 1.1548 - classification_loss: 0.2206 190/500 [==========>...................] - ETA: 1:17 - loss: 1.3772 - regression_loss: 1.1562 - classification_loss: 0.2210 191/500 [==========>...................] - ETA: 1:17 - loss: 1.3775 - regression_loss: 1.1564 - classification_loss: 0.2211 192/500 [==========>...................] - ETA: 1:17 - loss: 1.3778 - regression_loss: 1.1566 - classification_loss: 0.2212 193/500 [==========>...................] - ETA: 1:16 - loss: 1.3843 - regression_loss: 1.1610 - classification_loss: 0.2233 194/500 [==========>...................] - ETA: 1:16 - loss: 1.3854 - regression_loss: 1.1616 - classification_loss: 0.2237 195/500 [==========>...................] - ETA: 1:16 - loss: 1.3822 - regression_loss: 1.1585 - classification_loss: 0.2237 196/500 [==========>...................] - ETA: 1:16 - loss: 1.3818 - regression_loss: 1.1584 - classification_loss: 0.2234 197/500 [==========>...................] - ETA: 1:15 - loss: 1.3845 - regression_loss: 1.1608 - classification_loss: 0.2237 198/500 [==========>...................] - ETA: 1:15 - loss: 1.3836 - regression_loss: 1.1602 - classification_loss: 0.2234 199/500 [==========>...................] - ETA: 1:15 - loss: 1.3857 - regression_loss: 1.1619 - classification_loss: 0.2238 200/500 [===========>..................] - ETA: 1:15 - loss: 1.3841 - regression_loss: 1.1606 - classification_loss: 0.2234 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.3835 - regression_loss: 1.1602 - classification_loss: 0.2234 202/500 [===========>..................] - ETA: 1:14 - loss: 1.3846 - regression_loss: 1.1609 - classification_loss: 0.2237 203/500 [===========>..................] - ETA: 1:14 - loss: 1.3828 - regression_loss: 1.1590 - classification_loss: 0.2237 204/500 [===========>..................] - ETA: 1:14 - loss: 1.3837 - regression_loss: 1.1600 - classification_loss: 0.2238 205/500 [===========>..................] - ETA: 1:13 - loss: 1.3811 - regression_loss: 1.1579 - classification_loss: 0.2232 206/500 [===========>..................] - ETA: 1:13 - loss: 1.3784 - regression_loss: 1.1558 - classification_loss: 0.2226 207/500 [===========>..................] - ETA: 1:13 - loss: 1.3789 - regression_loss: 1.1562 - classification_loss: 0.2227 208/500 [===========>..................] - ETA: 1:13 - loss: 1.3804 - regression_loss: 1.1572 - classification_loss: 0.2233 209/500 [===========>..................] - ETA: 1:13 - loss: 1.3813 - regression_loss: 1.1580 - classification_loss: 0.2233 210/500 [===========>..................] - ETA: 1:12 - loss: 1.3809 - regression_loss: 1.1578 - classification_loss: 0.2232 211/500 [===========>..................] - ETA: 1:12 - loss: 1.3803 - regression_loss: 1.1573 - classification_loss: 0.2230 212/500 [===========>..................] - ETA: 1:12 - loss: 1.3812 - regression_loss: 1.1579 - classification_loss: 0.2233 213/500 [===========>..................] - ETA: 1:12 - loss: 1.3782 - regression_loss: 1.1558 - classification_loss: 0.2224 214/500 [===========>..................] - ETA: 1:11 - loss: 1.3788 - regression_loss: 1.1563 - classification_loss: 0.2224 215/500 [===========>..................] - ETA: 1:11 - loss: 1.3795 - regression_loss: 1.1570 - classification_loss: 0.2225 216/500 [===========>..................] - ETA: 1:11 - loss: 1.3802 - regression_loss: 1.1577 - classification_loss: 0.2225 217/500 [============>.................] 
- ETA: 1:11 - loss: 1.3806 - regression_loss: 1.1582 - classification_loss: 0.2224 218/500 [============>.................] - ETA: 1:10 - loss: 1.3803 - regression_loss: 1.1579 - classification_loss: 0.2224 219/500 [============>.................] - ETA: 1:10 - loss: 1.3810 - regression_loss: 1.1584 - classification_loss: 0.2226 220/500 [============>.................] - ETA: 1:10 - loss: 1.3803 - regression_loss: 1.1577 - classification_loss: 0.2226 221/500 [============>.................] - ETA: 1:10 - loss: 1.3806 - regression_loss: 1.1578 - classification_loss: 0.2229 222/500 [============>.................] - ETA: 1:09 - loss: 1.3795 - regression_loss: 1.1570 - classification_loss: 0.2226 223/500 [============>.................] - ETA: 1:09 - loss: 1.3816 - regression_loss: 1.1583 - classification_loss: 0.2233 224/500 [============>.................] - ETA: 1:09 - loss: 1.3785 - regression_loss: 1.1555 - classification_loss: 0.2230 225/500 [============>.................] - ETA: 1:09 - loss: 1.3802 - regression_loss: 1.1567 - classification_loss: 0.2235 226/500 [============>.................] - ETA: 1:08 - loss: 1.3817 - regression_loss: 1.1577 - classification_loss: 0.2240 227/500 [============>.................] - ETA: 1:08 - loss: 1.3806 - regression_loss: 1.1569 - classification_loss: 0.2237 228/500 [============>.................] - ETA: 1:08 - loss: 1.3818 - regression_loss: 1.1579 - classification_loss: 0.2239 229/500 [============>.................] - ETA: 1:08 - loss: 1.3834 - regression_loss: 1.1589 - classification_loss: 0.2245 230/500 [============>.................] - ETA: 1:07 - loss: 1.3835 - regression_loss: 1.1591 - classification_loss: 0.2244 231/500 [============>.................] - ETA: 1:07 - loss: 1.3811 - regression_loss: 1.1572 - classification_loss: 0.2238 232/500 [============>.................] - ETA: 1:07 - loss: 1.3799 - regression_loss: 1.1560 - classification_loss: 0.2239 233/500 [============>.................] 
- ETA: 1:07 - loss: 1.3809 - regression_loss: 1.1567 - classification_loss: 0.2242 234/500 [=============>................] - ETA: 1:06 - loss: 1.3809 - regression_loss: 1.1567 - classification_loss: 0.2242 235/500 [=============>................] - ETA: 1:06 - loss: 1.3831 - regression_loss: 1.1588 - classification_loss: 0.2243 236/500 [=============>................] - ETA: 1:06 - loss: 1.3833 - regression_loss: 1.1589 - classification_loss: 0.2244 237/500 [=============>................] - ETA: 1:06 - loss: 1.3857 - regression_loss: 1.1608 - classification_loss: 0.2248 238/500 [=============>................] - ETA: 1:05 - loss: 1.3833 - regression_loss: 1.1590 - classification_loss: 0.2243 239/500 [=============>................] - ETA: 1:05 - loss: 1.3819 - regression_loss: 1.1581 - classification_loss: 0.2238 240/500 [=============>................] - ETA: 1:05 - loss: 1.3812 - regression_loss: 1.1574 - classification_loss: 0.2237 241/500 [=============>................] - ETA: 1:05 - loss: 1.3793 - regression_loss: 1.1559 - classification_loss: 0.2235 242/500 [=============>................] - ETA: 1:04 - loss: 1.3779 - regression_loss: 1.1545 - classification_loss: 0.2235 243/500 [=============>................] - ETA: 1:04 - loss: 1.3777 - regression_loss: 1.1543 - classification_loss: 0.2233 244/500 [=============>................] - ETA: 1:04 - loss: 1.3804 - regression_loss: 1.1564 - classification_loss: 0.2240 245/500 [=============>................] - ETA: 1:04 - loss: 1.3795 - regression_loss: 1.1557 - classification_loss: 0.2238 246/500 [=============>................] - ETA: 1:03 - loss: 1.3809 - regression_loss: 1.1570 - classification_loss: 0.2239 247/500 [=============>................] - ETA: 1:03 - loss: 1.3792 - regression_loss: 1.1559 - classification_loss: 0.2233 248/500 [=============>................] - ETA: 1:03 - loss: 1.3826 - regression_loss: 1.1588 - classification_loss: 0.2238 249/500 [=============>................] 
- ETA: 1:03 - loss: 1.3826 - regression_loss: 1.1589 - classification_loss: 0.2237 250/500 [==============>...............] - ETA: 1:02 - loss: 1.3826 - regression_loss: 1.1590 - classification_loss: 0.2236 251/500 [==============>...............] - ETA: 1:02 - loss: 1.3852 - regression_loss: 1.1616 - classification_loss: 0.2236 252/500 [==============>...............] - ETA: 1:02 - loss: 1.3861 - regression_loss: 1.1624 - classification_loss: 0.2237 253/500 [==============>...............] - ETA: 1:02 - loss: 1.3850 - regression_loss: 1.1615 - classification_loss: 0.2236 254/500 [==============>...............] - ETA: 1:01 - loss: 1.3853 - regression_loss: 1.1618 - classification_loss: 0.2235 255/500 [==============>...............] - ETA: 1:01 - loss: 1.3866 - regression_loss: 1.1628 - classification_loss: 0.2238 256/500 [==============>...............] - ETA: 1:01 - loss: 1.3884 - regression_loss: 1.1643 - classification_loss: 0.2241 257/500 [==============>...............] - ETA: 1:01 - loss: 1.3889 - regression_loss: 1.1646 - classification_loss: 0.2244 258/500 [==============>...............] - ETA: 1:00 - loss: 1.3910 - regression_loss: 1.1662 - classification_loss: 0.2247 259/500 [==============>...............] - ETA: 1:00 - loss: 1.3909 - regression_loss: 1.1665 - classification_loss: 0.2244 260/500 [==============>...............] - ETA: 1:00 - loss: 1.3915 - regression_loss: 1.1669 - classification_loss: 0.2246 261/500 [==============>...............] - ETA: 1:00 - loss: 1.3936 - regression_loss: 1.1687 - classification_loss: 0.2249 262/500 [==============>...............] - ETA: 59s - loss: 1.3911 - regression_loss: 1.1666 - classification_loss: 0.2245  263/500 [==============>...............] - ETA: 59s - loss: 1.3919 - regression_loss: 1.1674 - classification_loss: 0.2245 264/500 [==============>...............] - ETA: 59s - loss: 1.3931 - regression_loss: 1.1685 - classification_loss: 0.2246 265/500 [==============>...............] 
- ETA: 59s - loss: 1.3947 - regression_loss: 1.1699 - classification_loss: 0.2248 266/500 [==============>...............] - ETA: 58s - loss: 1.3926 - regression_loss: 1.1685 - classification_loss: 0.2241 267/500 [===============>..............] - ETA: 58s - loss: 1.3932 - regression_loss: 1.1690 - classification_loss: 0.2242 268/500 [===============>..............] - ETA: 58s - loss: 1.3909 - regression_loss: 1.1674 - classification_loss: 0.2236 269/500 [===============>..............] - ETA: 58s - loss: 1.3918 - regression_loss: 1.1679 - classification_loss: 0.2239 270/500 [===============>..............] - ETA: 57s - loss: 1.3931 - regression_loss: 1.1690 - classification_loss: 0.2241 271/500 [===============>..............] - ETA: 57s - loss: 1.3945 - regression_loss: 1.1702 - classification_loss: 0.2244 272/500 [===============>..............] - ETA: 57s - loss: 1.3930 - regression_loss: 1.1688 - classification_loss: 0.2242 273/500 [===============>..............] - ETA: 57s - loss: 1.3931 - regression_loss: 1.1689 - classification_loss: 0.2242 274/500 [===============>..............] - ETA: 56s - loss: 1.3933 - regression_loss: 1.1690 - classification_loss: 0.2243 275/500 [===============>..............] - ETA: 56s - loss: 1.3932 - regression_loss: 1.1690 - classification_loss: 0.2242 276/500 [===============>..............] - ETA: 56s - loss: 1.3917 - regression_loss: 1.1681 - classification_loss: 0.2236 277/500 [===============>..............] - ETA: 56s - loss: 1.3922 - regression_loss: 1.1685 - classification_loss: 0.2236 278/500 [===============>..............] - ETA: 55s - loss: 1.3928 - regression_loss: 1.1688 - classification_loss: 0.2240 279/500 [===============>..............] - ETA: 55s - loss: 1.3929 - regression_loss: 1.1687 - classification_loss: 0.2242 280/500 [===============>..............] - ETA: 55s - loss: 1.3944 - regression_loss: 1.1699 - classification_loss: 0.2245 281/500 [===============>..............] 
- ETA: 55s - loss: 1.3953 - regression_loss: 1.1707 - classification_loss: 0.2246 282/500 [===============>..............] - ETA: 54s - loss: 1.3957 - regression_loss: 1.1711 - classification_loss: 0.2245 283/500 [===============>..............] - ETA: 54s - loss: 1.3956 - regression_loss: 1.1711 - classification_loss: 0.2246 284/500 [================>.............] - ETA: 54s - loss: 1.3970 - regression_loss: 1.1721 - classification_loss: 0.2249 285/500 [================>.............] - ETA: 54s - loss: 1.3941 - regression_loss: 1.1698 - classification_loss: 0.2243 286/500 [================>.............] - ETA: 53s - loss: 1.3947 - regression_loss: 1.1703 - classification_loss: 0.2244 287/500 [================>.............] - ETA: 53s - loss: 1.3940 - regression_loss: 1.1697 - classification_loss: 0.2242 288/500 [================>.............] - ETA: 53s - loss: 1.3938 - regression_loss: 1.1697 - classification_loss: 0.2242 289/500 [================>.............] - ETA: 53s - loss: 1.3950 - regression_loss: 1.1706 - classification_loss: 0.2244 290/500 [================>.............] - ETA: 52s - loss: 1.3957 - regression_loss: 1.1713 - classification_loss: 0.2244 291/500 [================>.............] - ETA: 52s - loss: 1.3953 - regression_loss: 1.1707 - classification_loss: 0.2245 292/500 [================>.............] - ETA: 52s - loss: 1.3941 - regression_loss: 1.1698 - classification_loss: 0.2242 293/500 [================>.............] - ETA: 52s - loss: 1.3919 - regression_loss: 1.1682 - classification_loss: 0.2237 294/500 [================>.............] - ETA: 51s - loss: 1.3936 - regression_loss: 1.1696 - classification_loss: 0.2241 295/500 [================>.............] - ETA: 51s - loss: 1.3946 - regression_loss: 1.1703 - classification_loss: 0.2243 296/500 [================>.............] - ETA: 51s - loss: 1.3954 - regression_loss: 1.1711 - classification_loss: 0.2243 297/500 [================>.............] 
- ETA: 51s - loss: 1.3955 - regression_loss: 1.1710 - classification_loss: 0.2245 298/500 [================>.............] - ETA: 50s - loss: 1.3930 - regression_loss: 1.1689 - classification_loss: 0.2241 299/500 [================>.............] - ETA: 50s - loss: 1.3954 - regression_loss: 1.1709 - classification_loss: 0.2245 300/500 [=================>............] - ETA: 50s - loss: 1.3963 - regression_loss: 1.1717 - classification_loss: 0.2246 301/500 [=================>............] - ETA: 50s - loss: 1.3973 - regression_loss: 1.1726 - classification_loss: 0.2247 302/500 [=================>............] - ETA: 49s - loss: 1.3979 - regression_loss: 1.1731 - classification_loss: 0.2248 303/500 [=================>............] - ETA: 49s - loss: 1.3988 - regression_loss: 1.1738 - classification_loss: 0.2250 304/500 [=================>............] - ETA: 49s - loss: 1.3992 - regression_loss: 1.1741 - classification_loss: 0.2251 305/500 [=================>............] - ETA: 48s - loss: 1.3992 - regression_loss: 1.1740 - classification_loss: 0.2252 306/500 [=================>............] - ETA: 48s - loss: 1.3981 - regression_loss: 1.1731 - classification_loss: 0.2251 307/500 [=================>............] - ETA: 48s - loss: 1.3966 - regression_loss: 1.1719 - classification_loss: 0.2247 308/500 [=================>............] - ETA: 48s - loss: 1.3980 - regression_loss: 1.1730 - classification_loss: 0.2251 309/500 [=================>............] - ETA: 47s - loss: 1.3984 - regression_loss: 1.1737 - classification_loss: 0.2247 310/500 [=================>............] - ETA: 47s - loss: 1.3979 - regression_loss: 1.1732 - classification_loss: 0.2247 311/500 [=================>............] - ETA: 47s - loss: 1.3987 - regression_loss: 1.1742 - classification_loss: 0.2246 312/500 [=================>............] - ETA: 47s - loss: 1.3989 - regression_loss: 1.1743 - classification_loss: 0.2245 313/500 [=================>............] 
- ETA: 46s - loss: 1.4006 - regression_loss: 1.1757 - classification_loss: 0.2249 314/500 [=================>............] - ETA: 46s - loss: 1.4008 - regression_loss: 1.1758 - classification_loss: 0.2250 315/500 [=================>............] - ETA: 46s - loss: 1.3983 - regression_loss: 1.1737 - classification_loss: 0.2246 316/500 [=================>............] - ETA: 46s - loss: 1.3997 - regression_loss: 1.1752 - classification_loss: 0.2245 317/500 [==================>...........] - ETA: 45s - loss: 1.3993 - regression_loss: 1.1748 - classification_loss: 0.2245 318/500 [==================>...........] - ETA: 45s - loss: 1.3994 - regression_loss: 1.1749 - classification_loss: 0.2245 319/500 [==================>...........] - ETA: 45s - loss: 1.3993 - regression_loss: 1.1747 - classification_loss: 0.2245 320/500 [==================>...........] - ETA: 45s - loss: 1.3967 - regression_loss: 1.1724 - classification_loss: 0.2243 321/500 [==================>...........] - ETA: 44s - loss: 1.3961 - regression_loss: 1.1720 - classification_loss: 0.2241 322/500 [==================>...........] - ETA: 44s - loss: 1.3973 - regression_loss: 1.1729 - classification_loss: 0.2244 323/500 [==================>...........] - ETA: 44s - loss: 1.3975 - regression_loss: 1.1734 - classification_loss: 0.2241 324/500 [==================>...........] - ETA: 44s - loss: 1.3991 - regression_loss: 1.1746 - classification_loss: 0.2244 325/500 [==================>...........] - ETA: 43s - loss: 1.3970 - regression_loss: 1.1729 - classification_loss: 0.2241 326/500 [==================>...........] - ETA: 43s - loss: 1.3977 - regression_loss: 1.1736 - classification_loss: 0.2241 327/500 [==================>...........] - ETA: 43s - loss: 1.3980 - regression_loss: 1.1738 - classification_loss: 0.2242 328/500 [==================>...........] - ETA: 43s - loss: 1.3965 - regression_loss: 1.1726 - classification_loss: 0.2240 329/500 [==================>...........] 
(steps 330–489 of epoch 113: per-batch progress output omitted; total loss fluctuated in the 1.39–1.41 range, with regression_loss ≈ 1.17–1.18 and classification_loss ≈ 0.22)
(steps 490–499 of epoch 113: per-batch progress output omitted)
500/500 [==============================] - 125s 251ms/step - loss: 1.4043 - regression_loss: 1.1799 - classification_loss: 0.2243
1172 instances of class plum with average precision: 0.6754
mAP: 0.6754
Epoch 00113: saving model to ./training/snapshots/resnet50_pascal_113.h5
Epoch 114/150
(steps 1–3 of epoch 114: per-batch progress output omitted; loss climbed from 0.75 at step 1 to ≈ 1.40 by step 3)
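The logged totals appear to be the sum of the two component losses. A minimal sanity check, assuming (as the log lines suggest, but the log itself does not state) that keras-retinanet's reported `loss` is `regression_loss + classification_loss`, using the epoch-113 final values:

```python
# Epoch 113 final values as reported in the log above.
final = {"loss": 1.4043, "regression_loss": 1.1799, "classification_loss": 0.2243}

# Assumption: total loss is the sum of the regression and classification terms.
total = final["regression_loss"] + final["classification_loss"]

# 1.1799 + 0.2243 = 1.4042, matching the logged 1.4043 up to display rounding.
assert abs(total - final["loss"]) < 5e-4
```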
(steps 4–164 of epoch 114: per-batch progress output omitted; loss peaked near 1.62 around step 9 and settled to ≈ 1.43–1.44 by step 163, with classification_loss ≈ 0.23)
- ETA: 1:24 - loss: 1.4330 - regression_loss: 1.2001 - classification_loss: 0.2330 165/500 [========>.....................] - ETA: 1:24 - loss: 1.4337 - regression_loss: 1.2002 - classification_loss: 0.2335 166/500 [========>.....................] - ETA: 1:23 - loss: 1.4315 - regression_loss: 1.1986 - classification_loss: 0.2329 167/500 [=========>....................] - ETA: 1:23 - loss: 1.4300 - regression_loss: 1.1981 - classification_loss: 0.2320 168/500 [=========>....................] - ETA: 1:23 - loss: 1.4318 - regression_loss: 1.1995 - classification_loss: 0.2323 169/500 [=========>....................] - ETA: 1:23 - loss: 1.4357 - regression_loss: 1.2017 - classification_loss: 0.2340 170/500 [=========>....................] - ETA: 1:22 - loss: 1.4330 - regression_loss: 1.1997 - classification_loss: 0.2333 171/500 [=========>....................] - ETA: 1:22 - loss: 1.4342 - regression_loss: 1.2009 - classification_loss: 0.2333 172/500 [=========>....................] - ETA: 1:22 - loss: 1.4339 - regression_loss: 1.2007 - classification_loss: 0.2332 173/500 [=========>....................] - ETA: 1:22 - loss: 1.4367 - regression_loss: 1.2027 - classification_loss: 0.2341 174/500 [=========>....................] - ETA: 1:21 - loss: 1.4362 - regression_loss: 1.2022 - classification_loss: 0.2340 175/500 [=========>....................] - ETA: 1:21 - loss: 1.4372 - regression_loss: 1.2032 - classification_loss: 0.2340 176/500 [=========>....................] - ETA: 1:21 - loss: 1.4387 - regression_loss: 1.2046 - classification_loss: 0.2341 177/500 [=========>....................] - ETA: 1:20 - loss: 1.4369 - regression_loss: 1.2034 - classification_loss: 0.2335 178/500 [=========>....................] - ETA: 1:20 - loss: 1.4374 - regression_loss: 1.2039 - classification_loss: 0.2335 179/500 [=========>....................] - ETA: 1:20 - loss: 1.4366 - regression_loss: 1.2038 - classification_loss: 0.2328 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.4393 - regression_loss: 1.2059 - classification_loss: 0.2334 181/500 [=========>....................] - ETA: 1:19 - loss: 1.4376 - regression_loss: 1.2043 - classification_loss: 0.2333 182/500 [=========>....................] - ETA: 1:19 - loss: 1.4367 - regression_loss: 1.2035 - classification_loss: 0.2332 183/500 [=========>....................] - ETA: 1:19 - loss: 1.4321 - regression_loss: 1.1996 - classification_loss: 0.2325 184/500 [==========>...................] - ETA: 1:19 - loss: 1.4313 - regression_loss: 1.1993 - classification_loss: 0.2320 185/500 [==========>...................] - ETA: 1:19 - loss: 1.4300 - regression_loss: 1.1984 - classification_loss: 0.2315 186/500 [==========>...................] - ETA: 1:18 - loss: 1.4317 - regression_loss: 1.1998 - classification_loss: 0.2319 187/500 [==========>...................] - ETA: 1:18 - loss: 1.4306 - regression_loss: 1.1987 - classification_loss: 0.2319 188/500 [==========>...................] - ETA: 1:18 - loss: 1.4309 - regression_loss: 1.1991 - classification_loss: 0.2318 189/500 [==========>...................] - ETA: 1:18 - loss: 1.4321 - regression_loss: 1.1998 - classification_loss: 0.2323 190/500 [==========>...................] - ETA: 1:17 - loss: 1.4330 - regression_loss: 1.2007 - classification_loss: 0.2323 191/500 [==========>...................] - ETA: 1:17 - loss: 1.4370 - regression_loss: 1.2036 - classification_loss: 0.2334 192/500 [==========>...................] - ETA: 1:17 - loss: 1.4383 - regression_loss: 1.2047 - classification_loss: 0.2336 193/500 [==========>...................] - ETA: 1:17 - loss: 1.4336 - regression_loss: 1.2008 - classification_loss: 0.2329 194/500 [==========>...................] - ETA: 1:16 - loss: 1.4339 - regression_loss: 1.2009 - classification_loss: 0.2330 195/500 [==========>...................] - ETA: 1:16 - loss: 1.4360 - regression_loss: 1.2026 - classification_loss: 0.2334 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.4329 - regression_loss: 1.2002 - classification_loss: 0.2327 197/500 [==========>...................] - ETA: 1:15 - loss: 1.4289 - regression_loss: 1.1971 - classification_loss: 0.2319 198/500 [==========>...................] - ETA: 1:15 - loss: 1.4308 - regression_loss: 1.1985 - classification_loss: 0.2324 199/500 [==========>...................] - ETA: 1:15 - loss: 1.4322 - regression_loss: 1.1995 - classification_loss: 0.2327 200/500 [===========>..................] - ETA: 1:15 - loss: 1.4331 - regression_loss: 1.2001 - classification_loss: 0.2330 201/500 [===========>..................] - ETA: 1:14 - loss: 1.4291 - regression_loss: 1.1969 - classification_loss: 0.2322 202/500 [===========>..................] - ETA: 1:14 - loss: 1.4269 - regression_loss: 1.1953 - classification_loss: 0.2316 203/500 [===========>..................] - ETA: 1:14 - loss: 1.4283 - regression_loss: 1.1964 - classification_loss: 0.2319 204/500 [===========>..................] - ETA: 1:14 - loss: 1.4257 - regression_loss: 1.1940 - classification_loss: 0.2317 205/500 [===========>..................] - ETA: 1:13 - loss: 1.4266 - regression_loss: 1.1949 - classification_loss: 0.2317 206/500 [===========>..................] - ETA: 1:13 - loss: 1.4256 - regression_loss: 1.1942 - classification_loss: 0.2314 207/500 [===========>..................] - ETA: 1:13 - loss: 1.4286 - regression_loss: 1.1964 - classification_loss: 0.2322 208/500 [===========>..................] - ETA: 1:13 - loss: 1.4273 - regression_loss: 1.1953 - classification_loss: 0.2320 209/500 [===========>..................] - ETA: 1:12 - loss: 1.4293 - regression_loss: 1.1967 - classification_loss: 0.2326 210/500 [===========>..................] - ETA: 1:12 - loss: 1.4275 - regression_loss: 1.1950 - classification_loss: 0.2325 211/500 [===========>..................] - ETA: 1:12 - loss: 1.4299 - regression_loss: 1.1971 - classification_loss: 0.2328 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.4294 - regression_loss: 1.1972 - classification_loss: 0.2322 213/500 [===========>..................] - ETA: 1:11 - loss: 1.4312 - regression_loss: 1.1988 - classification_loss: 0.2324 214/500 [===========>..................] - ETA: 1:11 - loss: 1.4309 - regression_loss: 1.1985 - classification_loss: 0.2324 215/500 [===========>..................] - ETA: 1:11 - loss: 1.4337 - regression_loss: 1.2007 - classification_loss: 0.2330 216/500 [===========>..................] - ETA: 1:11 - loss: 1.4335 - regression_loss: 1.2007 - classification_loss: 0.2328 217/500 [============>.................] - ETA: 1:10 - loss: 1.4346 - regression_loss: 1.2016 - classification_loss: 0.2330 218/500 [============>.................] - ETA: 1:10 - loss: 1.4339 - regression_loss: 1.2012 - classification_loss: 0.2327 219/500 [============>.................] - ETA: 1:10 - loss: 1.4366 - regression_loss: 1.2027 - classification_loss: 0.2339 220/500 [============>.................] - ETA: 1:10 - loss: 1.4378 - regression_loss: 1.2037 - classification_loss: 0.2340 221/500 [============>.................] - ETA: 1:09 - loss: 1.4386 - regression_loss: 1.2043 - classification_loss: 0.2342 222/500 [============>.................] - ETA: 1:09 - loss: 1.4382 - regression_loss: 1.2041 - classification_loss: 0.2341 223/500 [============>.................] - ETA: 1:09 - loss: 1.4380 - regression_loss: 1.2043 - classification_loss: 0.2337 224/500 [============>.................] - ETA: 1:09 - loss: 1.4384 - regression_loss: 1.2047 - classification_loss: 0.2336 225/500 [============>.................] - ETA: 1:08 - loss: 1.4387 - regression_loss: 1.2048 - classification_loss: 0.2339 226/500 [============>.................] - ETA: 1:08 - loss: 1.4369 - regression_loss: 1.2032 - classification_loss: 0.2337 227/500 [============>.................] - ETA: 1:08 - loss: 1.4372 - regression_loss: 1.2035 - classification_loss: 0.2336 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.4355 - regression_loss: 1.2023 - classification_loss: 0.2332 229/500 [============>.................] - ETA: 1:07 - loss: 1.4362 - regression_loss: 1.2030 - classification_loss: 0.2332 230/500 [============>.................] - ETA: 1:07 - loss: 1.4360 - regression_loss: 1.2028 - classification_loss: 0.2332 231/500 [============>.................] - ETA: 1:07 - loss: 1.4376 - regression_loss: 1.2038 - classification_loss: 0.2338 232/500 [============>.................] - ETA: 1:07 - loss: 1.4360 - regression_loss: 1.2026 - classification_loss: 0.2334 233/500 [============>.................] - ETA: 1:06 - loss: 1.4364 - regression_loss: 1.2030 - classification_loss: 0.2333 234/500 [=============>................] - ETA: 1:06 - loss: 1.4342 - regression_loss: 1.2014 - classification_loss: 0.2329 235/500 [=============>................] - ETA: 1:06 - loss: 1.4346 - regression_loss: 1.2017 - classification_loss: 0.2329 236/500 [=============>................] - ETA: 1:06 - loss: 1.4374 - regression_loss: 1.2040 - classification_loss: 0.2334 237/500 [=============>................] - ETA: 1:05 - loss: 1.4387 - regression_loss: 1.2050 - classification_loss: 0.2337 238/500 [=============>................] - ETA: 1:05 - loss: 1.4387 - regression_loss: 1.2049 - classification_loss: 0.2338 239/500 [=============>................] - ETA: 1:05 - loss: 1.4379 - regression_loss: 1.2041 - classification_loss: 0.2337 240/500 [=============>................] - ETA: 1:05 - loss: 1.4376 - regression_loss: 1.2041 - classification_loss: 0.2335 241/500 [=============>................] - ETA: 1:04 - loss: 1.4401 - regression_loss: 1.2057 - classification_loss: 0.2344 242/500 [=============>................] - ETA: 1:04 - loss: 1.4404 - regression_loss: 1.2061 - classification_loss: 0.2343 243/500 [=============>................] - ETA: 1:04 - loss: 1.4372 - regression_loss: 1.2033 - classification_loss: 0.2339 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.4367 - regression_loss: 1.2030 - classification_loss: 0.2337 245/500 [=============>................] - ETA: 1:03 - loss: 1.4383 - regression_loss: 1.2044 - classification_loss: 0.2339 246/500 [=============>................] - ETA: 1:03 - loss: 1.4368 - regression_loss: 1.2033 - classification_loss: 0.2335 247/500 [=============>................] - ETA: 1:03 - loss: 1.4336 - regression_loss: 1.2007 - classification_loss: 0.2329 248/500 [=============>................] - ETA: 1:03 - loss: 1.4313 - regression_loss: 1.1987 - classification_loss: 0.2325 249/500 [=============>................] - ETA: 1:02 - loss: 1.4319 - regression_loss: 1.1991 - classification_loss: 0.2328 250/500 [==============>...............] - ETA: 1:02 - loss: 1.4314 - regression_loss: 1.1986 - classification_loss: 0.2328 251/500 [==============>...............] - ETA: 1:02 - loss: 1.4291 - regression_loss: 1.1967 - classification_loss: 0.2324 252/500 [==============>...............] - ETA: 1:02 - loss: 1.4293 - regression_loss: 1.1964 - classification_loss: 0.2329 253/500 [==============>...............] - ETA: 1:01 - loss: 1.4309 - regression_loss: 1.1981 - classification_loss: 0.2328 254/500 [==============>...............] - ETA: 1:01 - loss: 1.4288 - regression_loss: 1.1966 - classification_loss: 0.2322 255/500 [==============>...............] - ETA: 1:01 - loss: 1.4273 - regression_loss: 1.1955 - classification_loss: 0.2317 256/500 [==============>...............] - ETA: 1:01 - loss: 1.4278 - regression_loss: 1.1960 - classification_loss: 0.2318 257/500 [==============>...............] - ETA: 1:00 - loss: 1.4280 - regression_loss: 1.1961 - classification_loss: 0.2319 258/500 [==============>...............] - ETA: 1:00 - loss: 1.4270 - regression_loss: 1.1953 - classification_loss: 0.2317 259/500 [==============>...............] - ETA: 1:00 - loss: 1.4283 - regression_loss: 1.1963 - classification_loss: 0.2320 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.4272 - regression_loss: 1.1956 - classification_loss: 0.2316 261/500 [==============>...............] - ETA: 59s - loss: 1.4266 - regression_loss: 1.1953 - classification_loss: 0.2313  262/500 [==============>...............] - ETA: 59s - loss: 1.4251 - regression_loss: 1.1941 - classification_loss: 0.2311 263/500 [==============>...............] - ETA: 59s - loss: 1.4274 - regression_loss: 1.1955 - classification_loss: 0.2319 264/500 [==============>...............] - ETA: 59s - loss: 1.4250 - regression_loss: 1.1938 - classification_loss: 0.2312 265/500 [==============>...............] - ETA: 58s - loss: 1.4250 - regression_loss: 1.1936 - classification_loss: 0.2313 266/500 [==============>...............] - ETA: 58s - loss: 1.4275 - regression_loss: 1.1958 - classification_loss: 0.2316 267/500 [===============>..............] - ETA: 58s - loss: 1.4271 - regression_loss: 1.1952 - classification_loss: 0.2318 268/500 [===============>..............] - ETA: 58s - loss: 1.4273 - regression_loss: 1.1954 - classification_loss: 0.2319 269/500 [===============>..............] - ETA: 57s - loss: 1.4270 - regression_loss: 1.1949 - classification_loss: 0.2322 270/500 [===============>..............] - ETA: 57s - loss: 1.4232 - regression_loss: 1.1918 - classification_loss: 0.2314 271/500 [===============>..............] - ETA: 57s - loss: 1.4249 - regression_loss: 1.1933 - classification_loss: 0.2316 272/500 [===============>..............] - ETA: 57s - loss: 1.4242 - regression_loss: 1.1929 - classification_loss: 0.2313 273/500 [===============>..............] - ETA: 56s - loss: 1.4204 - regression_loss: 1.1896 - classification_loss: 0.2308 274/500 [===============>..............] - ETA: 56s - loss: 1.4183 - regression_loss: 1.1880 - classification_loss: 0.2303 275/500 [===============>..............] - ETA: 56s - loss: 1.4163 - regression_loss: 1.1864 - classification_loss: 0.2299 276/500 [===============>..............] 
- ETA: 56s - loss: 1.4171 - regression_loss: 1.1871 - classification_loss: 0.2300 277/500 [===============>..............] - ETA: 55s - loss: 1.4183 - regression_loss: 1.1882 - classification_loss: 0.2301 278/500 [===============>..............] - ETA: 55s - loss: 1.4195 - regression_loss: 1.1892 - classification_loss: 0.2302 279/500 [===============>..............] - ETA: 55s - loss: 1.4199 - regression_loss: 1.1898 - classification_loss: 0.2300 280/500 [===============>..............] - ETA: 55s - loss: 1.4174 - regression_loss: 1.1878 - classification_loss: 0.2295 281/500 [===============>..............] - ETA: 54s - loss: 1.4156 - regression_loss: 1.1863 - classification_loss: 0.2293 282/500 [===============>..............] - ETA: 54s - loss: 1.4144 - regression_loss: 1.1854 - classification_loss: 0.2290 283/500 [===============>..............] - ETA: 54s - loss: 1.4132 - regression_loss: 1.1845 - classification_loss: 0.2286 284/500 [================>.............] - ETA: 54s - loss: 1.4126 - regression_loss: 1.1844 - classification_loss: 0.2283 285/500 [================>.............] - ETA: 53s - loss: 1.4125 - regression_loss: 1.1844 - classification_loss: 0.2281 286/500 [================>.............] - ETA: 53s - loss: 1.4108 - regression_loss: 1.1830 - classification_loss: 0.2278 287/500 [================>.............] - ETA: 53s - loss: 1.4110 - regression_loss: 1.1829 - classification_loss: 0.2281 288/500 [================>.............] - ETA: 53s - loss: 1.4114 - regression_loss: 1.1832 - classification_loss: 0.2282 289/500 [================>.............] - ETA: 52s - loss: 1.4088 - regression_loss: 1.1811 - classification_loss: 0.2277 290/500 [================>.............] - ETA: 52s - loss: 1.4106 - regression_loss: 1.1823 - classification_loss: 0.2282 291/500 [================>.............] - ETA: 52s - loss: 1.4130 - regression_loss: 1.1843 - classification_loss: 0.2287 292/500 [================>.............] 
- ETA: 52s - loss: 1.4133 - regression_loss: 1.1845 - classification_loss: 0.2288 293/500 [================>.............] - ETA: 51s - loss: 1.4138 - regression_loss: 1.1852 - classification_loss: 0.2286 294/500 [================>.............] - ETA: 51s - loss: 1.4139 - regression_loss: 1.1853 - classification_loss: 0.2286 295/500 [================>.............] - ETA: 51s - loss: 1.4121 - regression_loss: 1.1839 - classification_loss: 0.2282 296/500 [================>.............] - ETA: 51s - loss: 1.4130 - regression_loss: 1.1847 - classification_loss: 0.2283 297/500 [================>.............] - ETA: 50s - loss: 1.4133 - regression_loss: 1.1851 - classification_loss: 0.2282 298/500 [================>.............] - ETA: 50s - loss: 1.4121 - regression_loss: 1.1840 - classification_loss: 0.2281 299/500 [================>.............] - ETA: 50s - loss: 1.4141 - regression_loss: 1.1856 - classification_loss: 0.2286 300/500 [=================>............] - ETA: 50s - loss: 1.4144 - regression_loss: 1.1859 - classification_loss: 0.2286 301/500 [=================>............] - ETA: 49s - loss: 1.4162 - regression_loss: 1.1874 - classification_loss: 0.2288 302/500 [=================>............] - ETA: 49s - loss: 1.4181 - regression_loss: 1.1888 - classification_loss: 0.2293 303/500 [=================>............] - ETA: 49s - loss: 1.4185 - regression_loss: 1.1892 - classification_loss: 0.2293 304/500 [=================>............] - ETA: 49s - loss: 1.4226 - regression_loss: 1.1928 - classification_loss: 0.2298 305/500 [=================>............] - ETA: 48s - loss: 1.4215 - regression_loss: 1.1921 - classification_loss: 0.2294 306/500 [=================>............] - ETA: 48s - loss: 1.4220 - regression_loss: 1.1924 - classification_loss: 0.2295 307/500 [=================>............] - ETA: 48s - loss: 1.4231 - regression_loss: 1.1931 - classification_loss: 0.2300 308/500 [=================>............] 
- ETA: 48s - loss: 1.4206 - regression_loss: 1.1911 - classification_loss: 0.2294 309/500 [=================>............] - ETA: 47s - loss: 1.4199 - regression_loss: 1.1907 - classification_loss: 0.2293 310/500 [=================>............] - ETA: 47s - loss: 1.4208 - regression_loss: 1.1915 - classification_loss: 0.2293 311/500 [=================>............] - ETA: 47s - loss: 1.4181 - regression_loss: 1.1894 - classification_loss: 0.2288 312/500 [=================>............] - ETA: 47s - loss: 1.4168 - regression_loss: 1.1884 - classification_loss: 0.2284 313/500 [=================>............] - ETA: 46s - loss: 1.4162 - regression_loss: 1.1878 - classification_loss: 0.2284 314/500 [=================>............] - ETA: 46s - loss: 1.4170 - regression_loss: 1.1883 - classification_loss: 0.2287 315/500 [=================>............] - ETA: 46s - loss: 1.4172 - regression_loss: 1.1883 - classification_loss: 0.2289 316/500 [=================>............] - ETA: 46s - loss: 1.4171 - regression_loss: 1.1883 - classification_loss: 0.2288 317/500 [==================>...........] - ETA: 45s - loss: 1.4144 - regression_loss: 1.1861 - classification_loss: 0.2283 318/500 [==================>...........] - ETA: 45s - loss: 1.4145 - regression_loss: 1.1862 - classification_loss: 0.2283 319/500 [==================>...........] - ETA: 45s - loss: 1.4155 - regression_loss: 1.1871 - classification_loss: 0.2283 320/500 [==================>...........] - ETA: 45s - loss: 1.4160 - regression_loss: 1.1874 - classification_loss: 0.2286 321/500 [==================>...........] - ETA: 44s - loss: 1.4161 - regression_loss: 1.1876 - classification_loss: 0.2285 322/500 [==================>...........] - ETA: 44s - loss: 1.4171 - regression_loss: 1.1885 - classification_loss: 0.2286 323/500 [==================>...........] - ETA: 44s - loss: 1.4150 - regression_loss: 1.1869 - classification_loss: 0.2280 324/500 [==================>...........] 
- ETA: 44s - loss: 1.4151 - regression_loss: 1.1871 - classification_loss: 0.2280 325/500 [==================>...........] - ETA: 43s - loss: 1.4146 - regression_loss: 1.1868 - classification_loss: 0.2278 326/500 [==================>...........] - ETA: 43s - loss: 1.4147 - regression_loss: 1.1868 - classification_loss: 0.2279 327/500 [==================>...........] - ETA: 43s - loss: 1.4131 - regression_loss: 1.1855 - classification_loss: 0.2276 328/500 [==================>...........] - ETA: 43s - loss: 1.4142 - regression_loss: 1.1863 - classification_loss: 0.2278 329/500 [==================>...........] - ETA: 42s - loss: 1.4142 - regression_loss: 1.1866 - classification_loss: 0.2276 330/500 [==================>...........] - ETA: 42s - loss: 1.4142 - regression_loss: 1.1865 - classification_loss: 0.2276 331/500 [==================>...........] - ETA: 42s - loss: 1.4160 - regression_loss: 1.1881 - classification_loss: 0.2278 332/500 [==================>...........] - ETA: 42s - loss: 1.4151 - regression_loss: 1.1875 - classification_loss: 0.2276 333/500 [==================>...........] - ETA: 41s - loss: 1.4148 - regression_loss: 1.1872 - classification_loss: 0.2275 334/500 [===================>..........] - ETA: 41s - loss: 1.4177 - regression_loss: 1.1893 - classification_loss: 0.2285 335/500 [===================>..........] - ETA: 41s - loss: 1.4180 - regression_loss: 1.1897 - classification_loss: 0.2283 336/500 [===================>..........] - ETA: 41s - loss: 1.4194 - regression_loss: 1.1905 - classification_loss: 0.2289 337/500 [===================>..........] - ETA: 40s - loss: 1.4200 - regression_loss: 1.1909 - classification_loss: 0.2291 338/500 [===================>..........] - ETA: 40s - loss: 1.4192 - regression_loss: 1.1903 - classification_loss: 0.2289 339/500 [===================>..........] - ETA: 40s - loss: 1.4163 - regression_loss: 1.1879 - classification_loss: 0.2285 340/500 [===================>..........] 
- ETA: 40s - loss: 1.4140 - regression_loss: 1.1860 - classification_loss: 0.2280 341/500 [===================>..........] - ETA: 39s - loss: 1.4142 - regression_loss: 1.1863 - classification_loss: 0.2279 342/500 [===================>..........] - ETA: 39s - loss: 1.4152 - regression_loss: 1.1872 - classification_loss: 0.2280 343/500 [===================>..........] - ETA: 39s - loss: 1.4124 - regression_loss: 1.1849 - classification_loss: 0.2275 344/500 [===================>..........] - ETA: 39s - loss: 1.4117 - regression_loss: 1.1845 - classification_loss: 0.2272 345/500 [===================>..........] - ETA: 38s - loss: 1.4130 - regression_loss: 1.1856 - classification_loss: 0.2275 346/500 [===================>..........] - ETA: 38s - loss: 1.4139 - regression_loss: 1.1864 - classification_loss: 0.2275 347/500 [===================>..........] - ETA: 38s - loss: 1.4129 - regression_loss: 1.1855 - classification_loss: 0.2274 348/500 [===================>..........] - ETA: 38s - loss: 1.4127 - regression_loss: 1.1854 - classification_loss: 0.2272 349/500 [===================>..........] - ETA: 37s - loss: 1.4135 - regression_loss: 1.1859 - classification_loss: 0.2275 350/500 [====================>.........] - ETA: 37s - loss: 1.4140 - regression_loss: 1.1864 - classification_loss: 0.2275 351/500 [====================>.........] - ETA: 37s - loss: 1.4150 - regression_loss: 1.1874 - classification_loss: 0.2277 352/500 [====================>.........] - ETA: 37s - loss: 1.4164 - regression_loss: 1.1885 - classification_loss: 0.2280 353/500 [====================>.........] - ETA: 36s - loss: 1.4156 - regression_loss: 1.1878 - classification_loss: 0.2278 354/500 [====================>.........] - ETA: 36s - loss: 1.4149 - regression_loss: 1.1871 - classification_loss: 0.2278 355/500 [====================>.........] - ETA: 36s - loss: 1.4144 - regression_loss: 1.1869 - classification_loss: 0.2274 356/500 [====================>.........] 
- ETA: 36s - loss: 1.4122 - regression_loss: 1.1851 - classification_loss: 0.2270 357/500 [====================>.........] - ETA: 35s - loss: 1.4126 - regression_loss: 1.1854 - classification_loss: 0.2273 358/500 [====================>.........] - ETA: 35s - loss: 1.4139 - regression_loss: 1.1863 - classification_loss: 0.2276 359/500 [====================>.........] - ETA: 35s - loss: 1.4140 - regression_loss: 1.1865 - classification_loss: 0.2275 360/500 [====================>.........] - ETA: 35s - loss: 1.4150 - regression_loss: 1.1876 - classification_loss: 0.2274 361/500 [====================>.........] - ETA: 34s - loss: 1.4150 - regression_loss: 1.1876 - classification_loss: 0.2274 362/500 [====================>.........] - ETA: 34s - loss: 1.4125 - regression_loss: 1.1855 - classification_loss: 0.2270 363/500 [====================>.........] - ETA: 34s - loss: 1.4125 - regression_loss: 1.1856 - classification_loss: 0.2270 364/500 [====================>.........] - ETA: 34s - loss: 1.4131 - regression_loss: 1.1860 - classification_loss: 0.2271 365/500 [====================>.........] - ETA: 33s - loss: 1.4139 - regression_loss: 1.1866 - classification_loss: 0.2274 366/500 [====================>.........] - ETA: 33s - loss: 1.4138 - regression_loss: 1.1863 - classification_loss: 0.2275 367/500 [=====================>........] - ETA: 33s - loss: 1.4139 - regression_loss: 1.1864 - classification_loss: 0.2276 368/500 [=====================>........] - ETA: 33s - loss: 1.4127 - regression_loss: 1.1855 - classification_loss: 0.2272 369/500 [=====================>........] - ETA: 32s - loss: 1.4147 - regression_loss: 1.1872 - classification_loss: 0.2275 370/500 [=====================>........] - ETA: 32s - loss: 1.4141 - regression_loss: 1.1864 - classification_loss: 0.2277 371/500 [=====================>........] - ETA: 32s - loss: 1.4164 - regression_loss: 1.1881 - classification_loss: 0.2283 372/500 [=====================>........] 
500/500 [==============================] - 125s 251ms/step - loss: 1.4168 - regression_loss: 1.1881 - classification_loss: 0.2287
1172 instances of class plum with average precision: 0.6727
mAP: 0.6727
Epoch 00114: saving model to ./training/snapshots/resnet50_pascal_114.h5
Epoch 115/150
206/500 [===========>..................]
- ETA: 1:13 - loss: 1.5802 - regression_loss: 1.2013 - classification_loss: 0.3789 207/500 [===========>..................] - ETA: 1:13 - loss: 1.5814 - regression_loss: 1.2027 - classification_loss: 0.3787 208/500 [===========>..................] - ETA: 1:13 - loss: 1.5772 - regression_loss: 1.2001 - classification_loss: 0.3771 209/500 [===========>..................] - ETA: 1:12 - loss: 1.5782 - regression_loss: 1.2013 - classification_loss: 0.3769 210/500 [===========>..................] - ETA: 1:12 - loss: 1.5789 - regression_loss: 1.2025 - classification_loss: 0.3764 211/500 [===========>..................] - ETA: 1:12 - loss: 1.5806 - regression_loss: 1.2044 - classification_loss: 0.3762 212/500 [===========>..................] - ETA: 1:12 - loss: 1.5808 - regression_loss: 1.2056 - classification_loss: 0.3752 213/500 [===========>..................] - ETA: 1:11 - loss: 1.5772 - regression_loss: 1.2034 - classification_loss: 0.3738 214/500 [===========>..................] - ETA: 1:11 - loss: 1.5780 - regression_loss: 1.2046 - classification_loss: 0.3734 215/500 [===========>..................] - ETA: 1:11 - loss: 1.5759 - regression_loss: 1.2034 - classification_loss: 0.3725 216/500 [===========>..................] - ETA: 1:11 - loss: 1.5760 - regression_loss: 1.2040 - classification_loss: 0.3720 217/500 [============>.................] - ETA: 1:10 - loss: 1.5747 - regression_loss: 1.2037 - classification_loss: 0.3710 218/500 [============>.................] - ETA: 1:10 - loss: 1.5773 - regression_loss: 1.2065 - classification_loss: 0.3708 219/500 [============>.................] - ETA: 1:10 - loss: 1.5735 - regression_loss: 1.2038 - classification_loss: 0.3697 220/500 [============>.................] - ETA: 1:10 - loss: 1.5722 - regression_loss: 1.2034 - classification_loss: 0.3688 221/500 [============>.................] - ETA: 1:09 - loss: 1.5721 - regression_loss: 1.2032 - classification_loss: 0.3689 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.5730 - regression_loss: 1.2042 - classification_loss: 0.3688 223/500 [============>.................] - ETA: 1:09 - loss: 1.5722 - regression_loss: 1.2040 - classification_loss: 0.3682 224/500 [============>.................] - ETA: 1:09 - loss: 1.5709 - regression_loss: 1.2036 - classification_loss: 0.3673 225/500 [============>.................] - ETA: 1:08 - loss: 1.5709 - regression_loss: 1.2042 - classification_loss: 0.3667 226/500 [============>.................] - ETA: 1:08 - loss: 1.5725 - regression_loss: 1.2060 - classification_loss: 0.3665 227/500 [============>.................] - ETA: 1:08 - loss: 1.5744 - regression_loss: 1.2080 - classification_loss: 0.3664 228/500 [============>.................] - ETA: 1:08 - loss: 1.5708 - regression_loss: 1.2058 - classification_loss: 0.3650 229/500 [============>.................] - ETA: 1:07 - loss: 1.5716 - regression_loss: 1.2068 - classification_loss: 0.3648 230/500 [============>.................] - ETA: 1:07 - loss: 1.5723 - regression_loss: 1.2068 - classification_loss: 0.3655 231/500 [============>.................] - ETA: 1:07 - loss: 1.5741 - regression_loss: 1.2087 - classification_loss: 0.3654 232/500 [============>.................] - ETA: 1:07 - loss: 1.5743 - regression_loss: 1.2094 - classification_loss: 0.3649 233/500 [============>.................] - ETA: 1:06 - loss: 1.5722 - regression_loss: 1.2080 - classification_loss: 0.3641 234/500 [=============>................] - ETA: 1:06 - loss: 1.5710 - regression_loss: 1.2075 - classification_loss: 0.3634 235/500 [=============>................] - ETA: 1:06 - loss: 1.5677 - regression_loss: 1.2055 - classification_loss: 0.3622 236/500 [=============>................] - ETA: 1:06 - loss: 1.5633 - regression_loss: 1.2023 - classification_loss: 0.3611 237/500 [=============>................] - ETA: 1:05 - loss: 1.5633 - regression_loss: 1.2027 - classification_loss: 0.3606 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.5623 - regression_loss: 1.2025 - classification_loss: 0.3598 239/500 [=============>................] - ETA: 1:05 - loss: 1.5597 - regression_loss: 1.2003 - classification_loss: 0.3593 240/500 [=============>................] - ETA: 1:05 - loss: 1.5592 - regression_loss: 1.2007 - classification_loss: 0.3585 241/500 [=============>................] - ETA: 1:04 - loss: 1.5571 - regression_loss: 1.1997 - classification_loss: 0.3574 242/500 [=============>................] - ETA: 1:04 - loss: 1.5565 - regression_loss: 1.1996 - classification_loss: 0.3569 243/500 [=============>................] - ETA: 1:04 - loss: 1.5531 - regression_loss: 1.1975 - classification_loss: 0.3556 244/500 [=============>................] - ETA: 1:04 - loss: 1.5535 - regression_loss: 1.1981 - classification_loss: 0.3554 245/500 [=============>................] - ETA: 1:03 - loss: 1.5517 - regression_loss: 1.1972 - classification_loss: 0.3545 246/500 [=============>................] - ETA: 1:03 - loss: 1.5497 - regression_loss: 1.1961 - classification_loss: 0.3536 247/500 [=============>................] - ETA: 1:03 - loss: 1.5460 - regression_loss: 1.1937 - classification_loss: 0.3524 248/500 [=============>................] - ETA: 1:03 - loss: 1.5453 - regression_loss: 1.1928 - classification_loss: 0.3525 249/500 [=============>................] - ETA: 1:02 - loss: 1.5446 - regression_loss: 1.1926 - classification_loss: 0.3520 250/500 [==============>...............] - ETA: 1:02 - loss: 1.5447 - regression_loss: 1.1934 - classification_loss: 0.3513 251/500 [==============>...............] - ETA: 1:02 - loss: 1.5459 - regression_loss: 1.1948 - classification_loss: 0.3511 252/500 [==============>...............] - ETA: 1:02 - loss: 1.5446 - regression_loss: 1.1943 - classification_loss: 0.3503 253/500 [==============>...............] - ETA: 1:01 - loss: 1.5431 - regression_loss: 1.1936 - classification_loss: 0.3495 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.5417 - regression_loss: 1.1930 - classification_loss: 0.3488 255/500 [==============>...............] - ETA: 1:01 - loss: 1.5410 - regression_loss: 1.1928 - classification_loss: 0.3482 256/500 [==============>...............] - ETA: 1:01 - loss: 1.5399 - regression_loss: 1.1924 - classification_loss: 0.3475 257/500 [==============>...............] - ETA: 1:00 - loss: 1.5413 - regression_loss: 1.1940 - classification_loss: 0.3473 258/500 [==============>...............] - ETA: 1:00 - loss: 1.5430 - regression_loss: 1.1957 - classification_loss: 0.3473 259/500 [==============>...............] - ETA: 1:00 - loss: 1.5407 - regression_loss: 1.1939 - classification_loss: 0.3468 260/500 [==============>...............] - ETA: 1:00 - loss: 1.5412 - regression_loss: 1.1947 - classification_loss: 0.3465 261/500 [==============>...............] - ETA: 59s - loss: 1.5396 - regression_loss: 1.1937 - classification_loss: 0.3459  262/500 [==============>...............] - ETA: 59s - loss: 1.5397 - regression_loss: 1.1946 - classification_loss: 0.3451 263/500 [==============>...............] - ETA: 59s - loss: 1.5410 - regression_loss: 1.1961 - classification_loss: 0.3449 264/500 [==============>...............] - ETA: 59s - loss: 1.5392 - regression_loss: 1.1952 - classification_loss: 0.3440 265/500 [==============>...............] - ETA: 58s - loss: 1.5386 - regression_loss: 1.1953 - classification_loss: 0.3433 266/500 [==============>...............] - ETA: 58s - loss: 1.5387 - regression_loss: 1.1959 - classification_loss: 0.3428 267/500 [===============>..............] - ETA: 58s - loss: 1.5363 - regression_loss: 1.1944 - classification_loss: 0.3419 268/500 [===============>..............] - ETA: 58s - loss: 1.5343 - regression_loss: 1.1930 - classification_loss: 0.3413 269/500 [===============>..............] - ETA: 57s - loss: 1.5343 - regression_loss: 1.1935 - classification_loss: 0.3408 270/500 [===============>..............] 
- ETA: 57s - loss: 1.5340 - regression_loss: 1.1934 - classification_loss: 0.3406 271/500 [===============>..............] - ETA: 57s - loss: 1.5349 - regression_loss: 1.1946 - classification_loss: 0.3403 272/500 [===============>..............] - ETA: 57s - loss: 1.5328 - regression_loss: 1.1932 - classification_loss: 0.3396 273/500 [===============>..............] - ETA: 56s - loss: 1.5336 - regression_loss: 1.1939 - classification_loss: 0.3398 274/500 [===============>..............] - ETA: 56s - loss: 1.5316 - regression_loss: 1.1926 - classification_loss: 0.3389 275/500 [===============>..............] - ETA: 56s - loss: 1.5296 - regression_loss: 1.1914 - classification_loss: 0.3382 276/500 [===============>..............] - ETA: 56s - loss: 1.5293 - regression_loss: 1.1911 - classification_loss: 0.3383 277/500 [===============>..............] - ETA: 55s - loss: 1.5283 - regression_loss: 1.1906 - classification_loss: 0.3377 278/500 [===============>..............] - ETA: 55s - loss: 1.5255 - regression_loss: 1.1887 - classification_loss: 0.3368 279/500 [===============>..............] - ETA: 55s - loss: 1.5259 - regression_loss: 1.1893 - classification_loss: 0.3366 280/500 [===============>..............] - ETA: 55s - loss: 1.5257 - regression_loss: 1.1896 - classification_loss: 0.3361 281/500 [===============>..............] - ETA: 54s - loss: 1.5249 - regression_loss: 1.1892 - classification_loss: 0.3357 282/500 [===============>..............] - ETA: 54s - loss: 1.5227 - regression_loss: 1.1878 - classification_loss: 0.3349 283/500 [===============>..............] - ETA: 54s - loss: 1.5197 - regression_loss: 1.1857 - classification_loss: 0.3340 284/500 [================>.............] - ETA: 54s - loss: 1.5182 - regression_loss: 1.1847 - classification_loss: 0.3336 285/500 [================>.............] - ETA: 53s - loss: 1.5168 - regression_loss: 1.1836 - classification_loss: 0.3332 286/500 [================>.............] 
- ETA: 53s - loss: 1.5141 - regression_loss: 1.1818 - classification_loss: 0.3323 287/500 [================>.............] - ETA: 53s - loss: 1.5131 - regression_loss: 1.1814 - classification_loss: 0.3317 288/500 [================>.............] - ETA: 53s - loss: 1.5131 - regression_loss: 1.1815 - classification_loss: 0.3316 289/500 [================>.............] - ETA: 52s - loss: 1.5098 - regression_loss: 1.1793 - classification_loss: 0.3305 290/500 [================>.............] - ETA: 52s - loss: 1.5064 - regression_loss: 1.1768 - classification_loss: 0.3296 291/500 [================>.............] - ETA: 52s - loss: 1.5069 - regression_loss: 1.1774 - classification_loss: 0.3294 292/500 [================>.............] - ETA: 52s - loss: 1.5073 - regression_loss: 1.1782 - classification_loss: 0.3291 293/500 [================>.............] - ETA: 51s - loss: 1.5084 - regression_loss: 1.1794 - classification_loss: 0.3291 294/500 [================>.............] - ETA: 51s - loss: 1.5059 - regression_loss: 1.1778 - classification_loss: 0.3281 295/500 [================>.............] - ETA: 51s - loss: 1.5058 - regression_loss: 1.1780 - classification_loss: 0.3278 296/500 [================>.............] - ETA: 51s - loss: 1.5055 - regression_loss: 1.1781 - classification_loss: 0.3274 297/500 [================>.............] - ETA: 50s - loss: 1.5060 - regression_loss: 1.1790 - classification_loss: 0.3271 298/500 [================>.............] - ETA: 50s - loss: 1.5054 - regression_loss: 1.1789 - classification_loss: 0.3265 299/500 [================>.............] - ETA: 50s - loss: 1.5059 - regression_loss: 1.1796 - classification_loss: 0.3263 300/500 [=================>............] - ETA: 50s - loss: 1.5054 - regression_loss: 1.1796 - classification_loss: 0.3258 301/500 [=================>............] - ETA: 49s - loss: 1.5056 - regression_loss: 1.1801 - classification_loss: 0.3254 302/500 [=================>............] 
- ETA: 49s - loss: 1.5047 - regression_loss: 1.1798 - classification_loss: 0.3250 303/500 [=================>............] - ETA: 49s - loss: 1.5038 - regression_loss: 1.1793 - classification_loss: 0.3245 304/500 [=================>............] - ETA: 49s - loss: 1.5042 - regression_loss: 1.1796 - classification_loss: 0.3245 305/500 [=================>............] - ETA: 48s - loss: 1.5050 - regression_loss: 1.1806 - classification_loss: 0.3244 306/500 [=================>............] - ETA: 48s - loss: 1.5047 - regression_loss: 1.1806 - classification_loss: 0.3241 307/500 [=================>............] - ETA: 48s - loss: 1.5042 - regression_loss: 1.1806 - classification_loss: 0.3236 308/500 [=================>............] - ETA: 48s - loss: 1.5033 - regression_loss: 1.1800 - classification_loss: 0.3233 309/500 [=================>............] - ETA: 47s - loss: 1.5019 - regression_loss: 1.1793 - classification_loss: 0.3226 310/500 [=================>............] - ETA: 47s - loss: 1.5017 - regression_loss: 1.1794 - classification_loss: 0.3224 311/500 [=================>............] - ETA: 47s - loss: 1.4991 - regression_loss: 1.1775 - classification_loss: 0.3216 312/500 [=================>............] - ETA: 47s - loss: 1.4966 - regression_loss: 1.1759 - classification_loss: 0.3208 313/500 [=================>............] - ETA: 46s - loss: 1.4965 - regression_loss: 1.1761 - classification_loss: 0.3204 314/500 [=================>............] - ETA: 46s - loss: 1.4957 - regression_loss: 1.1759 - classification_loss: 0.3198 315/500 [=================>............] - ETA: 46s - loss: 1.4929 - regression_loss: 1.1740 - classification_loss: 0.3189 316/500 [=================>............] - ETA: 46s - loss: 1.4909 - regression_loss: 1.1726 - classification_loss: 0.3183 317/500 [==================>...........] - ETA: 45s - loss: 1.4906 - regression_loss: 1.1724 - classification_loss: 0.3183 318/500 [==================>...........] 
- ETA: 45s - loss: 1.4904 - regression_loss: 1.1725 - classification_loss: 0.3179 319/500 [==================>...........] - ETA: 45s - loss: 1.4886 - regression_loss: 1.1714 - classification_loss: 0.3173 320/500 [==================>...........] - ETA: 45s - loss: 1.4890 - regression_loss: 1.1720 - classification_loss: 0.3170 321/500 [==================>...........] - ETA: 44s - loss: 1.4881 - regression_loss: 1.1716 - classification_loss: 0.3165 322/500 [==================>...........] - ETA: 44s - loss: 1.4874 - regression_loss: 1.1714 - classification_loss: 0.3161 323/500 [==================>...........] - ETA: 44s - loss: 1.4883 - regression_loss: 1.1723 - classification_loss: 0.3160 324/500 [==================>...........] - ETA: 44s - loss: 1.4877 - regression_loss: 1.1722 - classification_loss: 0.3155 325/500 [==================>...........] - ETA: 43s - loss: 1.4866 - regression_loss: 1.1718 - classification_loss: 0.3148 326/500 [==================>...........] - ETA: 43s - loss: 1.4856 - regression_loss: 1.1713 - classification_loss: 0.3143 327/500 [==================>...........] - ETA: 43s - loss: 1.4856 - regression_loss: 1.1713 - classification_loss: 0.3143 328/500 [==================>...........] - ETA: 43s - loss: 1.4877 - regression_loss: 1.1729 - classification_loss: 0.3147 329/500 [==================>...........] - ETA: 42s - loss: 1.4891 - regression_loss: 1.1743 - classification_loss: 0.3148 330/500 [==================>...........] - ETA: 42s - loss: 1.4883 - regression_loss: 1.1740 - classification_loss: 0.3143 331/500 [==================>...........] - ETA: 42s - loss: 1.4889 - regression_loss: 1.1748 - classification_loss: 0.3141 332/500 [==================>...........] - ETA: 42s - loss: 1.4894 - regression_loss: 1.1756 - classification_loss: 0.3139 333/500 [==================>...........] - ETA: 41s - loss: 1.4901 - regression_loss: 1.1763 - classification_loss: 0.3138 334/500 [===================>..........] 
- ETA: 41s - loss: 1.4909 - regression_loss: 1.1773 - classification_loss: 0.3136 335/500 [===================>..........] - ETA: 41s - loss: 1.4907 - regression_loss: 1.1774 - classification_loss: 0.3133 336/500 [===================>..........] - ETA: 41s - loss: 1.4917 - regression_loss: 1.1785 - classification_loss: 0.3132 337/500 [===================>..........] - ETA: 40s - loss: 1.4914 - regression_loss: 1.1785 - classification_loss: 0.3129 338/500 [===================>..........] - ETA: 40s - loss: 1.4915 - regression_loss: 1.1787 - classification_loss: 0.3127 339/500 [===================>..........] - ETA: 40s - loss: 1.4917 - regression_loss: 1.1787 - classification_loss: 0.3130 340/500 [===================>..........] - ETA: 40s - loss: 1.4921 - regression_loss: 1.1794 - classification_loss: 0.3127 341/500 [===================>..........] - ETA: 39s - loss: 1.4924 - regression_loss: 1.1797 - classification_loss: 0.3127 342/500 [===================>..........] - ETA: 39s - loss: 1.4902 - regression_loss: 1.1778 - classification_loss: 0.3124 343/500 [===================>..........] - ETA: 39s - loss: 1.4895 - regression_loss: 1.1775 - classification_loss: 0.3120 344/500 [===================>..........] - ETA: 39s - loss: 1.4909 - regression_loss: 1.1787 - classification_loss: 0.3122 345/500 [===================>..........] - ETA: 38s - loss: 1.4920 - regression_loss: 1.1795 - classification_loss: 0.3125 346/500 [===================>..........] - ETA: 38s - loss: 1.4913 - regression_loss: 1.1793 - classification_loss: 0.3121 347/500 [===================>..........] - ETA: 38s - loss: 1.4906 - regression_loss: 1.1789 - classification_loss: 0.3117 348/500 [===================>..........] - ETA: 38s - loss: 1.4911 - regression_loss: 1.1796 - classification_loss: 0.3114 349/500 [===================>..........] - ETA: 37s - loss: 1.4910 - regression_loss: 1.1798 - classification_loss: 0.3111 350/500 [====================>.........] 
- ETA: 37s - loss: 1.4915 - regression_loss: 1.1805 - classification_loss: 0.3111 351/500 [====================>.........] - ETA: 37s - loss: 1.4916 - regression_loss: 1.1809 - classification_loss: 0.3107 352/500 [====================>.........] - ETA: 37s - loss: 1.4890 - regression_loss: 1.1789 - classification_loss: 0.3100 353/500 [====================>.........] - ETA: 36s - loss: 1.4905 - regression_loss: 1.1804 - classification_loss: 0.3101 354/500 [====================>.........] - ETA: 36s - loss: 1.4901 - regression_loss: 1.1802 - classification_loss: 0.3099 355/500 [====================>.........] - ETA: 36s - loss: 1.4896 - regression_loss: 1.1800 - classification_loss: 0.3096 356/500 [====================>.........] - ETA: 36s - loss: 1.4901 - regression_loss: 1.1806 - classification_loss: 0.3095 357/500 [====================>.........] - ETA: 35s - loss: 1.4907 - regression_loss: 1.1813 - classification_loss: 0.3094 358/500 [====================>.........] - ETA: 35s - loss: 1.4915 - regression_loss: 1.1819 - classification_loss: 0.3096 359/500 [====================>.........] - ETA: 35s - loss: 1.4918 - regression_loss: 1.1824 - classification_loss: 0.3094 360/500 [====================>.........] - ETA: 35s - loss: 1.4908 - regression_loss: 1.1818 - classification_loss: 0.3089 361/500 [====================>.........] - ETA: 34s - loss: 1.4904 - regression_loss: 1.1815 - classification_loss: 0.3089 362/500 [====================>.........] - ETA: 34s - loss: 1.4906 - regression_loss: 1.1820 - classification_loss: 0.3086 363/500 [====================>.........] - ETA: 34s - loss: 1.4901 - regression_loss: 1.1819 - classification_loss: 0.3082 364/500 [====================>.........] - ETA: 34s - loss: 1.4901 - regression_loss: 1.1820 - classification_loss: 0.3081 365/500 [====================>.........] - ETA: 33s - loss: 1.4904 - regression_loss: 1.1824 - classification_loss: 0.3081 366/500 [====================>.........] 
- ETA: 33s - loss: 1.4917 - regression_loss: 1.1837 - classification_loss: 0.3081 367/500 [=====================>........] - ETA: 33s - loss: 1.4928 - regression_loss: 1.1847 - classification_loss: 0.3081 368/500 [=====================>........] - ETA: 33s - loss: 1.4929 - regression_loss: 1.1851 - classification_loss: 0.3079 369/500 [=====================>........] - ETA: 32s - loss: 1.4911 - regression_loss: 1.1839 - classification_loss: 0.3072 370/500 [=====================>........] - ETA: 32s - loss: 1.4916 - regression_loss: 1.1846 - classification_loss: 0.3070 371/500 [=====================>........] - ETA: 32s - loss: 1.4931 - regression_loss: 1.1859 - classification_loss: 0.3072 372/500 [=====================>........] - ETA: 32s - loss: 1.4946 - regression_loss: 1.1874 - classification_loss: 0.3072 373/500 [=====================>........] - ETA: 31s - loss: 1.4927 - regression_loss: 1.1860 - classification_loss: 0.3067 374/500 [=====================>........] - ETA: 31s - loss: 1.4926 - regression_loss: 1.1860 - classification_loss: 0.3066 375/500 [=====================>........] - ETA: 31s - loss: 1.4934 - regression_loss: 1.1868 - classification_loss: 0.3066 376/500 [=====================>........] - ETA: 31s - loss: 1.4940 - regression_loss: 1.1874 - classification_loss: 0.3066 377/500 [=====================>........] - ETA: 30s - loss: 1.4938 - regression_loss: 1.1875 - classification_loss: 0.3063 378/500 [=====================>........] - ETA: 30s - loss: 1.4937 - regression_loss: 1.1877 - classification_loss: 0.3060 379/500 [=====================>........] - ETA: 30s - loss: 1.4943 - regression_loss: 1.1884 - classification_loss: 0.3058 380/500 [=====================>........] - ETA: 30s - loss: 1.4941 - regression_loss: 1.1885 - classification_loss: 0.3056 381/500 [=====================>........] - ETA: 29s - loss: 1.4927 - regression_loss: 1.1876 - classification_loss: 0.3051 382/500 [=====================>........] 
- ETA: 29s - loss: 1.4915 - regression_loss: 1.1868 - classification_loss: 0.3046 383/500 [=====================>........] - ETA: 29s - loss: 1.4905 - regression_loss: 1.1863 - classification_loss: 0.3042 384/500 [======================>.......] - ETA: 29s - loss: 1.4916 - regression_loss: 1.1874 - classification_loss: 0.3042 385/500 [======================>.......] - ETA: 28s - loss: 1.4904 - regression_loss: 1.1866 - classification_loss: 0.3038 386/500 [======================>.......] - ETA: 28s - loss: 1.4890 - regression_loss: 1.1855 - classification_loss: 0.3035 387/500 [======================>.......] - ETA: 28s - loss: 1.4867 - regression_loss: 1.1838 - classification_loss: 0.3029 388/500 [======================>.......] - ETA: 28s - loss: 1.4853 - regression_loss: 1.1829 - classification_loss: 0.3024 389/500 [======================>.......] - ETA: 27s - loss: 1.4860 - regression_loss: 1.1837 - classification_loss: 0.3023 390/500 [======================>.......] - ETA: 27s - loss: 1.4836 - regression_loss: 1.1819 - classification_loss: 0.3016 391/500 [======================>.......] - ETA: 27s - loss: 1.4835 - regression_loss: 1.1823 - classification_loss: 0.3012 392/500 [======================>.......] - ETA: 27s - loss: 1.4835 - regression_loss: 1.1825 - classification_loss: 0.3010 393/500 [======================>.......] - ETA: 26s - loss: 1.4811 - regression_loss: 1.1807 - classification_loss: 0.3003 394/500 [======================>.......] - ETA: 26s - loss: 1.4819 - regression_loss: 1.1815 - classification_loss: 0.3004 395/500 [======================>.......] - ETA: 26s - loss: 1.4800 - regression_loss: 1.1802 - classification_loss: 0.2998 396/500 [======================>.......] - ETA: 26s - loss: 1.4781 - regression_loss: 1.1786 - classification_loss: 0.2995 397/500 [======================>.......] - ETA: 25s - loss: 1.4788 - regression_loss: 1.1793 - classification_loss: 0.2994 398/500 [======================>.......] 
- ETA: 25s - loss: 1.4789 - regression_loss: 1.1798 - classification_loss: 0.2991 399/500 [======================>.......] - ETA: 25s - loss: 1.4792 - regression_loss: 1.1802 - classification_loss: 0.2990 400/500 [=======================>......] - ETA: 25s - loss: 1.4794 - regression_loss: 1.1806 - classification_loss: 0.2989 401/500 [=======================>......] - ETA: 24s - loss: 1.4797 - regression_loss: 1.1810 - classification_loss: 0.2988 402/500 [=======================>......] - ETA: 24s - loss: 1.4798 - regression_loss: 1.1809 - classification_loss: 0.2988 403/500 [=======================>......] - ETA: 24s - loss: 1.4775 - regression_loss: 1.1793 - classification_loss: 0.2982 404/500 [=======================>......] - ETA: 24s - loss: 1.4775 - regression_loss: 1.1795 - classification_loss: 0.2980 405/500 [=======================>......] - ETA: 23s - loss: 1.4781 - regression_loss: 1.1800 - classification_loss: 0.2981 406/500 [=======================>......] - ETA: 23s - loss: 1.4782 - regression_loss: 1.1803 - classification_loss: 0.2979 407/500 [=======================>......] - ETA: 23s - loss: 1.4784 - regression_loss: 1.1805 - classification_loss: 0.2979 408/500 [=======================>......] - ETA: 23s - loss: 1.4783 - regression_loss: 1.1806 - classification_loss: 0.2977 409/500 [=======================>......] - ETA: 22s - loss: 1.4791 - regression_loss: 1.1815 - classification_loss: 0.2976 410/500 [=======================>......] - ETA: 22s - loss: 1.4777 - regression_loss: 1.1805 - classification_loss: 0.2971 411/500 [=======================>......] - ETA: 22s - loss: 1.4775 - regression_loss: 1.1807 - classification_loss: 0.2968 412/500 [=======================>......] - ETA: 22s - loss: 1.4775 - regression_loss: 1.1808 - classification_loss: 0.2966 413/500 [=======================>......] - ETA: 21s - loss: 1.4768 - regression_loss: 1.1803 - classification_loss: 0.2965 414/500 [=======================>......] 
- ETA: 21s - loss: 1.4748 - regression_loss: 1.1788 - classification_loss: 0.2959
... (per-batch progress for epoch 115, steps 415-499, omitted; running loss decreased from ~1.475 to ~1.456) ...
500/500 [==============================] - 125s 250ms/step - loss: 1.4554 - regression_loss: 1.1734 - classification_loss: 0.2820
1172 instances of class plum with average precision: 0.6712
mAP: 0.6712
Epoch 00115: saving model to ./training/snapshots/resnet50_pascal_115.h5
Epoch 00115: ReduceLROnPlateau reducing learning rate to 9.999999747378752e-07.
Epoch 116/150
... (per-batch progress for epoch 116, steps 1-247, omitted; running loss fluctuating between ~1.34 and ~1.62 before settling near 1.34) ...
248/500 [=============>................]
- ETA: 1:03 - loss: 1.3383 - regression_loss: 1.1321 - classification_loss: 0.2061 249/500 [=============>................] - ETA: 1:03 - loss: 1.3355 - regression_loss: 1.1297 - classification_loss: 0.2058 250/500 [==============>...............] - ETA: 1:02 - loss: 1.3335 - regression_loss: 1.1278 - classification_loss: 0.2057 251/500 [==============>...............] - ETA: 1:02 - loss: 1.3333 - regression_loss: 1.1275 - classification_loss: 0.2058 252/500 [==============>...............] - ETA: 1:02 - loss: 1.3347 - regression_loss: 1.1286 - classification_loss: 0.2061 253/500 [==============>...............] - ETA: 1:02 - loss: 1.3317 - regression_loss: 1.1262 - classification_loss: 0.2055 254/500 [==============>...............] - ETA: 1:01 - loss: 1.3338 - regression_loss: 1.1280 - classification_loss: 0.2058 255/500 [==============>...............] - ETA: 1:01 - loss: 1.3345 - regression_loss: 1.1287 - classification_loss: 0.2059 256/500 [==============>...............] - ETA: 1:01 - loss: 1.3311 - regression_loss: 1.1259 - classification_loss: 0.2053 257/500 [==============>...............] - ETA: 1:01 - loss: 1.3285 - regression_loss: 1.1238 - classification_loss: 0.2047 258/500 [==============>...............] - ETA: 1:00 - loss: 1.3298 - regression_loss: 1.1250 - classification_loss: 0.2049 259/500 [==============>...............] - ETA: 1:00 - loss: 1.3308 - regression_loss: 1.1257 - classification_loss: 0.2050 260/500 [==============>...............] - ETA: 1:00 - loss: 1.3311 - regression_loss: 1.1261 - classification_loss: 0.2050 261/500 [==============>...............] - ETA: 1:00 - loss: 1.3339 - regression_loss: 1.1279 - classification_loss: 0.2060 262/500 [==============>...............] - ETA: 59s - loss: 1.3327 - regression_loss: 1.1270 - classification_loss: 0.2057  263/500 [==============>...............] - ETA: 59s - loss: 1.3333 - regression_loss: 1.1276 - classification_loss: 0.2057 264/500 [==============>...............] 
- ETA: 59s - loss: 1.3339 - regression_loss: 1.1278 - classification_loss: 0.2061 265/500 [==============>...............] - ETA: 59s - loss: 1.3345 - regression_loss: 1.1283 - classification_loss: 0.2062 266/500 [==============>...............] - ETA: 58s - loss: 1.3363 - regression_loss: 1.1298 - classification_loss: 0.2065 267/500 [===============>..............] - ETA: 58s - loss: 1.3357 - regression_loss: 1.1294 - classification_loss: 0.2063 268/500 [===============>..............] - ETA: 58s - loss: 1.3357 - regression_loss: 1.1294 - classification_loss: 0.2063 269/500 [===============>..............] - ETA: 58s - loss: 1.3386 - regression_loss: 1.1320 - classification_loss: 0.2066 270/500 [===============>..............] - ETA: 57s - loss: 1.3383 - regression_loss: 1.1319 - classification_loss: 0.2064 271/500 [===============>..............] - ETA: 57s - loss: 1.3388 - regression_loss: 1.1320 - classification_loss: 0.2068 272/500 [===============>..............] - ETA: 57s - loss: 1.3390 - regression_loss: 1.1320 - classification_loss: 0.2070 273/500 [===============>..............] - ETA: 57s - loss: 1.3387 - regression_loss: 1.1317 - classification_loss: 0.2069 274/500 [===============>..............] - ETA: 56s - loss: 1.3383 - regression_loss: 1.1315 - classification_loss: 0.2068 275/500 [===============>..............] - ETA: 56s - loss: 1.3374 - regression_loss: 1.1309 - classification_loss: 0.2065 276/500 [===============>..............] - ETA: 56s - loss: 1.3382 - regression_loss: 1.1316 - classification_loss: 0.2067 277/500 [===============>..............] - ETA: 56s - loss: 1.3384 - regression_loss: 1.1318 - classification_loss: 0.2067 278/500 [===============>..............] - ETA: 55s - loss: 1.3379 - regression_loss: 1.1314 - classification_loss: 0.2066 279/500 [===============>..............] - ETA: 55s - loss: 1.3412 - regression_loss: 1.1335 - classification_loss: 0.2077 280/500 [===============>..............] 
- ETA: 55s - loss: 1.3379 - regression_loss: 1.1308 - classification_loss: 0.2071 281/500 [===============>..............] - ETA: 55s - loss: 1.3392 - regression_loss: 1.1318 - classification_loss: 0.2075 282/500 [===============>..............] - ETA: 54s - loss: 1.3390 - regression_loss: 1.1316 - classification_loss: 0.2073 283/500 [===============>..............] - ETA: 54s - loss: 1.3405 - regression_loss: 1.1329 - classification_loss: 0.2075 284/500 [================>.............] - ETA: 54s - loss: 1.3405 - regression_loss: 1.1329 - classification_loss: 0.2076 285/500 [================>.............] - ETA: 54s - loss: 1.3416 - regression_loss: 1.1338 - classification_loss: 0.2079 286/500 [================>.............] - ETA: 53s - loss: 1.3436 - regression_loss: 1.1354 - classification_loss: 0.2082 287/500 [================>.............] - ETA: 53s - loss: 1.3425 - regression_loss: 1.1346 - classification_loss: 0.2079 288/500 [================>.............] - ETA: 53s - loss: 1.3412 - regression_loss: 1.1336 - classification_loss: 0.2076 289/500 [================>.............] - ETA: 53s - loss: 1.3420 - regression_loss: 1.1343 - classification_loss: 0.2077 290/500 [================>.............] - ETA: 52s - loss: 1.3419 - regression_loss: 1.1344 - classification_loss: 0.2076 291/500 [================>.............] - ETA: 52s - loss: 1.3407 - regression_loss: 1.1335 - classification_loss: 0.2072 292/500 [================>.............] - ETA: 52s - loss: 1.3389 - regression_loss: 1.1322 - classification_loss: 0.2067 293/500 [================>.............] - ETA: 52s - loss: 1.3395 - regression_loss: 1.1327 - classification_loss: 0.2068 294/500 [================>.............] - ETA: 51s - loss: 1.3393 - regression_loss: 1.1326 - classification_loss: 0.2067 295/500 [================>.............] - ETA: 51s - loss: 1.3385 - regression_loss: 1.1318 - classification_loss: 0.2067 296/500 [================>.............] 
- ETA: 51s - loss: 1.3372 - regression_loss: 1.1308 - classification_loss: 0.2064 297/500 [================>.............] - ETA: 51s - loss: 1.3347 - regression_loss: 1.1288 - classification_loss: 0.2059 298/500 [================>.............] - ETA: 50s - loss: 1.3353 - regression_loss: 1.1293 - classification_loss: 0.2059 299/500 [================>.............] - ETA: 50s - loss: 1.3352 - regression_loss: 1.1294 - classification_loss: 0.2059 300/500 [=================>............] - ETA: 50s - loss: 1.3345 - regression_loss: 1.1288 - classification_loss: 0.2057 301/500 [=================>............] - ETA: 50s - loss: 1.3338 - regression_loss: 1.1282 - classification_loss: 0.2056 302/500 [=================>............] - ETA: 49s - loss: 1.3311 - regression_loss: 1.1261 - classification_loss: 0.2050 303/500 [=================>............] - ETA: 49s - loss: 1.3315 - regression_loss: 1.1262 - classification_loss: 0.2052 304/500 [=================>............] - ETA: 49s - loss: 1.3321 - regression_loss: 1.1268 - classification_loss: 0.2053 305/500 [=================>............] - ETA: 49s - loss: 1.3325 - regression_loss: 1.1271 - classification_loss: 0.2054 306/500 [=================>............] - ETA: 48s - loss: 1.3320 - regression_loss: 1.1267 - classification_loss: 0.2053 307/500 [=================>............] - ETA: 48s - loss: 1.3344 - regression_loss: 1.1286 - classification_loss: 0.2058 308/500 [=================>............] - ETA: 48s - loss: 1.3348 - regression_loss: 1.1288 - classification_loss: 0.2060 309/500 [=================>............] - ETA: 48s - loss: 1.3358 - regression_loss: 1.1299 - classification_loss: 0.2060 310/500 [=================>............] - ETA: 47s - loss: 1.3358 - regression_loss: 1.1301 - classification_loss: 0.2057 311/500 [=================>............] - ETA: 47s - loss: 1.3350 - regression_loss: 1.1295 - classification_loss: 0.2055 312/500 [=================>............] 
- ETA: 47s - loss: 1.3346 - regression_loss: 1.1292 - classification_loss: 0.2054 313/500 [=================>............] - ETA: 47s - loss: 1.3334 - regression_loss: 1.1283 - classification_loss: 0.2052 314/500 [=================>............] - ETA: 46s - loss: 1.3313 - regression_loss: 1.1267 - classification_loss: 0.2047 315/500 [=================>............] - ETA: 46s - loss: 1.3346 - regression_loss: 1.1270 - classification_loss: 0.2076 316/500 [=================>............] - ETA: 46s - loss: 1.3352 - regression_loss: 1.1274 - classification_loss: 0.2078 317/500 [==================>...........] - ETA: 46s - loss: 1.3347 - regression_loss: 1.1270 - classification_loss: 0.2076 318/500 [==================>...........] - ETA: 45s - loss: 1.3330 - regression_loss: 1.1258 - classification_loss: 0.2072 319/500 [==================>...........] - ETA: 45s - loss: 1.3328 - regression_loss: 1.1257 - classification_loss: 0.2071 320/500 [==================>...........] - ETA: 45s - loss: 1.3332 - regression_loss: 1.1263 - classification_loss: 0.2069 321/500 [==================>...........] - ETA: 45s - loss: 1.3316 - regression_loss: 1.1250 - classification_loss: 0.2066 322/500 [==================>...........] - ETA: 44s - loss: 1.3307 - regression_loss: 1.1242 - classification_loss: 0.2064 323/500 [==================>...........] - ETA: 44s - loss: 1.3323 - regression_loss: 1.1258 - classification_loss: 0.2065 324/500 [==================>...........] - ETA: 44s - loss: 1.3308 - regression_loss: 1.1248 - classification_loss: 0.2061 325/500 [==================>...........] - ETA: 44s - loss: 1.3291 - regression_loss: 1.1233 - classification_loss: 0.2058 326/500 [==================>...........] - ETA: 43s - loss: 1.3295 - regression_loss: 1.1236 - classification_loss: 0.2059 327/500 [==================>...........] - ETA: 43s - loss: 1.3295 - regression_loss: 1.1237 - classification_loss: 0.2058 328/500 [==================>...........] 
- ETA: 43s - loss: 1.3295 - regression_loss: 1.1238 - classification_loss: 0.2057 329/500 [==================>...........] - ETA: 43s - loss: 1.3303 - regression_loss: 1.1246 - classification_loss: 0.2057 330/500 [==================>...........] - ETA: 42s - loss: 1.3319 - regression_loss: 1.1258 - classification_loss: 0.2060 331/500 [==================>...........] - ETA: 42s - loss: 1.3310 - regression_loss: 1.1251 - classification_loss: 0.2059 332/500 [==================>...........] - ETA: 42s - loss: 1.3299 - regression_loss: 1.1243 - classification_loss: 0.2056 333/500 [==================>...........] - ETA: 42s - loss: 1.3323 - regression_loss: 1.1261 - classification_loss: 0.2062 334/500 [===================>..........] - ETA: 41s - loss: 1.3311 - regression_loss: 1.1253 - classification_loss: 0.2059 335/500 [===================>..........] - ETA: 41s - loss: 1.3326 - regression_loss: 1.1264 - classification_loss: 0.2062 336/500 [===================>..........] - ETA: 41s - loss: 1.3312 - regression_loss: 1.1253 - classification_loss: 0.2060 337/500 [===================>..........] - ETA: 41s - loss: 1.3330 - regression_loss: 1.1267 - classification_loss: 0.2063 338/500 [===================>..........] - ETA: 40s - loss: 1.3329 - regression_loss: 1.1267 - classification_loss: 0.2061 339/500 [===================>..........] - ETA: 40s - loss: 1.3314 - regression_loss: 1.1256 - classification_loss: 0.2058 340/500 [===================>..........] - ETA: 40s - loss: 1.3312 - regression_loss: 1.1255 - classification_loss: 0.2057 341/500 [===================>..........] - ETA: 40s - loss: 1.3305 - regression_loss: 1.1250 - classification_loss: 0.2054 342/500 [===================>..........] - ETA: 39s - loss: 1.3316 - regression_loss: 1.1260 - classification_loss: 0.2057 343/500 [===================>..........] - ETA: 39s - loss: 1.3319 - regression_loss: 1.1261 - classification_loss: 0.2057 344/500 [===================>..........] 
- ETA: 39s - loss: 1.3333 - regression_loss: 1.1274 - classification_loss: 0.2059 345/500 [===================>..........] - ETA: 39s - loss: 1.3329 - regression_loss: 1.1271 - classification_loss: 0.2058 346/500 [===================>..........] - ETA: 38s - loss: 1.3319 - regression_loss: 1.1264 - classification_loss: 0.2055 347/500 [===================>..........] - ETA: 38s - loss: 1.3299 - regression_loss: 1.1247 - classification_loss: 0.2052 348/500 [===================>..........] - ETA: 38s - loss: 1.3320 - regression_loss: 1.1264 - classification_loss: 0.2056 349/500 [===================>..........] - ETA: 38s - loss: 1.3326 - regression_loss: 1.1270 - classification_loss: 0.2056 350/500 [====================>.........] - ETA: 37s - loss: 1.3333 - regression_loss: 1.1276 - classification_loss: 0.2057 351/500 [====================>.........] - ETA: 37s - loss: 1.3336 - regression_loss: 1.1278 - classification_loss: 0.2058 352/500 [====================>.........] - ETA: 37s - loss: 1.3339 - regression_loss: 1.1279 - classification_loss: 0.2060 353/500 [====================>.........] - ETA: 36s - loss: 1.3345 - regression_loss: 1.1284 - classification_loss: 0.2061 354/500 [====================>.........] - ETA: 36s - loss: 1.3340 - regression_loss: 1.1280 - classification_loss: 0.2060 355/500 [====================>.........] - ETA: 36s - loss: 1.3334 - regression_loss: 1.1274 - classification_loss: 0.2060 356/500 [====================>.........] - ETA: 36s - loss: 1.3347 - regression_loss: 1.1284 - classification_loss: 0.2064 357/500 [====================>.........] - ETA: 35s - loss: 1.3349 - regression_loss: 1.1285 - classification_loss: 0.2065 358/500 [====================>.........] - ETA: 35s - loss: 1.3351 - regression_loss: 1.1286 - classification_loss: 0.2065 359/500 [====================>.........] - ETA: 35s - loss: 1.3362 - regression_loss: 1.1295 - classification_loss: 0.2067 360/500 [====================>.........] 
- ETA: 35s - loss: 1.3366 - regression_loss: 1.1299 - classification_loss: 0.2067 361/500 [====================>.........] - ETA: 34s - loss: 1.3357 - regression_loss: 1.1293 - classification_loss: 0.2064 362/500 [====================>.........] - ETA: 34s - loss: 1.3347 - regression_loss: 1.1287 - classification_loss: 0.2060 363/500 [====================>.........] - ETA: 34s - loss: 1.3348 - regression_loss: 1.1287 - classification_loss: 0.2061 364/500 [====================>.........] - ETA: 34s - loss: 1.3358 - regression_loss: 1.1295 - classification_loss: 0.2063 365/500 [====================>.........] - ETA: 33s - loss: 1.3358 - regression_loss: 1.1293 - classification_loss: 0.2065 366/500 [====================>.........] - ETA: 33s - loss: 1.3380 - regression_loss: 1.1310 - classification_loss: 0.2070 367/500 [=====================>........] - ETA: 33s - loss: 1.3366 - regression_loss: 1.1298 - classification_loss: 0.2068 368/500 [=====================>........] - ETA: 33s - loss: 1.3371 - regression_loss: 1.1300 - classification_loss: 0.2070 369/500 [=====================>........] - ETA: 32s - loss: 1.3361 - regression_loss: 1.1291 - classification_loss: 0.2070 370/500 [=====================>........] - ETA: 32s - loss: 1.3349 - regression_loss: 1.1280 - classification_loss: 0.2069 371/500 [=====================>........] - ETA: 32s - loss: 1.3355 - regression_loss: 1.1285 - classification_loss: 0.2070 372/500 [=====================>........] - ETA: 32s - loss: 1.3339 - regression_loss: 1.1272 - classification_loss: 0.2066 373/500 [=====================>........] - ETA: 31s - loss: 1.3337 - regression_loss: 1.1271 - classification_loss: 0.2066 374/500 [=====================>........] - ETA: 31s - loss: 1.3331 - regression_loss: 1.1266 - classification_loss: 0.2065 375/500 [=====================>........] - ETA: 31s - loss: 1.3327 - regression_loss: 1.1263 - classification_loss: 0.2064 376/500 [=====================>........] 
- ETA: 31s - loss: 1.3330 - regression_loss: 1.1266 - classification_loss: 0.2064 377/500 [=====================>........] - ETA: 30s - loss: 1.3323 - regression_loss: 1.1259 - classification_loss: 0.2064 378/500 [=====================>........] - ETA: 30s - loss: 1.3335 - regression_loss: 1.1272 - classification_loss: 0.2064 379/500 [=====================>........] - ETA: 30s - loss: 1.3339 - regression_loss: 1.1275 - classification_loss: 0.2064 380/500 [=====================>........] - ETA: 30s - loss: 1.3322 - regression_loss: 1.1262 - classification_loss: 0.2060 381/500 [=====================>........] - ETA: 29s - loss: 1.3327 - regression_loss: 1.1264 - classification_loss: 0.2063 382/500 [=====================>........] - ETA: 29s - loss: 1.3334 - regression_loss: 1.1269 - classification_loss: 0.2064 383/500 [=====================>........] - ETA: 29s - loss: 1.3339 - regression_loss: 1.1275 - classification_loss: 0.2064 384/500 [======================>.......] - ETA: 29s - loss: 1.3343 - regression_loss: 1.1278 - classification_loss: 0.2065 385/500 [======================>.......] - ETA: 28s - loss: 1.3358 - regression_loss: 1.1291 - classification_loss: 0.2067 386/500 [======================>.......] - ETA: 28s - loss: 1.3338 - regression_loss: 1.1275 - classification_loss: 0.2064 387/500 [======================>.......] - ETA: 28s - loss: 1.3326 - regression_loss: 1.1265 - classification_loss: 0.2061 388/500 [======================>.......] - ETA: 28s - loss: 1.3302 - regression_loss: 1.1245 - classification_loss: 0.2057 389/500 [======================>.......] - ETA: 27s - loss: 1.3299 - regression_loss: 1.1243 - classification_loss: 0.2055 390/500 [======================>.......] - ETA: 27s - loss: 1.3302 - regression_loss: 1.1247 - classification_loss: 0.2055 391/500 [======================>.......] - ETA: 27s - loss: 1.3309 - regression_loss: 1.1254 - classification_loss: 0.2055 392/500 [======================>.......] 
- ETA: 27s - loss: 1.3293 - regression_loss: 1.1241 - classification_loss: 0.2052 393/500 [======================>.......] - ETA: 26s - loss: 1.3296 - regression_loss: 1.1243 - classification_loss: 0.2053 394/500 [======================>.......] - ETA: 26s - loss: 1.3304 - regression_loss: 1.1250 - classification_loss: 0.2054 395/500 [======================>.......] - ETA: 26s - loss: 1.3301 - regression_loss: 1.1248 - classification_loss: 0.2053 396/500 [======================>.......] - ETA: 26s - loss: 1.3305 - regression_loss: 1.1251 - classification_loss: 0.2054 397/500 [======================>.......] - ETA: 25s - loss: 1.3288 - regression_loss: 1.1238 - classification_loss: 0.2050 398/500 [======================>.......] - ETA: 25s - loss: 1.3295 - regression_loss: 1.1243 - classification_loss: 0.2052 399/500 [======================>.......] - ETA: 25s - loss: 1.3285 - regression_loss: 1.1234 - classification_loss: 0.2050 400/500 [=======================>......] - ETA: 25s - loss: 1.3281 - regression_loss: 1.1231 - classification_loss: 0.2049 401/500 [=======================>......] - ETA: 24s - loss: 1.3255 - regression_loss: 1.1210 - classification_loss: 0.2045 402/500 [=======================>......] - ETA: 24s - loss: 1.3268 - regression_loss: 1.1217 - classification_loss: 0.2050 403/500 [=======================>......] - ETA: 24s - loss: 1.3249 - regression_loss: 1.1202 - classification_loss: 0.2047 404/500 [=======================>......] - ETA: 24s - loss: 1.3243 - regression_loss: 1.1198 - classification_loss: 0.2045 405/500 [=======================>......] - ETA: 23s - loss: 1.3248 - regression_loss: 1.1202 - classification_loss: 0.2046 406/500 [=======================>......] - ETA: 23s - loss: 1.3259 - regression_loss: 1.1209 - classification_loss: 0.2050 407/500 [=======================>......] - ETA: 23s - loss: 1.3266 - regression_loss: 1.1216 - classification_loss: 0.2051 408/500 [=======================>......] 
- ETA: 23s - loss: 1.3252 - regression_loss: 1.1205 - classification_loss: 0.2047 409/500 [=======================>......] - ETA: 22s - loss: 1.3239 - regression_loss: 1.1194 - classification_loss: 0.2045 410/500 [=======================>......] - ETA: 22s - loss: 1.3261 - regression_loss: 1.1208 - classification_loss: 0.2053 411/500 [=======================>......] - ETA: 22s - loss: 1.3261 - regression_loss: 1.1209 - classification_loss: 0.2052 412/500 [=======================>......] - ETA: 22s - loss: 1.3250 - regression_loss: 1.1200 - classification_loss: 0.2050 413/500 [=======================>......] - ETA: 21s - loss: 1.3259 - regression_loss: 1.1208 - classification_loss: 0.2051 414/500 [=======================>......] - ETA: 21s - loss: 1.3258 - regression_loss: 1.1207 - classification_loss: 0.2050 415/500 [=======================>......] - ETA: 21s - loss: 1.3262 - regression_loss: 1.1212 - classification_loss: 0.2051 416/500 [=======================>......] - ETA: 21s - loss: 1.3265 - regression_loss: 1.1214 - classification_loss: 0.2051 417/500 [========================>.....] - ETA: 20s - loss: 1.3276 - regression_loss: 1.1224 - classification_loss: 0.2052 418/500 [========================>.....] - ETA: 20s - loss: 1.3286 - regression_loss: 1.1234 - classification_loss: 0.2052 419/500 [========================>.....] - ETA: 20s - loss: 1.3285 - regression_loss: 1.1234 - classification_loss: 0.2051 420/500 [========================>.....] - ETA: 20s - loss: 1.3291 - regression_loss: 1.1239 - classification_loss: 0.2052 421/500 [========================>.....] - ETA: 19s - loss: 1.3298 - regression_loss: 1.1244 - classification_loss: 0.2054 422/500 [========================>.....] - ETA: 19s - loss: 1.3295 - regression_loss: 1.1243 - classification_loss: 0.2053 423/500 [========================>.....] - ETA: 19s - loss: 1.3287 - regression_loss: 1.1236 - classification_loss: 0.2051 424/500 [========================>.....] 
- ETA: 19s - loss: 1.3293 - regression_loss: 1.1241 - classification_loss: 0.2052 425/500 [========================>.....] - ETA: 18s - loss: 1.3269 - regression_loss: 1.1221 - classification_loss: 0.2048 426/500 [========================>.....] - ETA: 18s - loss: 1.3271 - regression_loss: 1.1223 - classification_loss: 0.2048 427/500 [========================>.....] - ETA: 18s - loss: 1.3268 - regression_loss: 1.1220 - classification_loss: 0.2048 428/500 [========================>.....] - ETA: 18s - loss: 1.3270 - regression_loss: 1.1223 - classification_loss: 0.2047 429/500 [========================>.....] - ETA: 17s - loss: 1.3259 - regression_loss: 1.1214 - classification_loss: 0.2044 430/500 [========================>.....] - ETA: 17s - loss: 1.3260 - regression_loss: 1.1216 - classification_loss: 0.2044 431/500 [========================>.....] - ETA: 17s - loss: 1.3264 - regression_loss: 1.1221 - classification_loss: 0.2043 432/500 [========================>.....] - ETA: 17s - loss: 1.3270 - regression_loss: 1.1227 - classification_loss: 0.2043 433/500 [========================>.....] - ETA: 16s - loss: 1.3261 - regression_loss: 1.1217 - classification_loss: 0.2044 434/500 [=========================>....] - ETA: 16s - loss: 1.3262 - regression_loss: 1.1218 - classification_loss: 0.2044 435/500 [=========================>....] - ETA: 16s - loss: 1.3258 - regression_loss: 1.1215 - classification_loss: 0.2043 436/500 [=========================>....] - ETA: 16s - loss: 1.3261 - regression_loss: 1.1217 - classification_loss: 0.2044 437/500 [=========================>....] - ETA: 15s - loss: 1.3263 - regression_loss: 1.1217 - classification_loss: 0.2045 438/500 [=========================>....] - ETA: 15s - loss: 1.3247 - regression_loss: 1.1206 - classification_loss: 0.2042 439/500 [=========================>....] - ETA: 15s - loss: 1.3246 - regression_loss: 1.1205 - classification_loss: 0.2041 440/500 [=========================>....] 
- ETA: 15s - loss: 1.3254 - regression_loss: 1.1209 - classification_loss: 0.2045 441/500 [=========================>....] - ETA: 14s - loss: 1.3271 - regression_loss: 1.1222 - classification_loss: 0.2049 442/500 [=========================>....] - ETA: 14s - loss: 1.3260 - regression_loss: 1.1214 - classification_loss: 0.2046 443/500 [=========================>....] - ETA: 14s - loss: 1.3251 - regression_loss: 1.1208 - classification_loss: 0.2042 444/500 [=========================>....] - ETA: 14s - loss: 1.3255 - regression_loss: 1.1214 - classification_loss: 0.2041 445/500 [=========================>....] - ETA: 13s - loss: 1.3252 - regression_loss: 1.1212 - classification_loss: 0.2040 446/500 [=========================>....] - ETA: 13s - loss: 1.3250 - regression_loss: 1.1212 - classification_loss: 0.2039 447/500 [=========================>....] - ETA: 13s - loss: 1.3251 - regression_loss: 1.1213 - classification_loss: 0.2038 448/500 [=========================>....] - ETA: 13s - loss: 1.3262 - regression_loss: 1.1222 - classification_loss: 0.2040 449/500 [=========================>....] - ETA: 12s - loss: 1.3244 - regression_loss: 1.1208 - classification_loss: 0.2036 450/500 [==========================>...] - ETA: 12s - loss: 1.3233 - regression_loss: 1.1198 - classification_loss: 0.2034 451/500 [==========================>...] - ETA: 12s - loss: 1.3214 - regression_loss: 1.1183 - classification_loss: 0.2030 452/500 [==========================>...] - ETA: 12s - loss: 1.3214 - regression_loss: 1.1183 - classification_loss: 0.2031 453/500 [==========================>...] - ETA: 11s - loss: 1.3212 - regression_loss: 1.1181 - classification_loss: 0.2031 454/500 [==========================>...] - ETA: 11s - loss: 1.3219 - regression_loss: 1.1186 - classification_loss: 0.2033 455/500 [==========================>...] - ETA: 11s - loss: 1.3224 - regression_loss: 1.1190 - classification_loss: 0.2034 456/500 [==========================>...] 
... (per-batch progress for epoch 116 trimmed; loss converged around 1.31)
500/500 [==============================] - 126s 251ms/step - loss: 1.3136 - regression_loss: 1.1116 - classification_loss: 0.2020
1172 instances of class plum with average precision: 0.6922
mAP: 0.6922
Epoch 00116: saving model to ./training/snapshots/resnet50_pascal_116.h5
Epoch 117/150
... (per-batch progress for epoch 117 trimmed; loss fluctuating around 1.29 through batch 291/500)
- ETA: 52s - loss: 1.2893 - regression_loss: 1.0918 - classification_loss: 0.1975 292/500 [================>.............] - ETA: 52s - loss: 1.2907 - regression_loss: 1.0929 - classification_loss: 0.1978 293/500 [================>.............] - ETA: 51s - loss: 1.2910 - regression_loss: 1.0932 - classification_loss: 0.1978 294/500 [================>.............] - ETA: 51s - loss: 1.2921 - regression_loss: 1.0941 - classification_loss: 0.1980 295/500 [================>.............] - ETA: 51s - loss: 1.2900 - regression_loss: 1.0925 - classification_loss: 0.1976 296/500 [================>.............] - ETA: 51s - loss: 1.2913 - regression_loss: 1.0935 - classification_loss: 0.1978 297/500 [================>.............] - ETA: 50s - loss: 1.2923 - regression_loss: 1.0943 - classification_loss: 0.1981 298/500 [================>.............] - ETA: 50s - loss: 1.2911 - regression_loss: 1.0933 - classification_loss: 0.1979 299/500 [================>.............] - ETA: 50s - loss: 1.2942 - regression_loss: 1.0960 - classification_loss: 0.1983 300/500 [=================>............] - ETA: 50s - loss: 1.2941 - regression_loss: 1.0958 - classification_loss: 0.1982 301/500 [=================>............] - ETA: 49s - loss: 1.2950 - regression_loss: 1.0961 - classification_loss: 0.1988 302/500 [=================>............] - ETA: 49s - loss: 1.2952 - regression_loss: 1.0963 - classification_loss: 0.1989 303/500 [=================>............] - ETA: 49s - loss: 1.2970 - regression_loss: 1.0978 - classification_loss: 0.1992 304/500 [=================>............] - ETA: 49s - loss: 1.2948 - regression_loss: 1.0958 - classification_loss: 0.1990 305/500 [=================>............] - ETA: 48s - loss: 1.2967 - regression_loss: 1.0975 - classification_loss: 0.1993 306/500 [=================>............] - ETA: 48s - loss: 1.2950 - regression_loss: 1.0961 - classification_loss: 0.1989 307/500 [=================>............] 
- ETA: 48s - loss: 1.2976 - regression_loss: 1.0981 - classification_loss: 0.1995 308/500 [=================>............] - ETA: 48s - loss: 1.2969 - regression_loss: 1.0976 - classification_loss: 0.1994 309/500 [=================>............] - ETA: 47s - loss: 1.2981 - regression_loss: 1.0984 - classification_loss: 0.1996 310/500 [=================>............] - ETA: 47s - loss: 1.2992 - regression_loss: 1.0994 - classification_loss: 0.1998 311/500 [=================>............] - ETA: 47s - loss: 1.2976 - regression_loss: 1.0980 - classification_loss: 0.1997 312/500 [=================>............] - ETA: 47s - loss: 1.2992 - regression_loss: 1.0992 - classification_loss: 0.1999 313/500 [=================>............] - ETA: 46s - loss: 1.2971 - regression_loss: 1.0977 - classification_loss: 0.1994 314/500 [=================>............] - ETA: 46s - loss: 1.2967 - regression_loss: 1.0973 - classification_loss: 0.1994 315/500 [=================>............] - ETA: 46s - loss: 1.2977 - regression_loss: 1.0983 - classification_loss: 0.1995 316/500 [=================>............] - ETA: 46s - loss: 1.2977 - regression_loss: 1.0983 - classification_loss: 0.1994 317/500 [==================>...........] - ETA: 45s - loss: 1.2952 - regression_loss: 1.0962 - classification_loss: 0.1990 318/500 [==================>...........] - ETA: 45s - loss: 1.2948 - regression_loss: 1.0959 - classification_loss: 0.1989 319/500 [==================>...........] - ETA: 45s - loss: 1.2955 - regression_loss: 1.0964 - classification_loss: 0.1991 320/500 [==================>...........] - ETA: 45s - loss: 1.2979 - regression_loss: 1.0981 - classification_loss: 0.1998 321/500 [==================>...........] - ETA: 44s - loss: 1.2960 - regression_loss: 1.0962 - classification_loss: 0.1997 322/500 [==================>...........] - ETA: 44s - loss: 1.2951 - regression_loss: 1.0955 - classification_loss: 0.1996 323/500 [==================>...........] 
- ETA: 44s - loss: 1.2930 - regression_loss: 1.0937 - classification_loss: 0.1993 324/500 [==================>...........] - ETA: 44s - loss: 1.2917 - regression_loss: 1.0926 - classification_loss: 0.1991 325/500 [==================>...........] - ETA: 43s - loss: 1.2927 - regression_loss: 1.0935 - classification_loss: 0.1992 326/500 [==================>...........] - ETA: 43s - loss: 1.2919 - regression_loss: 1.0929 - classification_loss: 0.1990 327/500 [==================>...........] - ETA: 43s - loss: 1.2919 - regression_loss: 1.0931 - classification_loss: 0.1989 328/500 [==================>...........] - ETA: 43s - loss: 1.2919 - regression_loss: 1.0929 - classification_loss: 0.1990 329/500 [==================>...........] - ETA: 42s - loss: 1.2917 - regression_loss: 1.0928 - classification_loss: 0.1989 330/500 [==================>...........] - ETA: 42s - loss: 1.2915 - regression_loss: 1.0926 - classification_loss: 0.1989 331/500 [==================>...........] - ETA: 42s - loss: 1.2905 - regression_loss: 1.0918 - classification_loss: 0.1987 332/500 [==================>...........] - ETA: 42s - loss: 1.2916 - regression_loss: 1.0928 - classification_loss: 0.1988 333/500 [==================>...........] - ETA: 41s - loss: 1.2929 - regression_loss: 1.0938 - classification_loss: 0.1991 334/500 [===================>..........] - ETA: 41s - loss: 1.2924 - regression_loss: 1.0934 - classification_loss: 0.1990 335/500 [===================>..........] - ETA: 41s - loss: 1.2925 - regression_loss: 1.0933 - classification_loss: 0.1992 336/500 [===================>..........] - ETA: 41s - loss: 1.2931 - regression_loss: 1.0939 - classification_loss: 0.1992 337/500 [===================>..........] - ETA: 40s - loss: 1.2932 - regression_loss: 1.0940 - classification_loss: 0.1992 338/500 [===================>..........] - ETA: 40s - loss: 1.2938 - regression_loss: 1.0946 - classification_loss: 0.1992 339/500 [===================>..........] 
- ETA: 40s - loss: 1.2951 - regression_loss: 1.0957 - classification_loss: 0.1993 340/500 [===================>..........] - ETA: 40s - loss: 1.2934 - regression_loss: 1.0942 - classification_loss: 0.1992 341/500 [===================>..........] - ETA: 39s - loss: 1.2949 - regression_loss: 1.0956 - classification_loss: 0.1993 342/500 [===================>..........] - ETA: 39s - loss: 1.2956 - regression_loss: 1.0963 - classification_loss: 0.1993 343/500 [===================>..........] - ETA: 39s - loss: 1.2958 - regression_loss: 1.0966 - classification_loss: 0.1991 344/500 [===================>..........] - ETA: 39s - loss: 1.2953 - regression_loss: 1.0960 - classification_loss: 0.1993 345/500 [===================>..........] - ETA: 38s - loss: 1.2965 - regression_loss: 1.0969 - classification_loss: 0.1997 346/500 [===================>..........] - ETA: 38s - loss: 1.2948 - regression_loss: 1.0954 - classification_loss: 0.1994 347/500 [===================>..........] - ETA: 38s - loss: 1.2933 - regression_loss: 1.0941 - classification_loss: 0.1992 348/500 [===================>..........] - ETA: 38s - loss: 1.2929 - regression_loss: 1.0938 - classification_loss: 0.1991 349/500 [===================>..........] - ETA: 37s - loss: 1.2932 - regression_loss: 1.0941 - classification_loss: 0.1990 350/500 [====================>.........] - ETA: 37s - loss: 1.2939 - regression_loss: 1.0947 - classification_loss: 0.1992 351/500 [====================>.........] - ETA: 37s - loss: 1.2923 - regression_loss: 1.0933 - classification_loss: 0.1990 352/500 [====================>.........] - ETA: 37s - loss: 1.2934 - regression_loss: 1.0943 - classification_loss: 0.1991 353/500 [====================>.........] - ETA: 36s - loss: 1.2950 - regression_loss: 1.0958 - classification_loss: 0.1991 354/500 [====================>.........] - ETA: 36s - loss: 1.2955 - regression_loss: 1.0961 - classification_loss: 0.1993 355/500 [====================>.........] 
- ETA: 36s - loss: 1.2950 - regression_loss: 1.0957 - classification_loss: 0.1993 356/500 [====================>.........] - ETA: 36s - loss: 1.2950 - regression_loss: 1.0958 - classification_loss: 0.1992 357/500 [====================>.........] - ETA: 35s - loss: 1.2946 - regression_loss: 1.0955 - classification_loss: 0.1992 358/500 [====================>.........] - ETA: 35s - loss: 1.2939 - regression_loss: 1.0949 - classification_loss: 0.1990 359/500 [====================>.........] - ETA: 35s - loss: 1.2945 - regression_loss: 1.0955 - classification_loss: 0.1990 360/500 [====================>.........] - ETA: 35s - loss: 1.2951 - regression_loss: 1.0959 - classification_loss: 0.1992 361/500 [====================>.........] - ETA: 34s - loss: 1.2953 - regression_loss: 1.0961 - classification_loss: 0.1991 362/500 [====================>.........] - ETA: 34s - loss: 1.2949 - regression_loss: 1.0957 - classification_loss: 0.1991 363/500 [====================>.........] - ETA: 34s - loss: 1.2955 - regression_loss: 1.0962 - classification_loss: 0.1992 364/500 [====================>.........] - ETA: 34s - loss: 1.2960 - regression_loss: 1.0966 - classification_loss: 0.1995 365/500 [====================>.........] - ETA: 33s - loss: 1.2947 - regression_loss: 1.0953 - classification_loss: 0.1994 366/500 [====================>.........] - ETA: 33s - loss: 1.2949 - regression_loss: 1.0957 - classification_loss: 0.1991 367/500 [=====================>........] - ETA: 33s - loss: 1.2961 - regression_loss: 1.0967 - classification_loss: 0.1994 368/500 [=====================>........] - ETA: 33s - loss: 1.2979 - regression_loss: 1.0982 - classification_loss: 0.1997 369/500 [=====================>........] - ETA: 32s - loss: 1.2972 - regression_loss: 1.0977 - classification_loss: 0.1995 370/500 [=====================>........] - ETA: 32s - loss: 1.2975 - regression_loss: 1.0979 - classification_loss: 0.1995 371/500 [=====================>........] 
- ETA: 32s - loss: 1.2970 - regression_loss: 1.0975 - classification_loss: 0.1995 372/500 [=====================>........] - ETA: 32s - loss: 1.2976 - regression_loss: 1.0981 - classification_loss: 0.1996 373/500 [=====================>........] - ETA: 31s - loss: 1.2991 - regression_loss: 1.0992 - classification_loss: 0.1999 374/500 [=====================>........] - ETA: 31s - loss: 1.2986 - regression_loss: 1.0988 - classification_loss: 0.1998 375/500 [=====================>........] - ETA: 31s - loss: 1.2985 - regression_loss: 1.0989 - classification_loss: 0.1996 376/500 [=====================>........] - ETA: 31s - loss: 1.2971 - regression_loss: 1.0978 - classification_loss: 0.1993 377/500 [=====================>........] - ETA: 30s - loss: 1.2951 - regression_loss: 1.0962 - classification_loss: 0.1989 378/500 [=====================>........] - ETA: 30s - loss: 1.2971 - regression_loss: 1.0978 - classification_loss: 0.1993 379/500 [=====================>........] - ETA: 30s - loss: 1.2979 - regression_loss: 1.0987 - classification_loss: 0.1992 380/500 [=====================>........] - ETA: 30s - loss: 1.2984 - regression_loss: 1.0990 - classification_loss: 0.1993 381/500 [=====================>........] - ETA: 29s - loss: 1.3022 - regression_loss: 1.1020 - classification_loss: 0.2002 382/500 [=====================>........] - ETA: 29s - loss: 1.3024 - regression_loss: 1.1022 - classification_loss: 0.2002 383/500 [=====================>........] - ETA: 29s - loss: 1.3025 - regression_loss: 1.1023 - classification_loss: 0.2002 384/500 [======================>.......] - ETA: 29s - loss: 1.3019 - regression_loss: 1.1019 - classification_loss: 0.2000 385/500 [======================>.......] - ETA: 28s - loss: 1.3004 - regression_loss: 1.1007 - classification_loss: 0.1997 386/500 [======================>.......] - ETA: 28s - loss: 1.3010 - regression_loss: 1.1012 - classification_loss: 0.1997 387/500 [======================>.......] 
- ETA: 28s - loss: 1.3013 - regression_loss: 1.1015 - classification_loss: 0.1998 388/500 [======================>.......] - ETA: 28s - loss: 1.3012 - regression_loss: 1.1014 - classification_loss: 0.1999 389/500 [======================>.......] - ETA: 27s - loss: 1.3013 - regression_loss: 1.1013 - classification_loss: 0.2000 390/500 [======================>.......] - ETA: 27s - loss: 1.3022 - regression_loss: 1.1020 - classification_loss: 0.2001 391/500 [======================>.......] - ETA: 27s - loss: 1.3019 - regression_loss: 1.1016 - classification_loss: 0.2003 392/500 [======================>.......] - ETA: 27s - loss: 1.3023 - regression_loss: 1.1019 - classification_loss: 0.2004 393/500 [======================>.......] - ETA: 26s - loss: 1.3004 - regression_loss: 1.1004 - classification_loss: 0.2000 394/500 [======================>.......] - ETA: 26s - loss: 1.2991 - regression_loss: 1.0993 - classification_loss: 0.1998 395/500 [======================>.......] - ETA: 26s - loss: 1.2990 - regression_loss: 1.0993 - classification_loss: 0.1997 396/500 [======================>.......] - ETA: 26s - loss: 1.2985 - regression_loss: 1.0989 - classification_loss: 0.1996 397/500 [======================>.......] - ETA: 25s - loss: 1.2966 - regression_loss: 1.0972 - classification_loss: 0.1993 398/500 [======================>.......] - ETA: 25s - loss: 1.2965 - regression_loss: 1.0971 - classification_loss: 0.1994 399/500 [======================>.......] - ETA: 25s - loss: 1.2974 - regression_loss: 1.0977 - classification_loss: 0.1997 400/500 [=======================>......] - ETA: 25s - loss: 1.2949 - regression_loss: 1.0956 - classification_loss: 0.1993 401/500 [=======================>......] - ETA: 24s - loss: 1.2952 - regression_loss: 1.0958 - classification_loss: 0.1993 402/500 [=======================>......] - ETA: 24s - loss: 1.2961 - regression_loss: 1.0966 - classification_loss: 0.1995 403/500 [=======================>......] 
- ETA: 24s - loss: 1.2963 - regression_loss: 1.0970 - classification_loss: 0.1992 404/500 [=======================>......] - ETA: 24s - loss: 1.2958 - regression_loss: 1.0968 - classification_loss: 0.1990 405/500 [=======================>......] - ETA: 23s - loss: 1.2970 - regression_loss: 1.0978 - classification_loss: 0.1992 406/500 [=======================>......] - ETA: 23s - loss: 1.2967 - regression_loss: 1.0975 - classification_loss: 0.1992 407/500 [=======================>......] - ETA: 23s - loss: 1.2967 - regression_loss: 1.0974 - classification_loss: 0.1993 408/500 [=======================>......] - ETA: 23s - loss: 1.2954 - regression_loss: 1.0964 - classification_loss: 0.1990 409/500 [=======================>......] - ETA: 22s - loss: 1.2939 - regression_loss: 1.0953 - classification_loss: 0.1987 410/500 [=======================>......] - ETA: 22s - loss: 1.2927 - regression_loss: 1.0944 - classification_loss: 0.1983 411/500 [=======================>......] - ETA: 22s - loss: 1.2932 - regression_loss: 1.0949 - classification_loss: 0.1983 412/500 [=======================>......] - ETA: 22s - loss: 1.2944 - regression_loss: 1.0960 - classification_loss: 0.1984 413/500 [=======================>......] - ETA: 21s - loss: 1.2929 - regression_loss: 1.0949 - classification_loss: 0.1980 414/500 [=======================>......] - ETA: 21s - loss: 1.2919 - regression_loss: 1.0941 - classification_loss: 0.1978 415/500 [=======================>......] - ETA: 21s - loss: 1.2920 - regression_loss: 1.0942 - classification_loss: 0.1978 416/500 [=======================>......] - ETA: 21s - loss: 1.2932 - regression_loss: 1.0952 - classification_loss: 0.1980 417/500 [========================>.....] - ETA: 20s - loss: 1.2945 - regression_loss: 1.0963 - classification_loss: 0.1983 418/500 [========================>.....] - ETA: 20s - loss: 1.2946 - regression_loss: 1.0963 - classification_loss: 0.1983 419/500 [========================>.....] 
- ETA: 20s - loss: 1.2946 - regression_loss: 1.0963 - classification_loss: 0.1983 420/500 [========================>.....] - ETA: 20s - loss: 1.2941 - regression_loss: 1.0959 - classification_loss: 0.1982 421/500 [========================>.....] - ETA: 19s - loss: 1.2949 - regression_loss: 1.0966 - classification_loss: 0.1983 422/500 [========================>.....] - ETA: 19s - loss: 1.2947 - regression_loss: 1.0965 - classification_loss: 0.1982 423/500 [========================>.....] - ETA: 19s - loss: 1.2955 - regression_loss: 1.0973 - classification_loss: 0.1982 424/500 [========================>.....] - ETA: 19s - loss: 1.2959 - regression_loss: 1.0976 - classification_loss: 0.1983 425/500 [========================>.....] - ETA: 18s - loss: 1.2947 - regression_loss: 1.0967 - classification_loss: 0.1980 426/500 [========================>.....] - ETA: 18s - loss: 1.2962 - regression_loss: 1.0978 - classification_loss: 0.1984 427/500 [========================>.....] - ETA: 18s - loss: 1.2960 - regression_loss: 1.0977 - classification_loss: 0.1982 428/500 [========================>.....] - ETA: 18s - loss: 1.2956 - regression_loss: 1.0974 - classification_loss: 0.1983 429/500 [========================>.....] - ETA: 17s - loss: 1.2947 - regression_loss: 1.0966 - classification_loss: 0.1981 430/500 [========================>.....] - ETA: 17s - loss: 1.2957 - regression_loss: 1.0975 - classification_loss: 0.1982 431/500 [========================>.....] - ETA: 17s - loss: 1.2960 - regression_loss: 1.0976 - classification_loss: 0.1984 432/500 [========================>.....] - ETA: 17s - loss: 1.2944 - regression_loss: 1.0962 - classification_loss: 0.1981 433/500 [========================>.....] - ETA: 16s - loss: 1.2945 - regression_loss: 1.0964 - classification_loss: 0.1981 434/500 [=========================>....] - ETA: 16s - loss: 1.2947 - regression_loss: 1.0966 - classification_loss: 0.1982 435/500 [=========================>....] 
- ETA: 16s - loss: 1.2951 - regression_loss: 1.0969 - classification_loss: 0.1982 436/500 [=========================>....] - ETA: 16s - loss: 1.2958 - regression_loss: 1.0975 - classification_loss: 0.1982 437/500 [=========================>....] - ETA: 15s - loss: 1.2956 - regression_loss: 1.0975 - classification_loss: 0.1981 438/500 [=========================>....] - ETA: 15s - loss: 1.2963 - regression_loss: 1.0980 - classification_loss: 0.1982 439/500 [=========================>....] - ETA: 15s - loss: 1.2965 - regression_loss: 1.0982 - classification_loss: 0.1983 440/500 [=========================>....] - ETA: 15s - loss: 1.2973 - regression_loss: 1.0989 - classification_loss: 0.1985 441/500 [=========================>....] - ETA: 14s - loss: 1.2978 - regression_loss: 1.0992 - classification_loss: 0.1986 442/500 [=========================>....] - ETA: 14s - loss: 1.2971 - regression_loss: 1.0986 - classification_loss: 0.1985 443/500 [=========================>....] - ETA: 14s - loss: 1.2957 - regression_loss: 1.0975 - classification_loss: 0.1983 444/500 [=========================>....] - ETA: 14s - loss: 1.2962 - regression_loss: 1.0978 - classification_loss: 0.1985 445/500 [=========================>....] - ETA: 13s - loss: 1.2973 - regression_loss: 1.0987 - classification_loss: 0.1987 446/500 [=========================>....] - ETA: 13s - loss: 1.2971 - regression_loss: 1.0986 - classification_loss: 0.1985 447/500 [=========================>....] - ETA: 13s - loss: 1.2974 - regression_loss: 1.0987 - classification_loss: 0.1986 448/500 [=========================>....] - ETA: 13s - loss: 1.2983 - regression_loss: 1.0997 - classification_loss: 0.1986 449/500 [=========================>....] - ETA: 12s - loss: 1.2969 - regression_loss: 1.0986 - classification_loss: 0.1983 450/500 [==========================>...] - ETA: 12s - loss: 1.2957 - regression_loss: 1.0975 - classification_loss: 0.1982 451/500 [==========================>...] 
- ETA: 12s - loss: 1.2946 - regression_loss: 1.0965 - classification_loss: 0.1980 452/500 [==========================>...] - ETA: 12s - loss: 1.2941 - regression_loss: 1.0962 - classification_loss: 0.1979 453/500 [==========================>...] - ETA: 11s - loss: 1.2923 - regression_loss: 1.0948 - classification_loss: 0.1975 454/500 [==========================>...] - ETA: 11s - loss: 1.2936 - regression_loss: 1.0957 - classification_loss: 0.1979 455/500 [==========================>...] - ETA: 11s - loss: 1.2940 - regression_loss: 1.0960 - classification_loss: 0.1980 456/500 [==========================>...] - ETA: 11s - loss: 1.2942 - regression_loss: 1.0961 - classification_loss: 0.1981 457/500 [==========================>...] - ETA: 10s - loss: 1.2946 - regression_loss: 1.0964 - classification_loss: 0.1982 458/500 [==========================>...] - ETA: 10s - loss: 1.2942 - regression_loss: 1.0961 - classification_loss: 0.1981 459/500 [==========================>...] - ETA: 10s - loss: 1.2946 - regression_loss: 1.0965 - classification_loss: 0.1981 460/500 [==========================>...] - ETA: 10s - loss: 1.2947 - regression_loss: 1.0967 - classification_loss: 0.1980 461/500 [==========================>...] - ETA: 9s - loss: 1.2934 - regression_loss: 1.0956 - classification_loss: 0.1978  462/500 [==========================>...] - ETA: 9s - loss: 1.2948 - regression_loss: 1.0967 - classification_loss: 0.1981 463/500 [==========================>...] - ETA: 9s - loss: 1.2955 - regression_loss: 1.0973 - classification_loss: 0.1982 464/500 [==========================>...] - ETA: 9s - loss: 1.2945 - regression_loss: 1.0964 - classification_loss: 0.1980 465/500 [==========================>...] - ETA: 8s - loss: 1.2964 - regression_loss: 1.0980 - classification_loss: 0.1984 466/500 [==========================>...] - ETA: 8s - loss: 1.2970 - regression_loss: 1.0985 - classification_loss: 0.1985 467/500 [===========================>..] 
- ETA: 8s - loss: 1.2981 - regression_loss: 1.0993 - classification_loss: 0.1988 468/500 [===========================>..] - ETA: 8s - loss: 1.2988 - regression_loss: 1.1000 - classification_loss: 0.1988 469/500 [===========================>..] - ETA: 7s - loss: 1.2997 - regression_loss: 1.1007 - classification_loss: 0.1990 470/500 [===========================>..] - ETA: 7s - loss: 1.3010 - regression_loss: 1.1019 - classification_loss: 0.1991 471/500 [===========================>..] - ETA: 7s - loss: 1.3013 - regression_loss: 1.1022 - classification_loss: 0.1991 472/500 [===========================>..] - ETA: 7s - loss: 1.3005 - regression_loss: 1.1017 - classification_loss: 0.1988 473/500 [===========================>..] - ETA: 6s - loss: 1.3008 - regression_loss: 1.1019 - classification_loss: 0.1989 474/500 [===========================>..] - ETA: 6s - loss: 1.3016 - regression_loss: 1.1025 - classification_loss: 0.1991 475/500 [===========================>..] - ETA: 6s - loss: 1.3009 - regression_loss: 1.1019 - classification_loss: 0.1990 476/500 [===========================>..] - ETA: 6s - loss: 1.3004 - regression_loss: 1.1016 - classification_loss: 0.1988 477/500 [===========================>..] - ETA: 5s - loss: 1.2994 - regression_loss: 1.1007 - classification_loss: 0.1987 478/500 [===========================>..] - ETA: 5s - loss: 1.2993 - regression_loss: 1.1006 - classification_loss: 0.1987 479/500 [===========================>..] - ETA: 5s - loss: 1.3001 - regression_loss: 1.1013 - classification_loss: 0.1988 480/500 [===========================>..] - ETA: 5s - loss: 1.3005 - regression_loss: 1.1016 - classification_loss: 0.1989 481/500 [===========================>..] - ETA: 4s - loss: 1.2991 - regression_loss: 1.1005 - classification_loss: 0.1986 482/500 [===========================>..] - ETA: 4s - loss: 1.2992 - regression_loss: 1.1007 - classification_loss: 0.1985 483/500 [===========================>..] 
- ETA: 4s - loss: 1.2982 - regression_loss: 1.0999 - classification_loss: 0.1983 484/500 [============================>.] - ETA: 4s - loss: 1.2986 - regression_loss: 1.1002 - classification_loss: 0.1984 485/500 [============================>.] - ETA: 3s - loss: 1.2985 - regression_loss: 1.1002 - classification_loss: 0.1983 486/500 [============================>.] - ETA: 3s - loss: 1.2970 - regression_loss: 1.0990 - classification_loss: 0.1981 487/500 [============================>.] - ETA: 3s - loss: 1.2965 - regression_loss: 1.0986 - classification_loss: 0.1980 488/500 [============================>.] - ETA: 3s - loss: 1.2973 - regression_loss: 1.0992 - classification_loss: 0.1982 489/500 [============================>.] - ETA: 2s - loss: 1.2976 - regression_loss: 1.0993 - classification_loss: 0.1982 490/500 [============================>.] - ETA: 2s - loss: 1.2973 - regression_loss: 1.0991 - classification_loss: 0.1982 491/500 [============================>.] - ETA: 2s - loss: 1.2969 - regression_loss: 1.0987 - classification_loss: 0.1982 492/500 [============================>.] - ETA: 2s - loss: 1.2975 - regression_loss: 1.0992 - classification_loss: 0.1983 493/500 [============================>.] - ETA: 1s - loss: 1.2959 - regression_loss: 1.0979 - classification_loss: 0.1980 494/500 [============================>.] - ETA: 1s - loss: 1.2958 - regression_loss: 1.0978 - classification_loss: 0.1980 495/500 [============================>.] - ETA: 1s - loss: 1.2975 - regression_loss: 1.0991 - classification_loss: 0.1983 496/500 [============================>.] - ETA: 1s - loss: 1.2968 - regression_loss: 1.0986 - classification_loss: 0.1981 497/500 [============================>.] - ETA: 0s - loss: 1.2967 - regression_loss: 1.0987 - classification_loss: 0.1981 498/500 [============================>.] - ETA: 0s - loss: 1.2975 - regression_loss: 1.0993 - classification_loss: 0.1982 499/500 [============================>.] 
500/500 [==============================] - 126s 251ms/step - loss: 1.2979 - regression_loss: 1.0996 - classification_loss: 0.1982
1172 instances of class plum with average precision: 0.6996
mAP: 0.6996
Epoch 00117: saving model to ./training/snapshots/resnet50_pascal_117.h5
Epoch 118/150
- ETA: 1:49 - loss: 1.3185 - regression_loss: 1.1093 - classification_loss: 0.2092 63/500 [==>...........................] - ETA: 1:49 - loss: 1.3268 - regression_loss: 1.1157 - classification_loss: 0.2111 64/500 [==>...........................] - ETA: 1:48 - loss: 1.3257 - regression_loss: 1.1148 - classification_loss: 0.2109 65/500 [==>...........................] - ETA: 1:48 - loss: 1.3404 - regression_loss: 1.1267 - classification_loss: 0.2138 66/500 [==>...........................] - ETA: 1:48 - loss: 1.3401 - regression_loss: 1.1257 - classification_loss: 0.2143 67/500 [===>..........................] - ETA: 1:48 - loss: 1.3343 - regression_loss: 1.1215 - classification_loss: 0.2128 68/500 [===>..........................] - ETA: 1:47 - loss: 1.3310 - regression_loss: 1.1188 - classification_loss: 0.2122 69/500 [===>..........................] - ETA: 1:47 - loss: 1.3306 - regression_loss: 1.1193 - classification_loss: 0.2113 70/500 [===>..........................] - ETA: 1:47 - loss: 1.3326 - regression_loss: 1.1212 - classification_loss: 0.2114 71/500 [===>..........................] - ETA: 1:47 - loss: 1.3229 - regression_loss: 1.1138 - classification_loss: 0.2091 72/500 [===>..........................] - ETA: 1:46 - loss: 1.3129 - regression_loss: 1.1060 - classification_loss: 0.2069 73/500 [===>..........................] - ETA: 1:46 - loss: 1.3213 - regression_loss: 1.1133 - classification_loss: 0.2081 74/500 [===>..........................] - ETA: 1:46 - loss: 1.3180 - regression_loss: 1.1113 - classification_loss: 0.2067 75/500 [===>..........................] - ETA: 1:46 - loss: 1.3198 - regression_loss: 1.1122 - classification_loss: 0.2076 76/500 [===>..........................] - ETA: 1:46 - loss: 1.3106 - regression_loss: 1.1045 - classification_loss: 0.2061 77/500 [===>..........................] - ETA: 1:45 - loss: 1.3075 - regression_loss: 1.1020 - classification_loss: 0.2055 78/500 [===>..........................] 
- ETA: 1:45 - loss: 1.3014 - regression_loss: 1.0974 - classification_loss: 0.2040 79/500 [===>..........................] - ETA: 1:45 - loss: 1.3017 - regression_loss: 1.0981 - classification_loss: 0.2036 80/500 [===>..........................] - ETA: 1:44 - loss: 1.2981 - regression_loss: 1.0957 - classification_loss: 0.2024 81/500 [===>..........................] - ETA: 1:44 - loss: 1.2952 - regression_loss: 1.0935 - classification_loss: 0.2017 82/500 [===>..........................] - ETA: 1:44 - loss: 1.2883 - regression_loss: 1.0873 - classification_loss: 0.2009 83/500 [===>..........................] - ETA: 1:44 - loss: 1.2796 - regression_loss: 1.0799 - classification_loss: 0.1998 84/500 [====>.........................] - ETA: 1:44 - loss: 1.2768 - regression_loss: 1.0781 - classification_loss: 0.1987 85/500 [====>.........................] - ETA: 1:43 - loss: 1.2802 - regression_loss: 1.0798 - classification_loss: 0.2004 86/500 [====>.........................] - ETA: 1:43 - loss: 1.2840 - regression_loss: 1.0839 - classification_loss: 0.2002 87/500 [====>.........................] - ETA: 1:43 - loss: 1.2893 - regression_loss: 1.0884 - classification_loss: 0.2009 88/500 [====>.........................] - ETA: 1:43 - loss: 1.2882 - regression_loss: 1.0877 - classification_loss: 0.2005 89/500 [====>.........................] - ETA: 1:42 - loss: 1.2800 - regression_loss: 1.0812 - classification_loss: 0.1988 90/500 [====>.........................] - ETA: 1:42 - loss: 1.2851 - regression_loss: 1.0836 - classification_loss: 0.2015 91/500 [====>.........................] - ETA: 1:42 - loss: 1.2848 - regression_loss: 1.0838 - classification_loss: 0.2010 92/500 [====>.........................] - ETA: 1:42 - loss: 1.2850 - regression_loss: 1.0840 - classification_loss: 0.2010 93/500 [====>.........................] - ETA: 1:41 - loss: 1.2871 - regression_loss: 1.0859 - classification_loss: 0.2012 94/500 [====>.........................] 
- ETA: 1:41 - loss: 1.2894 - regression_loss: 1.0886 - classification_loss: 0.2009 95/500 [====>.........................] - ETA: 1:41 - loss: 1.2813 - regression_loss: 1.0821 - classification_loss: 0.1992 96/500 [====>.........................] - ETA: 1:41 - loss: 1.2858 - regression_loss: 1.0857 - classification_loss: 0.2001 97/500 [====>.........................] - ETA: 1:40 - loss: 1.2841 - regression_loss: 1.0844 - classification_loss: 0.1998 98/500 [====>.........................] - ETA: 1:40 - loss: 1.2862 - regression_loss: 1.0864 - classification_loss: 0.1998 99/500 [====>.........................] - ETA: 1:40 - loss: 1.2866 - regression_loss: 1.0868 - classification_loss: 0.1998 100/500 [=====>........................] - ETA: 1:40 - loss: 1.2878 - regression_loss: 1.0888 - classification_loss: 0.1990 101/500 [=====>........................] - ETA: 1:40 - loss: 1.2921 - regression_loss: 1.0919 - classification_loss: 0.2002 102/500 [=====>........................] - ETA: 1:39 - loss: 1.2857 - regression_loss: 1.0871 - classification_loss: 0.1987 103/500 [=====>........................] - ETA: 1:39 - loss: 1.2762 - regression_loss: 1.0793 - classification_loss: 0.1969 104/500 [=====>........................] - ETA: 1:39 - loss: 1.2812 - regression_loss: 1.0836 - classification_loss: 0.1976 105/500 [=====>........................] - ETA: 1:39 - loss: 1.2778 - regression_loss: 1.0806 - classification_loss: 0.1972 106/500 [=====>........................] - ETA: 1:38 - loss: 1.2777 - regression_loss: 1.0809 - classification_loss: 0.1967 107/500 [=====>........................] - ETA: 1:38 - loss: 1.2823 - regression_loss: 1.0849 - classification_loss: 0.1975 108/500 [=====>........................] - ETA: 1:38 - loss: 1.2855 - regression_loss: 1.0872 - classification_loss: 0.1983 109/500 [=====>........................] - ETA: 1:37 - loss: 1.2885 - regression_loss: 1.0899 - classification_loss: 0.1986 110/500 [=====>........................] 
- ETA: 1:37 - loss: 1.2830 - regression_loss: 1.0856 - classification_loss: 0.1974 111/500 [=====>........................] - ETA: 1:37 - loss: 1.2808 - regression_loss: 1.0839 - classification_loss: 0.1970 112/500 [=====>........................] - ETA: 1:36 - loss: 1.2814 - regression_loss: 1.0845 - classification_loss: 0.1968 113/500 [=====>........................] - ETA: 1:36 - loss: 1.2799 - regression_loss: 1.0838 - classification_loss: 0.1961 114/500 [=====>........................] - ETA: 1:36 - loss: 1.2796 - regression_loss: 1.0837 - classification_loss: 0.1959 115/500 [=====>........................] - ETA: 1:36 - loss: 1.2826 - regression_loss: 1.0864 - classification_loss: 0.1962 116/500 [=====>........................] - ETA: 1:35 - loss: 1.2826 - regression_loss: 1.0864 - classification_loss: 0.1962 117/500 [======>.......................] - ETA: 1:35 - loss: 1.2885 - regression_loss: 1.0900 - classification_loss: 0.1986 118/500 [======>.......................] - ETA: 1:35 - loss: 1.2878 - regression_loss: 1.0897 - classification_loss: 0.1981 119/500 [======>.......................] - ETA: 1:35 - loss: 1.2828 - regression_loss: 1.0855 - classification_loss: 0.1973 120/500 [======>.......................] - ETA: 1:34 - loss: 1.2842 - regression_loss: 1.0864 - classification_loss: 0.1978 121/500 [======>.......................] - ETA: 1:34 - loss: 1.2775 - regression_loss: 1.0806 - classification_loss: 0.1969 122/500 [======>.......................] - ETA: 1:34 - loss: 1.2784 - regression_loss: 1.0819 - classification_loss: 0.1966 123/500 [======>.......................] - ETA: 1:34 - loss: 1.2723 - regression_loss: 1.0768 - classification_loss: 0.1955 124/500 [======>.......................] - ETA: 1:33 - loss: 1.2722 - regression_loss: 1.0769 - classification_loss: 0.1953 125/500 [======>.......................] - ETA: 1:33 - loss: 1.2743 - regression_loss: 1.0787 - classification_loss: 0.1956 126/500 [======>.......................] 
- ETA: 1:33 - loss: 1.2783 - regression_loss: 1.0818 - classification_loss: 0.1965 127/500 [======>.......................] - ETA: 1:33 - loss: 1.2799 - regression_loss: 1.0831 - classification_loss: 0.1969 128/500 [======>.......................] - ETA: 1:33 - loss: 1.2819 - regression_loss: 1.0846 - classification_loss: 0.1973 129/500 [======>.......................] - ETA: 1:32 - loss: 1.2836 - regression_loss: 1.0863 - classification_loss: 0.1973 130/500 [======>.......................] - ETA: 1:32 - loss: 1.2843 - regression_loss: 1.0871 - classification_loss: 0.1972 131/500 [======>.......................] - ETA: 1:32 - loss: 1.2891 - regression_loss: 1.0905 - classification_loss: 0.1986 132/500 [======>.......................] - ETA: 1:32 - loss: 1.2828 - regression_loss: 1.0853 - classification_loss: 0.1975 133/500 [======>.......................] - ETA: 1:31 - loss: 1.2785 - regression_loss: 1.0817 - classification_loss: 0.1968 134/500 [=======>......................] - ETA: 1:31 - loss: 1.2736 - regression_loss: 1.0777 - classification_loss: 0.1959 135/500 [=======>......................] - ETA: 1:31 - loss: 1.2666 - regression_loss: 1.0718 - classification_loss: 0.1948 136/500 [=======>......................] - ETA: 1:31 - loss: 1.2702 - regression_loss: 1.0741 - classification_loss: 0.1960 137/500 [=======>......................] - ETA: 1:30 - loss: 1.2706 - regression_loss: 1.0744 - classification_loss: 0.1962 138/500 [=======>......................] - ETA: 1:30 - loss: 1.2721 - regression_loss: 1.0759 - classification_loss: 0.1962 139/500 [=======>......................] - ETA: 1:30 - loss: 1.2690 - regression_loss: 1.0737 - classification_loss: 0.1953 140/500 [=======>......................] - ETA: 1:30 - loss: 1.2716 - regression_loss: 1.0760 - classification_loss: 0.1956 141/500 [=======>......................] - ETA: 1:29 - loss: 1.2690 - regression_loss: 1.0736 - classification_loss: 0.1955 142/500 [=======>......................] 
- ETA: 1:29 - loss: 1.2717 - regression_loss: 1.0758 - classification_loss: 0.1959 143/500 [=======>......................] - ETA: 1:29 - loss: 1.2676 - regression_loss: 1.0725 - classification_loss: 0.1951 144/500 [=======>......................] - ETA: 1:29 - loss: 1.2664 - regression_loss: 1.0711 - classification_loss: 0.1953 145/500 [=======>......................] - ETA: 1:28 - loss: 1.2696 - regression_loss: 1.0735 - classification_loss: 0.1960 146/500 [=======>......................] - ETA: 1:28 - loss: 1.2689 - regression_loss: 1.0733 - classification_loss: 0.1956 147/500 [=======>......................] - ETA: 1:28 - loss: 1.2649 - regression_loss: 1.0699 - classification_loss: 0.1950 148/500 [=======>......................] - ETA: 1:28 - loss: 1.2647 - regression_loss: 1.0696 - classification_loss: 0.1951 149/500 [=======>......................] - ETA: 1:27 - loss: 1.2675 - regression_loss: 1.0718 - classification_loss: 0.1956 150/500 [========>.....................] - ETA: 1:27 - loss: 1.2643 - regression_loss: 1.0691 - classification_loss: 0.1952 151/500 [========>.....................] - ETA: 1:27 - loss: 1.2644 - regression_loss: 1.0692 - classification_loss: 0.1952 152/500 [========>.....................] - ETA: 1:27 - loss: 1.2657 - regression_loss: 1.0701 - classification_loss: 0.1957 153/500 [========>.....................] - ETA: 1:26 - loss: 1.2651 - regression_loss: 1.0697 - classification_loss: 0.1954 154/500 [========>.....................] - ETA: 1:26 - loss: 1.2640 - regression_loss: 1.0694 - classification_loss: 0.1945 155/500 [========>.....................] - ETA: 1:26 - loss: 1.2653 - regression_loss: 1.0707 - classification_loss: 0.1947 156/500 [========>.....................] - ETA: 1:26 - loss: 1.2668 - regression_loss: 1.0724 - classification_loss: 0.1944 157/500 [========>.....................] - ETA: 1:25 - loss: 1.2694 - regression_loss: 1.0745 - classification_loss: 0.1949 158/500 [========>.....................] 
- ETA: 1:25 - loss: 1.2703 - regression_loss: 1.0752 - classification_loss: 0.1951 159/500 [========>.....................] - ETA: 1:25 - loss: 1.2712 - regression_loss: 1.0762 - classification_loss: 0.1950 160/500 [========>.....................] - ETA: 1:25 - loss: 1.2690 - regression_loss: 1.0740 - classification_loss: 0.1951 161/500 [========>.....................] - ETA: 1:24 - loss: 1.2688 - regression_loss: 1.0735 - classification_loss: 0.1954 162/500 [========>.....................] - ETA: 1:24 - loss: 1.2686 - regression_loss: 1.0734 - classification_loss: 0.1952 163/500 [========>.....................] - ETA: 1:24 - loss: 1.2701 - regression_loss: 1.0747 - classification_loss: 0.1954 164/500 [========>.....................] - ETA: 1:24 - loss: 1.2697 - regression_loss: 1.0742 - classification_loss: 0.1955 165/500 [========>.....................] - ETA: 1:23 - loss: 1.2667 - regression_loss: 1.0715 - classification_loss: 0.1952 166/500 [========>.....................] - ETA: 1:23 - loss: 1.2680 - regression_loss: 1.0722 - classification_loss: 0.1959 167/500 [=========>....................] - ETA: 1:23 - loss: 1.2715 - regression_loss: 1.0752 - classification_loss: 0.1962 168/500 [=========>....................] - ETA: 1:23 - loss: 1.2713 - regression_loss: 1.0752 - classification_loss: 0.1961 169/500 [=========>....................] - ETA: 1:22 - loss: 1.2740 - regression_loss: 1.0775 - classification_loss: 0.1964 170/500 [=========>....................] - ETA: 1:22 - loss: 1.2687 - regression_loss: 1.0732 - classification_loss: 0.1955 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2722 - regression_loss: 1.0761 - classification_loss: 0.1961 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2733 - regression_loss: 1.0769 - classification_loss: 0.1964 173/500 [=========>....................] - ETA: 1:21 - loss: 1.2769 - regression_loss: 1.0799 - classification_loss: 0.1971 174/500 [=========>....................] 
- ETA: 1:21 - loss: 1.2783 - regression_loss: 1.0810 - classification_loss: 0.1974 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2807 - regression_loss: 1.0831 - classification_loss: 0.1976 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2769 - regression_loss: 1.0801 - classification_loss: 0.1968 177/500 [=========>....................] - ETA: 1:20 - loss: 1.2761 - regression_loss: 1.0795 - classification_loss: 0.1966 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2729 - regression_loss: 1.0766 - classification_loss: 0.1963 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2735 - regression_loss: 1.0771 - classification_loss: 0.1963 180/500 [=========>....................] - ETA: 1:20 - loss: 1.2724 - regression_loss: 1.0761 - classification_loss: 0.1963 181/500 [=========>....................] - ETA: 1:19 - loss: 1.2739 - regression_loss: 1.0773 - classification_loss: 0.1966 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2707 - regression_loss: 1.0750 - classification_loss: 0.1957 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2725 - regression_loss: 1.0767 - classification_loss: 0.1958 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2687 - regression_loss: 1.0735 - classification_loss: 0.1952 185/500 [==========>...................] - ETA: 1:18 - loss: 1.2749 - regression_loss: 1.0792 - classification_loss: 0.1958 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2756 - regression_loss: 1.0800 - classification_loss: 0.1957 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2773 - regression_loss: 1.0807 - classification_loss: 0.1966 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2787 - regression_loss: 1.0818 - classification_loss: 0.1968 189/500 [==========>...................] - ETA: 1:17 - loss: 1.2758 - regression_loss: 1.0794 - classification_loss: 0.1964 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.2774 - regression_loss: 1.0806 - classification_loss: 0.1968 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2753 - regression_loss: 1.0791 - classification_loss: 0.1963 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2759 - regression_loss: 1.0796 - classification_loss: 0.1963 193/500 [==========>...................] - ETA: 1:16 - loss: 1.2771 - regression_loss: 1.0805 - classification_loss: 0.1966 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2765 - regression_loss: 1.0799 - classification_loss: 0.1966 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2773 - regression_loss: 1.0805 - classification_loss: 0.1967 196/500 [==========>...................] - ETA: 1:16 - loss: 1.2780 - regression_loss: 1.0806 - classification_loss: 0.1974 197/500 [==========>...................] - ETA: 1:15 - loss: 1.2756 - regression_loss: 1.0788 - classification_loss: 0.1968 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2792 - regression_loss: 1.0816 - classification_loss: 0.1976 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2774 - regression_loss: 1.0797 - classification_loss: 0.1977 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2733 - regression_loss: 1.0763 - classification_loss: 0.1970 201/500 [===========>..................] - ETA: 1:14 - loss: 1.2735 - regression_loss: 1.0766 - classification_loss: 0.1969 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2742 - regression_loss: 1.0772 - classification_loss: 0.1970 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2718 - regression_loss: 1.0753 - classification_loss: 0.1965 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2730 - regression_loss: 1.0764 - classification_loss: 0.1966 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2709 - regression_loss: 1.0747 - classification_loss: 0.1962 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.2702 - regression_loss: 1.0743 - classification_loss: 0.1959 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2692 - regression_loss: 1.0733 - classification_loss: 0.1959 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2685 - regression_loss: 1.0729 - classification_loss: 0.1956 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2672 - regression_loss: 1.0718 - classification_loss: 0.1954 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2683 - regression_loss: 1.0728 - classification_loss: 0.1955 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2689 - regression_loss: 1.0732 - classification_loss: 0.1956 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2703 - regression_loss: 1.0744 - classification_loss: 0.1959 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2681 - regression_loss: 1.0729 - classification_loss: 0.1952 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2699 - regression_loss: 1.0742 - classification_loss: 0.1957 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2694 - regression_loss: 1.0739 - classification_loss: 0.1955 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2706 - regression_loss: 1.0748 - classification_loss: 0.1958 217/500 [============>.................] - ETA: 1:10 - loss: 1.2715 - regression_loss: 1.0755 - classification_loss: 0.1959 218/500 [============>.................] - ETA: 1:10 - loss: 1.2721 - regression_loss: 1.0762 - classification_loss: 0.1959 219/500 [============>.................] - ETA: 1:10 - loss: 1.2732 - regression_loss: 1.0771 - classification_loss: 0.1961 220/500 [============>.................] - ETA: 1:10 - loss: 1.2743 - regression_loss: 1.0779 - classification_loss: 0.1964 221/500 [============>.................] - ETA: 1:09 - loss: 1.2730 - regression_loss: 1.0770 - classification_loss: 0.1960 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.2761 - regression_loss: 1.0792 - classification_loss: 0.1969 223/500 [============>.................] - ETA: 1:09 - loss: 1.2767 - regression_loss: 1.0798 - classification_loss: 0.1969 224/500 [============>.................] - ETA: 1:09 - loss: 1.2741 - regression_loss: 1.0777 - classification_loss: 0.1964 225/500 [============>.................] - ETA: 1:08 - loss: 1.2760 - regression_loss: 1.0792 - classification_loss: 0.1968 226/500 [============>.................] - ETA: 1:08 - loss: 1.2785 - regression_loss: 1.0806 - classification_loss: 0.1979 227/500 [============>.................] - ETA: 1:08 - loss: 1.2766 - regression_loss: 1.0792 - classification_loss: 0.1975 228/500 [============>.................] - ETA: 1:08 - loss: 1.2747 - regression_loss: 1.0777 - classification_loss: 0.1970 229/500 [============>.................] - ETA: 1:07 - loss: 1.2748 - regression_loss: 1.0778 - classification_loss: 0.1970 230/500 [============>.................] - ETA: 1:07 - loss: 1.2767 - regression_loss: 1.0791 - classification_loss: 0.1975 231/500 [============>.................] - ETA: 1:07 - loss: 1.2778 - regression_loss: 1.0801 - classification_loss: 0.1977 232/500 [============>.................] - ETA: 1:07 - loss: 1.2767 - regression_loss: 1.0789 - classification_loss: 0.1978 233/500 [============>.................] - ETA: 1:06 - loss: 1.2809 - regression_loss: 1.0824 - classification_loss: 0.1985 234/500 [=============>................] - ETA: 1:06 - loss: 1.2772 - regression_loss: 1.0794 - classification_loss: 0.1978 235/500 [=============>................] - ETA: 1:06 - loss: 1.2784 - regression_loss: 1.0800 - classification_loss: 0.1984 236/500 [=============>................] - ETA: 1:06 - loss: 1.2804 - regression_loss: 1.0816 - classification_loss: 0.1987 237/500 [=============>................] - ETA: 1:05 - loss: 1.2809 - regression_loss: 1.0820 - classification_loss: 0.1989 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.2825 - regression_loss: 1.0832 - classification_loss: 0.1992 239/500 [=============>................] - ETA: 1:05 - loss: 1.2798 - regression_loss: 1.0807 - classification_loss: 0.1991 240/500 [=============>................] - ETA: 1:05 - loss: 1.2812 - regression_loss: 1.0813 - classification_loss: 0.1999 241/500 [=============>................] - ETA: 1:04 - loss: 1.2788 - regression_loss: 1.0795 - classification_loss: 0.1993 242/500 [=============>................] - ETA: 1:04 - loss: 1.2789 - regression_loss: 1.0797 - classification_loss: 0.1992 243/500 [=============>................] - ETA: 1:04 - loss: 1.2805 - regression_loss: 1.0812 - classification_loss: 0.1993 244/500 [=============>................] - ETA: 1:04 - loss: 1.2824 - regression_loss: 1.0829 - classification_loss: 0.1996 245/500 [=============>................] - ETA: 1:03 - loss: 1.2819 - regression_loss: 1.0825 - classification_loss: 0.1994 246/500 [=============>................] - ETA: 1:03 - loss: 1.2835 - regression_loss: 1.0840 - classification_loss: 0.1995 247/500 [=============>................] - ETA: 1:03 - loss: 1.2843 - regression_loss: 1.0847 - classification_loss: 0.1996 248/500 [=============>................] - ETA: 1:03 - loss: 1.2842 - regression_loss: 1.0846 - classification_loss: 0.1996 249/500 [=============>................] - ETA: 1:02 - loss: 1.2822 - regression_loss: 1.0830 - classification_loss: 0.1992 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2852 - regression_loss: 1.0853 - classification_loss: 0.1999 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2870 - regression_loss: 1.0867 - classification_loss: 0.2003 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2869 - regression_loss: 1.0867 - classification_loss: 0.2002 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2872 - regression_loss: 1.0867 - classification_loss: 0.2005 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.2867 - regression_loss: 1.0863 - classification_loss: 0.2004 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2893 - regression_loss: 1.0884 - classification_loss: 0.2008 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2893 - regression_loss: 1.0886 - classification_loss: 0.2007 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2900 - regression_loss: 1.0893 - classification_loss: 0.2007 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2872 - regression_loss: 1.0871 - classification_loss: 0.2001 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2861 - regression_loss: 1.0863 - classification_loss: 0.1999 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2872 - regression_loss: 1.0872 - classification_loss: 0.2000 261/500 [==============>...............] - ETA: 59s - loss: 1.2898 - regression_loss: 1.0892 - classification_loss: 0.2006  262/500 [==============>...............] - ETA: 59s - loss: 1.2891 - regression_loss: 1.0887 - classification_loss: 0.2004 263/500 [==============>...............] - ETA: 59s - loss: 1.2881 - regression_loss: 1.0881 - classification_loss: 0.2000 264/500 [==============>...............] - ETA: 59s - loss: 1.2899 - regression_loss: 1.0893 - classification_loss: 0.2006 265/500 [==============>...............] - ETA: 58s - loss: 1.2914 - regression_loss: 1.0908 - classification_loss: 0.2006 266/500 [==============>...............] - ETA: 58s - loss: 1.2935 - regression_loss: 1.0927 - classification_loss: 0.2008 267/500 [===============>..............] - ETA: 58s - loss: 1.2904 - regression_loss: 1.0902 - classification_loss: 0.2002 268/500 [===============>..............] - ETA: 58s - loss: 1.2868 - regression_loss: 1.0871 - classification_loss: 0.1997 269/500 [===============>..............] - ETA: 57s - loss: 1.2846 - regression_loss: 1.0853 - classification_loss: 0.1993 270/500 [===============>..............] 
500/500 [==============================] - 125s 251ms/step - loss: 1.2756 - regression_loss: 1.0785 - classification_loss: 0.1971
1172 instances of class plum with average precision: 0.6974
mAP: 0.6974
Epoch 00118: saving model to ./training/snapshots/resnet50_pascal_118.h5
Epoch 119/150
1/500 [..............................] - ETA: 2:00 - loss: 0.5637 - regression_loss: 0.5130 - classification_loss: 0.0507
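As a quick sanity check on the logged numbers: the reported total is simply the sum of the two loss heads, so the epoch-118 figures above can be verified directly (a minimal sketch using the values from this log, not part of the training script):

```python
# RetinaNet's training loss is the sum of its two heads:
#   loss = regression_loss + classification_loss
# Verify the final epoch-118 values printed by the progress bar.
regression_loss = 1.0785
classification_loss = 0.1971

total = round(regression_loss + classification_loss, 4)
print(total)  # 1.2756, matching the logged "loss" column
```

The same identity holds for every per-batch update the progress bar emits.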
- ETA: 1:31 - loss: 1.2662 - regression_loss: 1.0756 - classification_loss: 0.1906 106/500 [=====>........................] - ETA: 1:31 - loss: 1.2660 - regression_loss: 1.0761 - classification_loss: 0.1899 107/500 [=====>........................] - ETA: 1:30 - loss: 1.2655 - regression_loss: 1.0758 - classification_loss: 0.1898 108/500 [=====>........................] - ETA: 1:30 - loss: 1.2680 - regression_loss: 1.0779 - classification_loss: 0.1901 109/500 [=====>........................] - ETA: 1:30 - loss: 1.2736 - regression_loss: 1.0827 - classification_loss: 0.1909 110/500 [=====>........................] - ETA: 1:30 - loss: 1.2747 - regression_loss: 1.0832 - classification_loss: 0.1915 111/500 [=====>........................] - ETA: 1:29 - loss: 1.2785 - regression_loss: 1.0865 - classification_loss: 0.1920 112/500 [=====>........................] - ETA: 1:29 - loss: 1.2783 - regression_loss: 1.0867 - classification_loss: 0.1916 113/500 [=====>........................] - ETA: 1:29 - loss: 1.2835 - regression_loss: 1.0906 - classification_loss: 0.1929 114/500 [=====>........................] - ETA: 1:28 - loss: 1.2842 - regression_loss: 1.0913 - classification_loss: 0.1929 115/500 [=====>........................] - ETA: 1:28 - loss: 1.2902 - regression_loss: 1.0954 - classification_loss: 0.1948 116/500 [=====>........................] - ETA: 1:28 - loss: 1.2835 - regression_loss: 1.0899 - classification_loss: 0.1937 117/500 [======>.......................] - ETA: 1:28 - loss: 1.2896 - regression_loss: 1.0947 - classification_loss: 0.1949 118/500 [======>.......................] - ETA: 1:27 - loss: 1.2914 - regression_loss: 1.0965 - classification_loss: 0.1950 119/500 [======>.......................] - ETA: 1:27 - loss: 1.2942 - regression_loss: 1.0985 - classification_loss: 0.1957 120/500 [======>.......................] - ETA: 1:27 - loss: 1.2909 - regression_loss: 1.0959 - classification_loss: 0.1950 121/500 [======>.......................] 
- ETA: 1:26 - loss: 1.2967 - regression_loss: 1.1006 - classification_loss: 0.1960 122/500 [======>.......................] - ETA: 1:26 - loss: 1.2988 - regression_loss: 1.1019 - classification_loss: 0.1969 123/500 [======>.......................] - ETA: 1:26 - loss: 1.3013 - regression_loss: 1.1039 - classification_loss: 0.1974 124/500 [======>.......................] - ETA: 1:26 - loss: 1.3037 - regression_loss: 1.1057 - classification_loss: 0.1980 125/500 [======>.......................] - ETA: 1:25 - loss: 1.3049 - regression_loss: 1.1071 - classification_loss: 0.1978 126/500 [======>.......................] - ETA: 1:25 - loss: 1.3011 - regression_loss: 1.1041 - classification_loss: 0.1971 127/500 [======>.......................] - ETA: 1:25 - loss: 1.3031 - regression_loss: 1.1044 - classification_loss: 0.1987 128/500 [======>.......................] - ETA: 1:25 - loss: 1.3084 - regression_loss: 1.1084 - classification_loss: 0.2001 129/500 [======>.......................] - ETA: 1:24 - loss: 1.3053 - regression_loss: 1.1060 - classification_loss: 0.1993 130/500 [======>.......................] - ETA: 1:24 - loss: 1.3027 - regression_loss: 1.1038 - classification_loss: 0.1988 131/500 [======>.......................] - ETA: 1:24 - loss: 1.2998 - regression_loss: 1.1019 - classification_loss: 0.1979 132/500 [======>.......................] - ETA: 1:23 - loss: 1.2981 - regression_loss: 1.1008 - classification_loss: 0.1973 133/500 [======>.......................] - ETA: 1:23 - loss: 1.2964 - regression_loss: 1.0976 - classification_loss: 0.1987 134/500 [=======>......................] - ETA: 1:23 - loss: 1.3001 - regression_loss: 1.1009 - classification_loss: 0.1992 135/500 [=======>......................] - ETA: 1:23 - loss: 1.3041 - regression_loss: 1.1040 - classification_loss: 0.2002 136/500 [=======>......................] - ETA: 1:22 - loss: 1.3031 - regression_loss: 1.1031 - classification_loss: 0.2000 137/500 [=======>......................] 
- ETA: 1:22 - loss: 1.3064 - regression_loss: 1.1056 - classification_loss: 0.2007 138/500 [=======>......................] - ETA: 1:22 - loss: 1.3078 - regression_loss: 1.1070 - classification_loss: 0.2008 139/500 [=======>......................] - ETA: 1:22 - loss: 1.3074 - regression_loss: 1.1068 - classification_loss: 0.2006 140/500 [=======>......................] - ETA: 1:21 - loss: 1.3087 - regression_loss: 1.1079 - classification_loss: 0.2009 141/500 [=======>......................] - ETA: 1:21 - loss: 1.3098 - regression_loss: 1.1088 - classification_loss: 0.2010 142/500 [=======>......................] - ETA: 1:21 - loss: 1.3057 - regression_loss: 1.1055 - classification_loss: 0.2002 143/500 [=======>......................] - ETA: 1:20 - loss: 1.3076 - regression_loss: 1.1071 - classification_loss: 0.2006 144/500 [=======>......................] - ETA: 1:20 - loss: 1.3121 - regression_loss: 1.1109 - classification_loss: 0.2012 145/500 [=======>......................] - ETA: 1:20 - loss: 1.3166 - regression_loss: 1.1147 - classification_loss: 0.2019 146/500 [=======>......................] - ETA: 1:20 - loss: 1.3176 - regression_loss: 1.1156 - classification_loss: 0.2021 147/500 [=======>......................] - ETA: 1:20 - loss: 1.3197 - regression_loss: 1.1176 - classification_loss: 0.2021 148/500 [=======>......................] - ETA: 1:19 - loss: 1.3173 - regression_loss: 1.1155 - classification_loss: 0.2019 149/500 [=======>......................] - ETA: 1:19 - loss: 1.3148 - regression_loss: 1.1134 - classification_loss: 0.2014 150/500 [========>.....................] - ETA: 1:19 - loss: 1.3111 - regression_loss: 1.1106 - classification_loss: 0.2005 151/500 [========>.....................] - ETA: 1:18 - loss: 1.3055 - regression_loss: 1.1062 - classification_loss: 0.1994 152/500 [========>.....................] - ETA: 1:18 - loss: 1.3092 - regression_loss: 1.1090 - classification_loss: 0.2001 153/500 [========>.....................] 
- ETA: 1:18 - loss: 1.3111 - regression_loss: 1.1101 - classification_loss: 0.2010 154/500 [========>.....................] - ETA: 1:18 - loss: 1.3118 - regression_loss: 1.1106 - classification_loss: 0.2012 155/500 [========>.....................] - ETA: 1:17 - loss: 1.3142 - regression_loss: 1.1125 - classification_loss: 0.2017 156/500 [========>.....................] - ETA: 1:17 - loss: 1.3097 - regression_loss: 1.1087 - classification_loss: 0.2010 157/500 [========>.....................] - ETA: 1:17 - loss: 1.3098 - regression_loss: 1.1088 - classification_loss: 0.2010 158/500 [========>.....................] - ETA: 1:17 - loss: 1.3068 - regression_loss: 1.1061 - classification_loss: 0.2007 159/500 [========>.....................] - ETA: 1:16 - loss: 1.3077 - regression_loss: 1.1070 - classification_loss: 0.2006 160/500 [========>.....................] - ETA: 1:16 - loss: 1.3078 - regression_loss: 1.1069 - classification_loss: 0.2009 161/500 [========>.....................] - ETA: 1:16 - loss: 1.3087 - regression_loss: 1.1076 - classification_loss: 0.2010 162/500 [========>.....................] - ETA: 1:16 - loss: 1.3072 - regression_loss: 1.1066 - classification_loss: 0.2006 163/500 [========>.....................] - ETA: 1:15 - loss: 1.3097 - regression_loss: 1.1088 - classification_loss: 0.2010 164/500 [========>.....................] - ETA: 1:15 - loss: 1.3075 - regression_loss: 1.1059 - classification_loss: 0.2015 165/500 [========>.....................] - ETA: 1:15 - loss: 1.3051 - regression_loss: 1.1043 - classification_loss: 0.2008 166/500 [========>.....................] - ETA: 1:15 - loss: 1.2989 - regression_loss: 1.0989 - classification_loss: 0.2000 167/500 [=========>....................] - ETA: 1:14 - loss: 1.2967 - regression_loss: 1.0973 - classification_loss: 0.1994 168/500 [=========>....................] - ETA: 1:14 - loss: 1.2976 - regression_loss: 1.0980 - classification_loss: 0.1996 169/500 [=========>....................] 
- ETA: 1:14 - loss: 1.2951 - regression_loss: 1.0963 - classification_loss: 0.1989 170/500 [=========>....................] - ETA: 1:14 - loss: 1.2944 - regression_loss: 1.0957 - classification_loss: 0.1987 171/500 [=========>....................] - ETA: 1:13 - loss: 1.2960 - regression_loss: 1.0970 - classification_loss: 0.1990 172/500 [=========>....................] - ETA: 1:13 - loss: 1.2951 - regression_loss: 1.0962 - classification_loss: 0.1989 173/500 [=========>....................] - ETA: 1:13 - loss: 1.2969 - regression_loss: 1.0978 - classification_loss: 0.1991 174/500 [=========>....................] - ETA: 1:13 - loss: 1.2974 - regression_loss: 1.0982 - classification_loss: 0.1993 175/500 [=========>....................] - ETA: 1:12 - loss: 1.2982 - regression_loss: 1.0987 - classification_loss: 0.1995 176/500 [=========>....................] - ETA: 1:12 - loss: 1.2971 - regression_loss: 1.0979 - classification_loss: 0.1993 177/500 [=========>....................] - ETA: 1:12 - loss: 1.2975 - regression_loss: 1.0981 - classification_loss: 0.1994 178/500 [=========>....................] - ETA: 1:12 - loss: 1.2987 - regression_loss: 1.0995 - classification_loss: 0.1992 179/500 [=========>....................] - ETA: 1:11 - loss: 1.3046 - regression_loss: 1.1044 - classification_loss: 0.2001 180/500 [=========>....................] - ETA: 1:11 - loss: 1.3027 - regression_loss: 1.1023 - classification_loss: 0.2004 181/500 [=========>....................] - ETA: 1:11 - loss: 1.3052 - regression_loss: 1.1044 - classification_loss: 0.2008 182/500 [=========>....................] - ETA: 1:11 - loss: 1.3044 - regression_loss: 1.1040 - classification_loss: 0.2004 183/500 [=========>....................] - ETA: 1:10 - loss: 1.3045 - regression_loss: 1.1039 - classification_loss: 0.2006 184/500 [==========>...................] - ETA: 1:10 - loss: 1.3054 - regression_loss: 1.1047 - classification_loss: 0.2007 185/500 [==========>...................] 
- ETA: 1:10 - loss: 1.3061 - regression_loss: 1.1054 - classification_loss: 0.2007 186/500 [==========>...................] - ETA: 1:10 - loss: 1.3086 - regression_loss: 1.1073 - classification_loss: 0.2013 187/500 [==========>...................] - ETA: 1:09 - loss: 1.3044 - regression_loss: 1.1041 - classification_loss: 0.2004 188/500 [==========>...................] - ETA: 1:09 - loss: 1.3068 - regression_loss: 1.1056 - classification_loss: 0.2012 189/500 [==========>...................] - ETA: 1:09 - loss: 1.3042 - regression_loss: 1.1037 - classification_loss: 0.2005 190/500 [==========>...................] - ETA: 1:09 - loss: 1.3058 - regression_loss: 1.1051 - classification_loss: 0.2006 191/500 [==========>...................] - ETA: 1:08 - loss: 1.3060 - regression_loss: 1.1050 - classification_loss: 0.2010 192/500 [==========>...................] - ETA: 1:08 - loss: 1.3043 - regression_loss: 1.1036 - classification_loss: 0.2008 193/500 [==========>...................] - ETA: 1:08 - loss: 1.3064 - regression_loss: 1.1050 - classification_loss: 0.2014 194/500 [==========>...................] - ETA: 1:08 - loss: 1.3063 - regression_loss: 1.1050 - classification_loss: 0.2013 195/500 [==========>...................] - ETA: 1:08 - loss: 1.3065 - regression_loss: 1.1053 - classification_loss: 0.2012 196/500 [==========>...................] - ETA: 1:07 - loss: 1.3074 - regression_loss: 1.1062 - classification_loss: 0.2012 197/500 [==========>...................] - ETA: 1:07 - loss: 1.3080 - regression_loss: 1.1070 - classification_loss: 0.2011 198/500 [==========>...................] - ETA: 1:07 - loss: 1.3096 - regression_loss: 1.1081 - classification_loss: 0.2014 199/500 [==========>...................] - ETA: 1:07 - loss: 1.3100 - regression_loss: 1.1085 - classification_loss: 0.2015 200/500 [===========>..................] - ETA: 1:06 - loss: 1.3097 - regression_loss: 1.1083 - classification_loss: 0.2014 201/500 [===========>..................] 
- ETA: 1:06 - loss: 1.3079 - regression_loss: 1.1063 - classification_loss: 0.2016 202/500 [===========>..................] - ETA: 1:06 - loss: 1.3060 - regression_loss: 1.1050 - classification_loss: 0.2010 203/500 [===========>..................] - ETA: 1:06 - loss: 1.3034 - regression_loss: 1.1031 - classification_loss: 0.2003 204/500 [===========>..................] - ETA: 1:05 - loss: 1.3008 - regression_loss: 1.1008 - classification_loss: 0.2000 205/500 [===========>..................] - ETA: 1:05 - loss: 1.2979 - regression_loss: 1.0986 - classification_loss: 0.1993 206/500 [===========>..................] - ETA: 1:05 - loss: 1.2929 - regression_loss: 1.0944 - classification_loss: 0.1985 207/500 [===========>..................] - ETA: 1:05 - loss: 1.2931 - regression_loss: 1.0944 - classification_loss: 0.1987 208/500 [===========>..................] - ETA: 1:05 - loss: 1.2938 - regression_loss: 1.0951 - classification_loss: 0.1987 209/500 [===========>..................] - ETA: 1:04 - loss: 1.2935 - regression_loss: 1.0948 - classification_loss: 0.1987 210/500 [===========>..................] - ETA: 1:04 - loss: 1.2967 - regression_loss: 1.0972 - classification_loss: 0.1996 211/500 [===========>..................] - ETA: 1:04 - loss: 1.2960 - regression_loss: 1.0966 - classification_loss: 0.1995 212/500 [===========>..................] - ETA: 1:04 - loss: 1.2957 - regression_loss: 1.0960 - classification_loss: 0.1997 213/500 [===========>..................] - ETA: 1:03 - loss: 1.2934 - regression_loss: 1.0941 - classification_loss: 0.1993 214/500 [===========>..................] - ETA: 1:03 - loss: 1.2933 - regression_loss: 1.0940 - classification_loss: 0.1992 215/500 [===========>..................] - ETA: 1:03 - loss: 1.2948 - regression_loss: 1.0955 - classification_loss: 0.1993 216/500 [===========>..................] - ETA: 1:03 - loss: 1.2924 - regression_loss: 1.0933 - classification_loss: 0.1991 217/500 [============>.................] 
- ETA: 1:03 - loss: 1.2946 - regression_loss: 1.0949 - classification_loss: 0.1997 218/500 [============>.................] - ETA: 1:02 - loss: 1.2944 - regression_loss: 1.0946 - classification_loss: 0.1998 219/500 [============>.................] - ETA: 1:02 - loss: 1.2958 - regression_loss: 1.0956 - classification_loss: 0.2002 220/500 [============>.................] - ETA: 1:02 - loss: 1.2953 - regression_loss: 1.0953 - classification_loss: 0.2000 221/500 [============>.................] - ETA: 1:02 - loss: 1.2929 - regression_loss: 1.0933 - classification_loss: 0.1996 222/500 [============>.................] - ETA: 1:01 - loss: 1.2930 - regression_loss: 1.0934 - classification_loss: 0.1996 223/500 [============>.................] - ETA: 1:01 - loss: 1.2921 - regression_loss: 1.0927 - classification_loss: 0.1993 224/500 [============>.................] - ETA: 1:01 - loss: 1.2928 - regression_loss: 1.0935 - classification_loss: 0.1993 225/500 [============>.................] - ETA: 1:01 - loss: 1.2958 - regression_loss: 1.0960 - classification_loss: 0.1998 226/500 [============>.................] - ETA: 1:00 - loss: 1.2976 - regression_loss: 1.0972 - classification_loss: 0.2004 227/500 [============>.................] - ETA: 1:00 - loss: 1.2960 - regression_loss: 1.0956 - classification_loss: 0.2004 228/500 [============>.................] - ETA: 1:00 - loss: 1.2983 - regression_loss: 1.0976 - classification_loss: 0.2007 229/500 [============>.................] - ETA: 1:00 - loss: 1.2993 - regression_loss: 1.0985 - classification_loss: 0.2008 230/500 [============>.................] - ETA: 59s - loss: 1.3002 - regression_loss: 1.0996 - classification_loss: 0.2006  231/500 [============>.................] - ETA: 59s - loss: 1.2987 - regression_loss: 1.0983 - classification_loss: 0.2003 232/500 [============>.................] - ETA: 59s - loss: 1.2971 - regression_loss: 1.0968 - classification_loss: 0.2003 233/500 [============>.................] 
- ETA: 59s - loss: 1.2986 - regression_loss: 1.0980 - classification_loss: 0.2007 234/500 [=============>................] - ETA: 59s - loss: 1.2970 - regression_loss: 1.0963 - classification_loss: 0.2007 235/500 [=============>................] - ETA: 58s - loss: 1.2950 - regression_loss: 1.0947 - classification_loss: 0.2003 236/500 [=============>................] - ETA: 58s - loss: 1.2922 - regression_loss: 1.0924 - classification_loss: 0.1998 237/500 [=============>................] - ETA: 58s - loss: 1.2944 - regression_loss: 1.0943 - classification_loss: 0.2001 238/500 [=============>................] - ETA: 58s - loss: 1.2932 - regression_loss: 1.0934 - classification_loss: 0.1998 239/500 [=============>................] - ETA: 57s - loss: 1.2908 - regression_loss: 1.0914 - classification_loss: 0.1994 240/500 [=============>................] - ETA: 57s - loss: 1.2903 - regression_loss: 1.0911 - classification_loss: 0.1992 241/500 [=============>................] - ETA: 57s - loss: 1.2908 - regression_loss: 1.0914 - classification_loss: 0.1994 242/500 [=============>................] - ETA: 57s - loss: 1.2887 - regression_loss: 1.0897 - classification_loss: 0.1991 243/500 [=============>................] - ETA: 56s - loss: 1.2861 - regression_loss: 1.0874 - classification_loss: 0.1987 244/500 [=============>................] - ETA: 56s - loss: 1.2894 - regression_loss: 1.0900 - classification_loss: 0.1995 245/500 [=============>................] - ETA: 56s - loss: 1.2892 - regression_loss: 1.0898 - classification_loss: 0.1994 246/500 [=============>................] - ETA: 56s - loss: 1.2878 - regression_loss: 1.0888 - classification_loss: 0.1990 247/500 [=============>................] - ETA: 56s - loss: 1.2859 - regression_loss: 1.0872 - classification_loss: 0.1987 248/500 [=============>................] - ETA: 55s - loss: 1.2861 - regression_loss: 1.0874 - classification_loss: 0.1986 249/500 [=============>................] 
- ETA: 55s - loss: 1.2865 - regression_loss: 1.0876 - classification_loss: 0.1989 250/500 [==============>...............] - ETA: 55s - loss: 1.2864 - regression_loss: 1.0878 - classification_loss: 0.1986 251/500 [==============>...............] - ETA: 55s - loss: 1.2851 - regression_loss: 1.0865 - classification_loss: 0.1986 252/500 [==============>...............] - ETA: 54s - loss: 1.2849 - regression_loss: 1.0864 - classification_loss: 0.1985 253/500 [==============>...............] - ETA: 54s - loss: 1.2863 - regression_loss: 1.0876 - classification_loss: 0.1987 254/500 [==============>...............] - ETA: 54s - loss: 1.2866 - regression_loss: 1.0879 - classification_loss: 0.1987 255/500 [==============>...............] - ETA: 54s - loss: 1.2858 - regression_loss: 1.0872 - classification_loss: 0.1986 256/500 [==============>...............] - ETA: 53s - loss: 1.2846 - regression_loss: 1.0862 - classification_loss: 0.1983 257/500 [==============>...............] - ETA: 53s - loss: 1.2855 - regression_loss: 1.0870 - classification_loss: 0.1984 258/500 [==============>...............] - ETA: 53s - loss: 1.2882 - regression_loss: 1.0892 - classification_loss: 0.1990 259/500 [==============>...............] - ETA: 53s - loss: 1.2884 - regression_loss: 1.0894 - classification_loss: 0.1989 260/500 [==============>...............] - ETA: 52s - loss: 1.2858 - regression_loss: 1.0872 - classification_loss: 0.1985 261/500 [==============>...............] - ETA: 52s - loss: 1.2831 - regression_loss: 1.0850 - classification_loss: 0.1981 262/500 [==============>...............] - ETA: 52s - loss: 1.2867 - regression_loss: 1.0883 - classification_loss: 0.1984 263/500 [==============>...............] - ETA: 52s - loss: 1.2881 - regression_loss: 1.0892 - classification_loss: 0.1989 264/500 [==============>...............] - ETA: 52s - loss: 1.2885 - regression_loss: 1.0895 - classification_loss: 0.1990 265/500 [==============>...............] 
- ETA: 51s - loss: 1.2869 - regression_loss: 1.0882 - classification_loss: 0.1987 266/500 [==============>...............] - ETA: 51s - loss: 1.2887 - regression_loss: 1.0897 - classification_loss: 0.1990 267/500 [===============>..............] - ETA: 51s - loss: 1.2874 - regression_loss: 1.0887 - classification_loss: 0.1986 268/500 [===============>..............] - ETA: 51s - loss: 1.2845 - regression_loss: 1.0863 - classification_loss: 0.1982 269/500 [===============>..............] - ETA: 50s - loss: 1.2841 - regression_loss: 1.0858 - classification_loss: 0.1982 270/500 [===============>..............] - ETA: 50s - loss: 1.2842 - regression_loss: 1.0859 - classification_loss: 0.1983 271/500 [===============>..............] - ETA: 50s - loss: 1.2854 - regression_loss: 1.0869 - classification_loss: 0.1985 272/500 [===============>..............] - ETA: 50s - loss: 1.2827 - regression_loss: 1.0848 - classification_loss: 0.1979 273/500 [===============>..............] - ETA: 50s - loss: 1.2843 - regression_loss: 1.0860 - classification_loss: 0.1983 274/500 [===============>..............] - ETA: 49s - loss: 1.2809 - regression_loss: 1.0832 - classification_loss: 0.1978 275/500 [===============>..............] - ETA: 49s - loss: 1.2825 - regression_loss: 1.0845 - classification_loss: 0.1980 276/500 [===============>..............] - ETA: 49s - loss: 1.2836 - regression_loss: 1.0855 - classification_loss: 0.1982 277/500 [===============>..............] - ETA: 49s - loss: 1.2843 - regression_loss: 1.0863 - classification_loss: 0.1980 278/500 [===============>..............] - ETA: 48s - loss: 1.2841 - regression_loss: 1.0863 - classification_loss: 0.1979 279/500 [===============>..............] - ETA: 48s - loss: 1.2807 - regression_loss: 1.0833 - classification_loss: 0.1973 280/500 [===============>..............] - ETA: 48s - loss: 1.2821 - regression_loss: 1.0847 - classification_loss: 0.1975 281/500 [===============>..............] 
- ETA: 48s - loss: 1.2821 - regression_loss: 1.0847 - classification_loss: 0.1974 282/500 [===============>..............] - ETA: 47s - loss: 1.2838 - regression_loss: 1.0862 - classification_loss: 0.1976 283/500 [===============>..............] - ETA: 47s - loss: 1.2829 - regression_loss: 1.0854 - classification_loss: 0.1975 284/500 [================>.............] - ETA: 47s - loss: 1.2842 - regression_loss: 1.0864 - classification_loss: 0.1979 285/500 [================>.............] - ETA: 47s - loss: 1.2816 - regression_loss: 1.0840 - classification_loss: 0.1976 286/500 [================>.............] - ETA: 47s - loss: 1.2803 - regression_loss: 1.0829 - classification_loss: 0.1974 287/500 [================>.............] - ETA: 46s - loss: 1.2794 - regression_loss: 1.0821 - classification_loss: 0.1973 288/500 [================>.............] - ETA: 46s - loss: 1.2804 - regression_loss: 1.0831 - classification_loss: 0.1973 289/500 [================>.............] - ETA: 46s - loss: 1.2814 - regression_loss: 1.0840 - classification_loss: 0.1974 290/500 [================>.............] - ETA: 46s - loss: 1.2802 - regression_loss: 1.0830 - classification_loss: 0.1972 291/500 [================>.............] - ETA: 45s - loss: 1.2832 - regression_loss: 1.0847 - classification_loss: 0.1985 292/500 [================>.............] - ETA: 45s - loss: 1.2831 - regression_loss: 1.0846 - classification_loss: 0.1985 293/500 [================>.............] - ETA: 45s - loss: 1.2812 - regression_loss: 1.0826 - classification_loss: 0.1986 294/500 [================>.............] - ETA: 45s - loss: 1.2824 - regression_loss: 1.0836 - classification_loss: 0.1989 295/500 [================>.............] - ETA: 45s - loss: 1.2818 - regression_loss: 1.0830 - classification_loss: 0.1987 296/500 [================>.............] - ETA: 44s - loss: 1.2813 - regression_loss: 1.0827 - classification_loss: 0.1986 297/500 [================>.............] 
- ETA: 44s - loss: 1.2814 - regression_loss: 1.0827 - classification_loss: 0.1986 298/500 [================>.............] - ETA: 44s - loss: 1.2813 - regression_loss: 1.0828 - classification_loss: 0.1985 299/500 [================>.............] - ETA: 44s - loss: 1.2798 - regression_loss: 1.0815 - classification_loss: 0.1982 300/500 [=================>............] - ETA: 43s - loss: 1.2787 - regression_loss: 1.0809 - classification_loss: 0.1978 301/500 [=================>............] - ETA: 43s - loss: 1.2790 - regression_loss: 1.0811 - classification_loss: 0.1979 302/500 [=================>............] - ETA: 43s - loss: 1.2806 - regression_loss: 1.0825 - classification_loss: 0.1982 303/500 [=================>............] - ETA: 43s - loss: 1.2811 - regression_loss: 1.0828 - classification_loss: 0.1982 304/500 [=================>............] - ETA: 43s - loss: 1.2788 - regression_loss: 1.0809 - classification_loss: 0.1979 305/500 [=================>............] - ETA: 42s - loss: 1.2794 - regression_loss: 1.0815 - classification_loss: 0.1979 306/500 [=================>............] - ETA: 42s - loss: 1.2808 - regression_loss: 1.0827 - classification_loss: 0.1980 307/500 [=================>............] - ETA: 42s - loss: 1.2810 - regression_loss: 1.0830 - classification_loss: 0.1980 308/500 [=================>............] - ETA: 42s - loss: 1.2790 - regression_loss: 1.0814 - classification_loss: 0.1977 309/500 [=================>............] - ETA: 41s - loss: 1.2792 - regression_loss: 1.0815 - classification_loss: 0.1977 310/500 [=================>............] - ETA: 41s - loss: 1.2800 - regression_loss: 1.0825 - classification_loss: 0.1975 311/500 [=================>............] - ETA: 41s - loss: 1.2807 - regression_loss: 1.0830 - classification_loss: 0.1977 312/500 [=================>............] - ETA: 41s - loss: 1.2791 - regression_loss: 1.0817 - classification_loss: 0.1974 313/500 [=================>............] 
- ETA: 40s - loss: 1.2777 - regression_loss: 1.0804 - classification_loss: 0.1972
[per-batch progress for batches 314-499 of epoch 119 condensed; loss held steady around 1.27-1.28 (regression ~1.08, classification ~0.196)]
500/500 [==============================] - 108s 217ms/step - loss: 1.2747 - regression_loss: 1.0800 - classification_loss: 0.1947
1172 instances of class plum with average precision: 0.6969
mAP: 0.6969
Epoch 00119: saving model to ./training/snapshots/resnet50_pascal_119.h5
Epoch 120/150
[per-batch progress for batches 1-147 of epoch 120 condensed; loss settled around 1.23 (regression ~1.05, classification ~0.182) by batch 147]
148/500 [=======>......................]
- ETA: 1:20 - loss: 1.2386 - regression_loss: 1.0552 - classification_loss: 0.1833 149/500 [=======>......................] - ETA: 1:20 - loss: 1.2391 - regression_loss: 1.0557 - classification_loss: 0.1834 150/500 [========>.....................] - ETA: 1:20 - loss: 1.2334 - regression_loss: 1.0508 - classification_loss: 0.1826 151/500 [========>.....................] - ETA: 1:19 - loss: 1.2364 - regression_loss: 1.0531 - classification_loss: 0.1832 152/500 [========>.....................] - ETA: 1:19 - loss: 1.2365 - regression_loss: 1.0534 - classification_loss: 0.1831 153/500 [========>.....................] - ETA: 1:19 - loss: 1.2364 - regression_loss: 1.0534 - classification_loss: 0.1830 154/500 [========>.....................] - ETA: 1:19 - loss: 1.2364 - regression_loss: 1.0535 - classification_loss: 0.1830 155/500 [========>.....................] - ETA: 1:18 - loss: 1.2385 - regression_loss: 1.0552 - classification_loss: 0.1833 156/500 [========>.....................] - ETA: 1:18 - loss: 1.2358 - regression_loss: 1.0531 - classification_loss: 0.1827 157/500 [========>.....................] - ETA: 1:18 - loss: 1.2349 - regression_loss: 1.0524 - classification_loss: 0.1826 158/500 [========>.....................] - ETA: 1:18 - loss: 1.2379 - regression_loss: 1.0549 - classification_loss: 0.1830 159/500 [========>.....................] - ETA: 1:18 - loss: 1.2371 - regression_loss: 1.0542 - classification_loss: 0.1829 160/500 [========>.....................] - ETA: 1:17 - loss: 1.2326 - regression_loss: 1.0505 - classification_loss: 0.1821 161/500 [========>.....................] - ETA: 1:17 - loss: 1.2329 - regression_loss: 1.0506 - classification_loss: 0.1823 162/500 [========>.....................] - ETA: 1:17 - loss: 1.2372 - regression_loss: 1.0538 - classification_loss: 0.1834 163/500 [========>.....................] - ETA: 1:17 - loss: 1.2329 - regression_loss: 1.0504 - classification_loss: 0.1824 164/500 [========>.....................] 
- ETA: 1:16 - loss: 1.2338 - regression_loss: 1.0510 - classification_loss: 0.1828 165/500 [========>.....................] - ETA: 1:16 - loss: 1.2369 - regression_loss: 1.0536 - classification_loss: 0.1832 166/500 [========>.....................] - ETA: 1:16 - loss: 1.2347 - regression_loss: 1.0517 - classification_loss: 0.1830 167/500 [=========>....................] - ETA: 1:16 - loss: 1.2301 - regression_loss: 1.0477 - classification_loss: 0.1824 168/500 [=========>....................] - ETA: 1:15 - loss: 1.2268 - regression_loss: 1.0449 - classification_loss: 0.1820 169/500 [=========>....................] - ETA: 1:15 - loss: 1.2294 - regression_loss: 1.0471 - classification_loss: 0.1823 170/500 [=========>....................] - ETA: 1:15 - loss: 1.2298 - regression_loss: 1.0472 - classification_loss: 0.1826 171/500 [=========>....................] - ETA: 1:15 - loss: 1.2290 - regression_loss: 1.0465 - classification_loss: 0.1826 172/500 [=========>....................] - ETA: 1:14 - loss: 1.2251 - regression_loss: 1.0431 - classification_loss: 0.1820 173/500 [=========>....................] - ETA: 1:14 - loss: 1.2266 - regression_loss: 1.0441 - classification_loss: 0.1825 174/500 [=========>....................] - ETA: 1:14 - loss: 1.2231 - regression_loss: 1.0414 - classification_loss: 0.1817 175/500 [=========>....................] - ETA: 1:14 - loss: 1.2251 - regression_loss: 1.0432 - classification_loss: 0.1818 176/500 [=========>....................] - ETA: 1:14 - loss: 1.2237 - regression_loss: 1.0423 - classification_loss: 0.1814 177/500 [=========>....................] - ETA: 1:13 - loss: 1.2229 - regression_loss: 1.0417 - classification_loss: 0.1811 178/500 [=========>....................] - ETA: 1:13 - loss: 1.2273 - regression_loss: 1.0455 - classification_loss: 0.1818 179/500 [=========>....................] - ETA: 1:13 - loss: 1.2289 - regression_loss: 1.0469 - classification_loss: 0.1820 180/500 [=========>....................] 
- ETA: 1:13 - loss: 1.2276 - regression_loss: 1.0458 - classification_loss: 0.1818 181/500 [=========>....................] - ETA: 1:13 - loss: 1.2272 - regression_loss: 1.0453 - classification_loss: 0.1820 182/500 [=========>....................] - ETA: 1:12 - loss: 1.2277 - regression_loss: 1.0457 - classification_loss: 0.1820 183/500 [=========>....................] - ETA: 1:12 - loss: 1.2255 - regression_loss: 1.0434 - classification_loss: 0.1821 184/500 [==========>...................] - ETA: 1:12 - loss: 1.2217 - regression_loss: 1.0403 - classification_loss: 0.1814 185/500 [==========>...................] - ETA: 1:12 - loss: 1.2231 - regression_loss: 1.0414 - classification_loss: 0.1817 186/500 [==========>...................] - ETA: 1:12 - loss: 1.2181 - regression_loss: 1.0372 - classification_loss: 0.1809 187/500 [==========>...................] - ETA: 1:11 - loss: 1.2177 - regression_loss: 1.0371 - classification_loss: 0.1806 188/500 [==========>...................] - ETA: 1:11 - loss: 1.2198 - regression_loss: 1.0389 - classification_loss: 0.1809 189/500 [==========>...................] - ETA: 1:11 - loss: 1.2172 - regression_loss: 1.0367 - classification_loss: 0.1805 190/500 [==========>...................] - ETA: 1:11 - loss: 1.2135 - regression_loss: 1.0336 - classification_loss: 0.1799 191/500 [==========>...................] - ETA: 1:11 - loss: 1.2094 - regression_loss: 1.0302 - classification_loss: 0.1792 192/500 [==========>...................] - ETA: 1:10 - loss: 1.2095 - regression_loss: 1.0300 - classification_loss: 0.1794 193/500 [==========>...................] - ETA: 1:10 - loss: 1.2123 - regression_loss: 1.0323 - classification_loss: 0.1800 194/500 [==========>...................] - ETA: 1:10 - loss: 1.2097 - regression_loss: 1.0303 - classification_loss: 0.1795 195/500 [==========>...................] - ETA: 1:10 - loss: 1.2139 - regression_loss: 1.0321 - classification_loss: 0.1817 196/500 [==========>...................] 
- ETA: 1:10 - loss: 1.2155 - regression_loss: 1.0334 - classification_loss: 0.1821 197/500 [==========>...................] - ETA: 1:09 - loss: 1.2160 - regression_loss: 1.0340 - classification_loss: 0.1820 198/500 [==========>...................] - ETA: 1:09 - loss: 1.2132 - regression_loss: 1.0317 - classification_loss: 0.1815 199/500 [==========>...................] - ETA: 1:09 - loss: 1.2100 - regression_loss: 1.0290 - classification_loss: 0.1810 200/500 [===========>..................] - ETA: 1:09 - loss: 1.2119 - regression_loss: 1.0306 - classification_loss: 0.1813 201/500 [===========>..................] - ETA: 1:09 - loss: 1.2113 - regression_loss: 1.0302 - classification_loss: 0.1811 202/500 [===========>..................] - ETA: 1:09 - loss: 1.2139 - regression_loss: 1.0322 - classification_loss: 0.1816 203/500 [===========>..................] - ETA: 1:08 - loss: 1.2140 - regression_loss: 1.0321 - classification_loss: 0.1819 204/500 [===========>..................] - ETA: 1:08 - loss: 1.2128 - regression_loss: 1.0310 - classification_loss: 0.1818 205/500 [===========>..................] - ETA: 1:08 - loss: 1.2167 - regression_loss: 1.0344 - classification_loss: 0.1823 206/500 [===========>..................] - ETA: 1:08 - loss: 1.2180 - regression_loss: 1.0356 - classification_loss: 0.1824 207/500 [===========>..................] - ETA: 1:07 - loss: 1.2233 - regression_loss: 1.0399 - classification_loss: 0.1834 208/500 [===========>..................] - ETA: 1:07 - loss: 1.2260 - regression_loss: 1.0421 - classification_loss: 0.1839 209/500 [===========>..................] - ETA: 1:07 - loss: 1.2245 - regression_loss: 1.0411 - classification_loss: 0.1834 210/500 [===========>..................] - ETA: 1:07 - loss: 1.2249 - regression_loss: 1.0417 - classification_loss: 0.1832 211/500 [===========>..................] - ETA: 1:07 - loss: 1.2283 - regression_loss: 1.0445 - classification_loss: 0.1838 212/500 [===========>..................] 
- ETA: 1:06 - loss: 1.2258 - regression_loss: 1.0424 - classification_loss: 0.1834 213/500 [===========>..................] - ETA: 1:06 - loss: 1.2242 - regression_loss: 1.0410 - classification_loss: 0.1832 214/500 [===========>..................] - ETA: 1:06 - loss: 1.2241 - regression_loss: 1.0407 - classification_loss: 0.1833 215/500 [===========>..................] - ETA: 1:06 - loss: 1.2254 - regression_loss: 1.0417 - classification_loss: 0.1837 216/500 [===========>..................] - ETA: 1:06 - loss: 1.2273 - regression_loss: 1.0434 - classification_loss: 0.1839 217/500 [============>.................] - ETA: 1:05 - loss: 1.2276 - regression_loss: 1.0437 - classification_loss: 0.1840 218/500 [============>.................] - ETA: 1:05 - loss: 1.2277 - regression_loss: 1.0436 - classification_loss: 0.1841 219/500 [============>.................] - ETA: 1:05 - loss: 1.2243 - regression_loss: 1.0408 - classification_loss: 0.1835 220/500 [============>.................] - ETA: 1:05 - loss: 1.2251 - regression_loss: 1.0415 - classification_loss: 0.1836 221/500 [============>.................] - ETA: 1:05 - loss: 1.2264 - regression_loss: 1.0425 - classification_loss: 0.1839 222/500 [============>.................] - ETA: 1:04 - loss: 1.2281 - regression_loss: 1.0436 - classification_loss: 0.1844 223/500 [============>.................] - ETA: 1:04 - loss: 1.2303 - regression_loss: 1.0453 - classification_loss: 0.1850 224/500 [============>.................] - ETA: 1:04 - loss: 1.2304 - regression_loss: 1.0454 - classification_loss: 0.1850 225/500 [============>.................] - ETA: 1:04 - loss: 1.2305 - regression_loss: 1.0457 - classification_loss: 0.1848 226/500 [============>.................] - ETA: 1:03 - loss: 1.2288 - regression_loss: 1.0441 - classification_loss: 0.1846 227/500 [============>.................] - ETA: 1:03 - loss: 1.2285 - regression_loss: 1.0438 - classification_loss: 0.1847 228/500 [============>.................] 
- ETA: 1:03 - loss: 1.2288 - regression_loss: 1.0444 - classification_loss: 0.1844 229/500 [============>.................] - ETA: 1:03 - loss: 1.2298 - regression_loss: 1.0452 - classification_loss: 0.1846 230/500 [============>.................] - ETA: 1:03 - loss: 1.2316 - regression_loss: 1.0467 - classification_loss: 0.1849 231/500 [============>.................] - ETA: 1:02 - loss: 1.2308 - regression_loss: 1.0460 - classification_loss: 0.1847 232/500 [============>.................] - ETA: 1:02 - loss: 1.2294 - regression_loss: 1.0451 - classification_loss: 0.1844 233/500 [============>.................] - ETA: 1:02 - loss: 1.2271 - regression_loss: 1.0432 - classification_loss: 0.1839 234/500 [=============>................] - ETA: 1:02 - loss: 1.2249 - regression_loss: 1.0414 - classification_loss: 0.1835 235/500 [=============>................] - ETA: 1:01 - loss: 1.2248 - regression_loss: 1.0415 - classification_loss: 0.1833 236/500 [=============>................] - ETA: 1:01 - loss: 1.2262 - regression_loss: 1.0430 - classification_loss: 0.1832 237/500 [=============>................] - ETA: 1:01 - loss: 1.2232 - regression_loss: 1.0405 - classification_loss: 0.1827 238/500 [=============>................] - ETA: 1:01 - loss: 1.2226 - regression_loss: 1.0401 - classification_loss: 0.1826 239/500 [=============>................] - ETA: 1:01 - loss: 1.2213 - regression_loss: 1.0391 - classification_loss: 0.1822 240/500 [=============>................] - ETA: 1:00 - loss: 1.2216 - regression_loss: 1.0393 - classification_loss: 0.1822 241/500 [=============>................] - ETA: 1:00 - loss: 1.2230 - regression_loss: 1.0406 - classification_loss: 0.1824 242/500 [=============>................] - ETA: 1:00 - loss: 1.2250 - regression_loss: 1.0423 - classification_loss: 0.1828 243/500 [=============>................] - ETA: 1:00 - loss: 1.2274 - regression_loss: 1.0439 - classification_loss: 0.1835 244/500 [=============>................] 
- ETA: 59s - loss: 1.2256 - regression_loss: 1.0423 - classification_loss: 0.1832  245/500 [=============>................] - ETA: 59s - loss: 1.2251 - regression_loss: 1.0421 - classification_loss: 0.1831 246/500 [=============>................] - ETA: 59s - loss: 1.2277 - regression_loss: 1.0441 - classification_loss: 0.1836 247/500 [=============>................] - ETA: 59s - loss: 1.2286 - regression_loss: 1.0447 - classification_loss: 0.1839 248/500 [=============>................] - ETA: 59s - loss: 1.2306 - regression_loss: 1.0461 - classification_loss: 0.1845 249/500 [=============>................] - ETA: 58s - loss: 1.2294 - regression_loss: 1.0450 - classification_loss: 0.1843 250/500 [==============>...............] - ETA: 58s - loss: 1.2311 - regression_loss: 1.0464 - classification_loss: 0.1847 251/500 [==============>...............] - ETA: 58s - loss: 1.2307 - regression_loss: 1.0461 - classification_loss: 0.1846 252/500 [==============>...............] - ETA: 58s - loss: 1.2281 - regression_loss: 1.0441 - classification_loss: 0.1840 253/500 [==============>...............] - ETA: 57s - loss: 1.2279 - regression_loss: 1.0438 - classification_loss: 0.1841 254/500 [==============>...............] - ETA: 57s - loss: 1.2279 - regression_loss: 1.0439 - classification_loss: 0.1840 255/500 [==============>...............] - ETA: 57s - loss: 1.2299 - regression_loss: 1.0456 - classification_loss: 0.1842 256/500 [==============>...............] - ETA: 57s - loss: 1.2305 - regression_loss: 1.0460 - classification_loss: 0.1844 257/500 [==============>...............] - ETA: 57s - loss: 1.2307 - regression_loss: 1.0462 - classification_loss: 0.1845 258/500 [==============>...............] - ETA: 56s - loss: 1.2277 - regression_loss: 1.0438 - classification_loss: 0.1840 259/500 [==============>...............] - ETA: 56s - loss: 1.2287 - regression_loss: 1.0447 - classification_loss: 0.1840 260/500 [==============>...............] 
- ETA: 56s - loss: 1.2261 - regression_loss: 1.0425 - classification_loss: 0.1835 261/500 [==============>...............] - ETA: 56s - loss: 1.2270 - regression_loss: 1.0433 - classification_loss: 0.1837 262/500 [==============>...............] - ETA: 55s - loss: 1.2284 - regression_loss: 1.0448 - classification_loss: 0.1837 263/500 [==============>...............] - ETA: 55s - loss: 1.2310 - regression_loss: 1.0468 - classification_loss: 0.1842 264/500 [==============>...............] - ETA: 55s - loss: 1.2329 - regression_loss: 1.0484 - classification_loss: 0.1845 265/500 [==============>...............] - ETA: 55s - loss: 1.2334 - regression_loss: 1.0487 - classification_loss: 0.1848 266/500 [==============>...............] - ETA: 54s - loss: 1.2339 - regression_loss: 1.0491 - classification_loss: 0.1848 267/500 [===============>..............] - ETA: 54s - loss: 1.2341 - regression_loss: 1.0488 - classification_loss: 0.1853 268/500 [===============>..............] - ETA: 54s - loss: 1.2360 - regression_loss: 1.0502 - classification_loss: 0.1857 269/500 [===============>..............] - ETA: 54s - loss: 1.2376 - regression_loss: 1.0516 - classification_loss: 0.1859 270/500 [===============>..............] - ETA: 54s - loss: 1.2375 - regression_loss: 1.0514 - classification_loss: 0.1861 271/500 [===============>..............] - ETA: 53s - loss: 1.2378 - regression_loss: 1.0517 - classification_loss: 0.1861 272/500 [===============>..............] - ETA: 53s - loss: 1.2391 - regression_loss: 1.0527 - classification_loss: 0.1864 273/500 [===============>..............] - ETA: 53s - loss: 1.2398 - regression_loss: 1.0533 - classification_loss: 0.1865 274/500 [===============>..............] - ETA: 53s - loss: 1.2387 - regression_loss: 1.0525 - classification_loss: 0.1861 275/500 [===============>..............] - ETA: 52s - loss: 1.2404 - regression_loss: 1.0541 - classification_loss: 0.1864 276/500 [===============>..............] 
- ETA: 52s - loss: 1.2399 - regression_loss: 1.0535 - classification_loss: 0.1864 277/500 [===============>..............] - ETA: 52s - loss: 1.2380 - regression_loss: 1.0519 - classification_loss: 0.1860 278/500 [===============>..............] - ETA: 52s - loss: 1.2409 - regression_loss: 1.0544 - classification_loss: 0.1866 279/500 [===============>..............] - ETA: 51s - loss: 1.2407 - regression_loss: 1.0542 - classification_loss: 0.1865 280/500 [===============>..............] - ETA: 51s - loss: 1.2406 - regression_loss: 1.0543 - classification_loss: 0.1864 281/500 [===============>..............] - ETA: 51s - loss: 1.2420 - regression_loss: 1.0553 - classification_loss: 0.1868 282/500 [===============>..............] - ETA: 51s - loss: 1.2420 - regression_loss: 1.0554 - classification_loss: 0.1866 283/500 [===============>..............] - ETA: 50s - loss: 1.2418 - regression_loss: 1.0553 - classification_loss: 0.1865 284/500 [================>.............] - ETA: 50s - loss: 1.2404 - regression_loss: 1.0540 - classification_loss: 0.1864 285/500 [================>.............] - ETA: 50s - loss: 1.2403 - regression_loss: 1.0542 - classification_loss: 0.1862 286/500 [================>.............] - ETA: 50s - loss: 1.2373 - regression_loss: 1.0517 - classification_loss: 0.1856 287/500 [================>.............] - ETA: 49s - loss: 1.2382 - regression_loss: 1.0522 - classification_loss: 0.1860 288/500 [================>.............] - ETA: 49s - loss: 1.2401 - regression_loss: 1.0539 - classification_loss: 0.1862 289/500 [================>.............] - ETA: 49s - loss: 1.2401 - regression_loss: 1.0537 - classification_loss: 0.1864 290/500 [================>.............] - ETA: 49s - loss: 1.2399 - regression_loss: 1.0536 - classification_loss: 0.1862 291/500 [================>.............] - ETA: 49s - loss: 1.2401 - regression_loss: 1.0538 - classification_loss: 0.1862 292/500 [================>.............] 
- ETA: 48s - loss: 1.2409 - regression_loss: 1.0544 - classification_loss: 0.1865 293/500 [================>.............] - ETA: 48s - loss: 1.2410 - regression_loss: 1.0544 - classification_loss: 0.1866 294/500 [================>.............] - ETA: 48s - loss: 1.2412 - regression_loss: 1.0546 - classification_loss: 0.1866 295/500 [================>.............] - ETA: 48s - loss: 1.2397 - regression_loss: 1.0531 - classification_loss: 0.1866 296/500 [================>.............] - ETA: 47s - loss: 1.2406 - regression_loss: 1.0540 - classification_loss: 0.1866 297/500 [================>.............] - ETA: 47s - loss: 1.2420 - regression_loss: 1.0548 - classification_loss: 0.1872 298/500 [================>.............] - ETA: 47s - loss: 1.2447 - regression_loss: 1.0571 - classification_loss: 0.1875 299/500 [================>.............] - ETA: 47s - loss: 1.2449 - regression_loss: 1.0574 - classification_loss: 0.1875 300/500 [=================>............] - ETA: 47s - loss: 1.2470 - regression_loss: 1.0590 - classification_loss: 0.1879 301/500 [=================>............] - ETA: 46s - loss: 1.2492 - regression_loss: 1.0608 - classification_loss: 0.1883 302/500 [=================>............] - ETA: 46s - loss: 1.2492 - regression_loss: 1.0609 - classification_loss: 0.1883 303/500 [=================>............] - ETA: 46s - loss: 1.2499 - regression_loss: 1.0616 - classification_loss: 0.1883 304/500 [=================>............] - ETA: 46s - loss: 1.2527 - regression_loss: 1.0643 - classification_loss: 0.1885 305/500 [=================>............] - ETA: 45s - loss: 1.2557 - regression_loss: 1.0665 - classification_loss: 0.1892 306/500 [=================>............] - ETA: 45s - loss: 1.2568 - regression_loss: 1.0673 - classification_loss: 0.1895 307/500 [=================>............] - ETA: 45s - loss: 1.2550 - regression_loss: 1.0659 - classification_loss: 0.1891 308/500 [=================>............] 
- ETA: 45s - loss: 1.2546 - regression_loss: 1.0656 - classification_loss: 0.1890 309/500 [=================>............] - ETA: 44s - loss: 1.2536 - regression_loss: 1.0649 - classification_loss: 0.1888 310/500 [=================>............] - ETA: 44s - loss: 1.2520 - regression_loss: 1.0635 - classification_loss: 0.1885 311/500 [=================>............] - ETA: 44s - loss: 1.2534 - regression_loss: 1.0646 - classification_loss: 0.1888 312/500 [=================>............] - ETA: 44s - loss: 1.2529 - regression_loss: 1.0640 - classification_loss: 0.1889 313/500 [=================>............] - ETA: 44s - loss: 1.2521 - regression_loss: 1.0632 - classification_loss: 0.1888 314/500 [=================>............] - ETA: 43s - loss: 1.2496 - regression_loss: 1.0612 - classification_loss: 0.1884 315/500 [=================>............] - ETA: 43s - loss: 1.2485 - regression_loss: 1.0605 - classification_loss: 0.1880 316/500 [=================>............] - ETA: 43s - loss: 1.2490 - regression_loss: 1.0608 - classification_loss: 0.1882 317/500 [==================>...........] - ETA: 43s - loss: 1.2480 - regression_loss: 1.0602 - classification_loss: 0.1878 318/500 [==================>...........] - ETA: 42s - loss: 1.2502 - regression_loss: 1.0621 - classification_loss: 0.1881 319/500 [==================>...........] - ETA: 42s - loss: 1.2479 - regression_loss: 1.0603 - classification_loss: 0.1876 320/500 [==================>...........] - ETA: 42s - loss: 1.2476 - regression_loss: 1.0601 - classification_loss: 0.1875 321/500 [==================>...........] - ETA: 42s - loss: 1.2480 - regression_loss: 1.0605 - classification_loss: 0.1875 322/500 [==================>...........] - ETA: 41s - loss: 1.2490 - regression_loss: 1.0611 - classification_loss: 0.1879 323/500 [==================>...........] - ETA: 41s - loss: 1.2489 - regression_loss: 1.0611 - classification_loss: 0.1878 324/500 [==================>...........] 
- ETA: 41s - loss: 1.2494 - regression_loss: 1.0615 - classification_loss: 0.1878 325/500 [==================>...........] - ETA: 41s - loss: 1.2491 - regression_loss: 1.0610 - classification_loss: 0.1881 326/500 [==================>...........] - ETA: 40s - loss: 1.2486 - regression_loss: 1.0604 - classification_loss: 0.1882 327/500 [==================>...........] - ETA: 40s - loss: 1.2489 - regression_loss: 1.0606 - classification_loss: 0.1883 328/500 [==================>...........] - ETA: 40s - loss: 1.2505 - regression_loss: 1.0624 - classification_loss: 0.1882 329/500 [==================>...........] - ETA: 40s - loss: 1.2498 - regression_loss: 1.0618 - classification_loss: 0.1880 330/500 [==================>...........] - ETA: 40s - loss: 1.2507 - regression_loss: 1.0627 - classification_loss: 0.1880 331/500 [==================>...........] - ETA: 39s - loss: 1.2513 - regression_loss: 1.0630 - classification_loss: 0.1883 332/500 [==================>...........] - ETA: 39s - loss: 1.2512 - regression_loss: 1.0628 - classification_loss: 0.1884 333/500 [==================>...........] - ETA: 39s - loss: 1.2490 - regression_loss: 1.0610 - classification_loss: 0.1880 334/500 [===================>..........] - ETA: 39s - loss: 1.2498 - regression_loss: 1.0617 - classification_loss: 0.1881 335/500 [===================>..........] - ETA: 38s - loss: 1.2502 - regression_loss: 1.0621 - classification_loss: 0.1882 336/500 [===================>..........] - ETA: 38s - loss: 1.2504 - regression_loss: 1.0620 - classification_loss: 0.1884 337/500 [===================>..........] - ETA: 38s - loss: 1.2485 - regression_loss: 1.0605 - classification_loss: 0.1880 338/500 [===================>..........] - ETA: 38s - loss: 1.2495 - regression_loss: 1.0613 - classification_loss: 0.1882 339/500 [===================>..........] - ETA: 37s - loss: 1.2500 - regression_loss: 1.0617 - classification_loss: 0.1882 340/500 [===================>..........] 
- ETA: 37s - loss: 1.2507 - regression_loss: 1.0624 - classification_loss: 0.1884 341/500 [===================>..........] - ETA: 37s - loss: 1.2502 - regression_loss: 1.0618 - classification_loss: 0.1884 342/500 [===================>..........] - ETA: 37s - loss: 1.2489 - regression_loss: 1.0609 - classification_loss: 0.1881 343/500 [===================>..........] - ETA: 36s - loss: 1.2490 - regression_loss: 1.0609 - classification_loss: 0.1881 344/500 [===================>..........] - ETA: 36s - loss: 1.2500 - regression_loss: 1.0613 - classification_loss: 0.1886 345/500 [===================>..........] - ETA: 36s - loss: 1.2497 - regression_loss: 1.0611 - classification_loss: 0.1886 346/500 [===================>..........] - ETA: 36s - loss: 1.2499 - regression_loss: 1.0613 - classification_loss: 0.1886 347/500 [===================>..........] - ETA: 35s - loss: 1.2508 - regression_loss: 1.0621 - classification_loss: 0.1888 348/500 [===================>..........] - ETA: 35s - loss: 1.2502 - regression_loss: 1.0617 - classification_loss: 0.1885 349/500 [===================>..........] - ETA: 35s - loss: 1.2508 - regression_loss: 1.0622 - classification_loss: 0.1886 350/500 [====================>.........] - ETA: 35s - loss: 1.2496 - regression_loss: 1.0612 - classification_loss: 0.1885 351/500 [====================>.........] - ETA: 34s - loss: 1.2477 - regression_loss: 1.0596 - classification_loss: 0.1881 352/500 [====================>.........] - ETA: 34s - loss: 1.2465 - regression_loss: 1.0586 - classification_loss: 0.1879 353/500 [====================>.........] - ETA: 34s - loss: 1.2479 - regression_loss: 1.0597 - classification_loss: 0.1882 354/500 [====================>.........] - ETA: 34s - loss: 1.2476 - regression_loss: 1.0597 - classification_loss: 0.1880 355/500 [====================>.........] - ETA: 33s - loss: 1.2485 - regression_loss: 1.0603 - classification_loss: 0.1881 356/500 [====================>.........] 
- ETA: 33s - loss: 1.2483 - regression_loss: 1.0603 - classification_loss: 0.1880 357/500 [====================>.........]
... (per-step progress for steps 357-499 of epoch 120 elided) ...
500/500 [==============================] - 116s 231ms/step - loss: 1.2598 - regression_loss: 1.0694 - classification_loss: 0.1904
1172 instances of class plum with average precision: 0.6988
mAP: 0.6988
Epoch 00120: saving model to ./training/snapshots/resnet50_pascal_120.h5
Epoch 121/150
1/500 [..............................] - ETA: 2:01 - loss: 1.3464 - regression_loss: 1.1482 - classification_loss: 0.1982
... (per-step progress for steps 2-13 of epoch 121 elided) ...
14/500 [..............................]
- ETA: 2:02 - loss: 1.0989 - regression_loss: 0.9235 - classification_loss: 0.1754 15/500 [..............................]
... (per-step progress for steps 15-189 of epoch 121 elided) ...
190/500 [==========>...................]
- ETA: 1:17 - loss: 1.2599 - regression_loss: 1.0678 - classification_loss: 0.1921 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2571 - regression_loss: 1.0655 - classification_loss: 0.1915 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2582 - regression_loss: 1.0663 - classification_loss: 0.1919 193/500 [==========>...................] - ETA: 1:16 - loss: 1.2588 - regression_loss: 1.0668 - classification_loss: 0.1920 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2619 - regression_loss: 1.0694 - classification_loss: 0.1925 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2593 - regression_loss: 1.0672 - classification_loss: 0.1921 196/500 [==========>...................] - ETA: 1:16 - loss: 1.2607 - regression_loss: 1.0681 - classification_loss: 0.1926 197/500 [==========>...................] - ETA: 1:15 - loss: 1.2576 - regression_loss: 1.0658 - classification_loss: 0.1919 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2593 - regression_loss: 1.0677 - classification_loss: 0.1916 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2603 - regression_loss: 1.0688 - classification_loss: 0.1916 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2606 - regression_loss: 1.0690 - classification_loss: 0.1916 201/500 [===========>..................] - ETA: 1:14 - loss: 1.2599 - regression_loss: 1.0686 - classification_loss: 0.1913 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2609 - regression_loss: 1.0695 - classification_loss: 0.1914 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2620 - regression_loss: 1.0706 - classification_loss: 0.1914 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2601 - regression_loss: 1.0692 - classification_loss: 0.1909 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2632 - regression_loss: 1.0715 - classification_loss: 0.1917 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.2598 - regression_loss: 1.0686 - classification_loss: 0.1913 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2601 - regression_loss: 1.0685 - classification_loss: 0.1915 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2613 - regression_loss: 1.0696 - classification_loss: 0.1917 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2590 - regression_loss: 1.0676 - classification_loss: 0.1914 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2608 - regression_loss: 1.0691 - classification_loss: 0.1917 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2629 - regression_loss: 1.0710 - classification_loss: 0.1919 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2611 - regression_loss: 1.0695 - classification_loss: 0.1917 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2621 - regression_loss: 1.0704 - classification_loss: 0.1916 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2576 - regression_loss: 1.0666 - classification_loss: 0.1910 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2567 - regression_loss: 1.0657 - classification_loss: 0.1910 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2591 - regression_loss: 1.0675 - classification_loss: 0.1916 217/500 [============>.................] - ETA: 1:10 - loss: 1.2590 - regression_loss: 1.0672 - classification_loss: 0.1918 218/500 [============>.................] - ETA: 1:10 - loss: 1.2576 - regression_loss: 1.0657 - classification_loss: 0.1919 219/500 [============>.................] - ETA: 1:10 - loss: 1.2546 - regression_loss: 1.0634 - classification_loss: 0.1912 220/500 [============>.................] - ETA: 1:10 - loss: 1.2581 - regression_loss: 1.0663 - classification_loss: 0.1919 221/500 [============>.................] - ETA: 1:09 - loss: 1.2596 - regression_loss: 1.0676 - classification_loss: 0.1921 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.2564 - regression_loss: 1.0649 - classification_loss: 0.1915 223/500 [============>.................] - ETA: 1:09 - loss: 1.2569 - regression_loss: 1.0655 - classification_loss: 0.1914 224/500 [============>.................] - ETA: 1:09 - loss: 1.2556 - regression_loss: 1.0645 - classification_loss: 0.1911 225/500 [============>.................] - ETA: 1:08 - loss: 1.2551 - regression_loss: 1.0641 - classification_loss: 0.1911 226/500 [============>.................] - ETA: 1:08 - loss: 1.2558 - regression_loss: 1.0647 - classification_loss: 0.1911 227/500 [============>.................] - ETA: 1:08 - loss: 1.2528 - regression_loss: 1.0623 - classification_loss: 0.1905 228/500 [============>.................] - ETA: 1:08 - loss: 1.2512 - regression_loss: 1.0608 - classification_loss: 0.1904 229/500 [============>.................] - ETA: 1:07 - loss: 1.2516 - regression_loss: 1.0612 - classification_loss: 0.1905 230/500 [============>.................] - ETA: 1:07 - loss: 1.2533 - regression_loss: 1.0626 - classification_loss: 0.1907 231/500 [============>.................] - ETA: 1:07 - loss: 1.2535 - regression_loss: 1.0627 - classification_loss: 0.1908 232/500 [============>.................] - ETA: 1:07 - loss: 1.2555 - regression_loss: 1.0642 - classification_loss: 0.1913 233/500 [============>.................] - ETA: 1:06 - loss: 1.2537 - regression_loss: 1.0629 - classification_loss: 0.1907 234/500 [=============>................] - ETA: 1:06 - loss: 1.2515 - regression_loss: 1.0608 - classification_loss: 0.1908 235/500 [=============>................] - ETA: 1:06 - loss: 1.2509 - regression_loss: 1.0603 - classification_loss: 0.1906 236/500 [=============>................] - ETA: 1:06 - loss: 1.2516 - regression_loss: 1.0609 - classification_loss: 0.1906 237/500 [=============>................] - ETA: 1:05 - loss: 1.2524 - regression_loss: 1.0616 - classification_loss: 0.1908 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.2525 - regression_loss: 1.0613 - classification_loss: 0.1912 239/500 [=============>................] - ETA: 1:05 - loss: 1.2492 - regression_loss: 1.0586 - classification_loss: 0.1906 240/500 [=============>................] - ETA: 1:05 - loss: 1.2469 - regression_loss: 1.0566 - classification_loss: 0.1903 241/500 [=============>................] - ETA: 1:04 - loss: 1.2491 - regression_loss: 1.0589 - classification_loss: 0.1902 242/500 [=============>................] - ETA: 1:04 - loss: 1.2467 - regression_loss: 1.0569 - classification_loss: 0.1897 243/500 [=============>................] - ETA: 1:04 - loss: 1.2468 - regression_loss: 1.0568 - classification_loss: 0.1899 244/500 [=============>................] - ETA: 1:04 - loss: 1.2446 - regression_loss: 1.0550 - classification_loss: 0.1895 245/500 [=============>................] - ETA: 1:03 - loss: 1.2463 - regression_loss: 1.0565 - classification_loss: 0.1898 246/500 [=============>................] - ETA: 1:03 - loss: 1.2466 - regression_loss: 1.0569 - classification_loss: 0.1897 247/500 [=============>................] - ETA: 1:03 - loss: 1.2473 - regression_loss: 1.0577 - classification_loss: 0.1896 248/500 [=============>................] - ETA: 1:03 - loss: 1.2440 - regression_loss: 1.0550 - classification_loss: 0.1889 249/500 [=============>................] - ETA: 1:02 - loss: 1.2463 - regression_loss: 1.0573 - classification_loss: 0.1890 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2464 - regression_loss: 1.0574 - classification_loss: 0.1890 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2489 - regression_loss: 1.0597 - classification_loss: 0.1892 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2501 - regression_loss: 1.0607 - classification_loss: 0.1894 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2502 - regression_loss: 1.0607 - classification_loss: 0.1895 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.2521 - regression_loss: 1.0623 - classification_loss: 0.1898 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2533 - regression_loss: 1.0633 - classification_loss: 0.1900 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2535 - regression_loss: 1.0632 - classification_loss: 0.1903 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2509 - regression_loss: 1.0611 - classification_loss: 0.1898 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2524 - regression_loss: 1.0624 - classification_loss: 0.1900 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2526 - regression_loss: 1.0627 - classification_loss: 0.1899 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2535 - regression_loss: 1.0635 - classification_loss: 0.1899 261/500 [==============>...............] - ETA: 59s - loss: 1.2548 - regression_loss: 1.0647 - classification_loss: 0.1901  262/500 [==============>...............] - ETA: 59s - loss: 1.2517 - regression_loss: 1.0621 - classification_loss: 0.1896 263/500 [==============>...............] - ETA: 59s - loss: 1.2515 - regression_loss: 1.0619 - classification_loss: 0.1897 264/500 [==============>...............] - ETA: 59s - loss: 1.2496 - regression_loss: 1.0603 - classification_loss: 0.1893 265/500 [==============>...............] - ETA: 58s - loss: 1.2465 - regression_loss: 1.0577 - classification_loss: 0.1888 266/500 [==============>...............] - ETA: 58s - loss: 1.2462 - regression_loss: 1.0575 - classification_loss: 0.1888 267/500 [===============>..............] - ETA: 58s - loss: 1.2475 - regression_loss: 1.0586 - classification_loss: 0.1889 268/500 [===============>..............] - ETA: 58s - loss: 1.2488 - regression_loss: 1.0598 - classification_loss: 0.1890 269/500 [===============>..............] - ETA: 57s - loss: 1.2495 - regression_loss: 1.0604 - classification_loss: 0.1892 270/500 [===============>..............] 
- ETA: 57s - loss: 1.2514 - regression_loss: 1.0619 - classification_loss: 0.1895 271/500 [===============>..............] - ETA: 57s - loss: 1.2513 - regression_loss: 1.0618 - classification_loss: 0.1894 272/500 [===============>..............] - ETA: 57s - loss: 1.2527 - regression_loss: 1.0630 - classification_loss: 0.1896 273/500 [===============>..............] - ETA: 56s - loss: 1.2507 - regression_loss: 1.0616 - classification_loss: 0.1891 274/500 [===============>..............] - ETA: 56s - loss: 1.2537 - regression_loss: 1.0639 - classification_loss: 0.1899 275/500 [===============>..............] - ETA: 56s - loss: 1.2524 - regression_loss: 1.0626 - classification_loss: 0.1898 276/500 [===============>..............] - ETA: 56s - loss: 1.2520 - regression_loss: 1.0621 - classification_loss: 0.1899 277/500 [===============>..............] - ETA: 55s - loss: 1.2512 - regression_loss: 1.0615 - classification_loss: 0.1897 278/500 [===============>..............] - ETA: 55s - loss: 1.2582 - regression_loss: 1.0675 - classification_loss: 0.1907 279/500 [===============>..............] - ETA: 55s - loss: 1.2571 - regression_loss: 1.0667 - classification_loss: 0.1904 280/500 [===============>..............] - ETA: 55s - loss: 1.2578 - regression_loss: 1.0674 - classification_loss: 0.1905 281/500 [===============>..............] - ETA: 54s - loss: 1.2561 - regression_loss: 1.0658 - classification_loss: 0.1902 282/500 [===============>..............] - ETA: 54s - loss: 1.2543 - regression_loss: 1.0643 - classification_loss: 0.1900 283/500 [===============>..............] - ETA: 54s - loss: 1.2556 - regression_loss: 1.0655 - classification_loss: 0.1901 284/500 [================>.............] - ETA: 54s - loss: 1.2530 - regression_loss: 1.0633 - classification_loss: 0.1897 285/500 [================>.............] - ETA: 53s - loss: 1.2550 - regression_loss: 1.0647 - classification_loss: 0.1903 286/500 [================>.............] 
- ETA: 53s - loss: 1.2558 - regression_loss: 1.0654 - classification_loss: 0.1904 287/500 [================>.............] - ETA: 53s - loss: 1.2544 - regression_loss: 1.0643 - classification_loss: 0.1901 288/500 [================>.............] - ETA: 53s - loss: 1.2546 - regression_loss: 1.0646 - classification_loss: 0.1900 289/500 [================>.............] - ETA: 52s - loss: 1.2538 - regression_loss: 1.0641 - classification_loss: 0.1898 290/500 [================>.............] - ETA: 52s - loss: 1.2553 - regression_loss: 1.0655 - classification_loss: 0.1898 291/500 [================>.............] - ETA: 52s - loss: 1.2537 - regression_loss: 1.0642 - classification_loss: 0.1895 292/500 [================>.............] - ETA: 52s - loss: 1.2556 - regression_loss: 1.0658 - classification_loss: 0.1898 293/500 [================>.............] - ETA: 51s - loss: 1.2554 - regression_loss: 1.0656 - classification_loss: 0.1897 294/500 [================>.............] - ETA: 51s - loss: 1.2549 - regression_loss: 1.0654 - classification_loss: 0.1895 295/500 [================>.............] - ETA: 51s - loss: 1.2553 - regression_loss: 1.0654 - classification_loss: 0.1898 296/500 [================>.............] - ETA: 51s - loss: 1.2579 - regression_loss: 1.0673 - classification_loss: 0.1906 297/500 [================>.............] - ETA: 50s - loss: 1.2656 - regression_loss: 1.0705 - classification_loss: 0.1951 298/500 [================>.............] - ETA: 50s - loss: 1.2665 - regression_loss: 1.0713 - classification_loss: 0.1952 299/500 [================>.............] - ETA: 50s - loss: 1.2658 - regression_loss: 1.0707 - classification_loss: 0.1951 300/500 [=================>............] - ETA: 50s - loss: 1.2653 - regression_loss: 1.0704 - classification_loss: 0.1949 301/500 [=================>............] - ETA: 49s - loss: 1.2644 - regression_loss: 1.0697 - classification_loss: 0.1947 302/500 [=================>............] 
- ETA: 49s - loss: 1.2658 - regression_loss: 1.0708 - classification_loss: 0.1950 303/500 [=================>............] - ETA: 49s - loss: 1.2675 - regression_loss: 1.0722 - classification_loss: 0.1953 304/500 [=================>............] - ETA: 49s - loss: 1.2666 - regression_loss: 1.0716 - classification_loss: 0.1951 305/500 [=================>............] - ETA: 48s - loss: 1.2674 - regression_loss: 1.0722 - classification_loss: 0.1952 306/500 [=================>............] - ETA: 48s - loss: 1.2664 - regression_loss: 1.0715 - classification_loss: 0.1949 307/500 [=================>............] - ETA: 48s - loss: 1.2667 - regression_loss: 1.0719 - classification_loss: 0.1948 308/500 [=================>............] - ETA: 48s - loss: 1.2668 - regression_loss: 1.0719 - classification_loss: 0.1949 309/500 [=================>............] - ETA: 47s - loss: 1.2658 - regression_loss: 1.0709 - classification_loss: 0.1949 310/500 [=================>............] - ETA: 47s - loss: 1.2665 - regression_loss: 1.0715 - classification_loss: 0.1950 311/500 [=================>............] - ETA: 47s - loss: 1.2658 - regression_loss: 1.0708 - classification_loss: 0.1949 312/500 [=================>............] - ETA: 47s - loss: 1.2641 - regression_loss: 1.0694 - classification_loss: 0.1947 313/500 [=================>............] - ETA: 46s - loss: 1.2649 - regression_loss: 1.0700 - classification_loss: 0.1949 314/500 [=================>............] - ETA: 46s - loss: 1.2667 - regression_loss: 1.0716 - classification_loss: 0.1950 315/500 [=================>............] - ETA: 46s - loss: 1.2664 - regression_loss: 1.0717 - classification_loss: 0.1947 316/500 [=================>............] - ETA: 46s - loss: 1.2672 - regression_loss: 1.0725 - classification_loss: 0.1947 317/500 [==================>...........] - ETA: 45s - loss: 1.2664 - regression_loss: 1.0718 - classification_loss: 0.1945 318/500 [==================>...........] 
- ETA: 45s - loss: 1.2641 - regression_loss: 1.0700 - classification_loss: 0.1941 319/500 [==================>...........] - ETA: 45s - loss: 1.2624 - regression_loss: 1.0687 - classification_loss: 0.1937 320/500 [==================>...........] - ETA: 45s - loss: 1.2636 - regression_loss: 1.0697 - classification_loss: 0.1939 321/500 [==================>...........] - ETA: 44s - loss: 1.2631 - regression_loss: 1.0693 - classification_loss: 0.1938 322/500 [==================>...........] - ETA: 44s - loss: 1.2639 - regression_loss: 1.0699 - classification_loss: 0.1940 323/500 [==================>...........] - ETA: 44s - loss: 1.2643 - regression_loss: 1.0703 - classification_loss: 0.1941 324/500 [==================>...........] - ETA: 44s - loss: 1.2647 - regression_loss: 1.0706 - classification_loss: 0.1941 325/500 [==================>...........] - ETA: 43s - loss: 1.2639 - regression_loss: 1.0699 - classification_loss: 0.1940 326/500 [==================>...........] - ETA: 43s - loss: 1.2627 - regression_loss: 1.0691 - classification_loss: 0.1936 327/500 [==================>...........] - ETA: 43s - loss: 1.2620 - regression_loss: 1.0684 - classification_loss: 0.1935 328/500 [==================>...........] - ETA: 43s - loss: 1.2596 - regression_loss: 1.0665 - classification_loss: 0.1931 329/500 [==================>...........] - ETA: 42s - loss: 1.2609 - regression_loss: 1.0676 - classification_loss: 0.1933 330/500 [==================>...........] - ETA: 42s - loss: 1.2610 - regression_loss: 1.0675 - classification_loss: 0.1934 331/500 [==================>...........] - ETA: 42s - loss: 1.2594 - regression_loss: 1.0662 - classification_loss: 0.1932 332/500 [==================>...........] - ETA: 42s - loss: 1.2606 - regression_loss: 1.0673 - classification_loss: 0.1933 333/500 [==================>...........] - ETA: 41s - loss: 1.2629 - regression_loss: 1.0691 - classification_loss: 0.1938 334/500 [===================>..........] 
- ETA: 41s - loss: 1.2630 - regression_loss: 1.0691 - classification_loss: 0.1940 335/500 [===================>..........] - ETA: 41s - loss: 1.2644 - regression_loss: 1.0703 - classification_loss: 0.1941 336/500 [===================>..........] - ETA: 41s - loss: 1.2652 - regression_loss: 1.0709 - classification_loss: 0.1943 337/500 [===================>..........] - ETA: 40s - loss: 1.2659 - regression_loss: 1.0716 - classification_loss: 0.1944 338/500 [===================>..........] - ETA: 40s - loss: 1.2671 - regression_loss: 1.0726 - classification_loss: 0.1945 339/500 [===================>..........] - ETA: 40s - loss: 1.2674 - regression_loss: 1.0728 - classification_loss: 0.1946 340/500 [===================>..........] - ETA: 40s - loss: 1.2681 - regression_loss: 1.0734 - classification_loss: 0.1947 341/500 [===================>..........] - ETA: 39s - loss: 1.2691 - regression_loss: 1.0742 - classification_loss: 0.1949 342/500 [===================>..........] - ETA: 39s - loss: 1.2703 - regression_loss: 1.0751 - classification_loss: 0.1952 343/500 [===================>..........] - ETA: 39s - loss: 1.2682 - regression_loss: 1.0734 - classification_loss: 0.1948 344/500 [===================>..........] - ETA: 39s - loss: 1.2694 - regression_loss: 1.0741 - classification_loss: 0.1953 345/500 [===================>..........] - ETA: 38s - loss: 1.2696 - regression_loss: 1.0743 - classification_loss: 0.1953 346/500 [===================>..........] - ETA: 38s - loss: 1.2697 - regression_loss: 1.0745 - classification_loss: 0.1952 347/500 [===================>..........] - ETA: 38s - loss: 1.2678 - regression_loss: 1.0729 - classification_loss: 0.1949 348/500 [===================>..........] - ETA: 38s - loss: 1.2688 - regression_loss: 1.0737 - classification_loss: 0.1951 349/500 [===================>..........] - ETA: 37s - loss: 1.2683 - regression_loss: 1.0733 - classification_loss: 0.1949 350/500 [====================>.........] 
- ETA: 37s - loss: 1.2693 - regression_loss: 1.0742 - classification_loss: 0.1951 351/500 [====================>.........] - ETA: 37s - loss: 1.2689 - regression_loss: 1.0738 - classification_loss: 0.1951 352/500 [====================>.........] - ETA: 37s - loss: 1.2692 - regression_loss: 1.0741 - classification_loss: 0.1950 353/500 [====================>.........] - ETA: 36s - loss: 1.2698 - regression_loss: 1.0747 - classification_loss: 0.1951 354/500 [====================>.........] - ETA: 36s - loss: 1.2694 - regression_loss: 1.0746 - classification_loss: 0.1948 355/500 [====================>.........] - ETA: 36s - loss: 1.2685 - regression_loss: 1.0739 - classification_loss: 0.1946 356/500 [====================>.........] - ETA: 36s - loss: 1.2702 - regression_loss: 1.0754 - classification_loss: 0.1948 357/500 [====================>.........] - ETA: 35s - loss: 1.2698 - regression_loss: 1.0750 - classification_loss: 0.1948 358/500 [====================>.........] - ETA: 35s - loss: 1.2706 - regression_loss: 1.0757 - classification_loss: 0.1949 359/500 [====================>.........] - ETA: 35s - loss: 1.2716 - regression_loss: 1.0764 - classification_loss: 0.1952 360/500 [====================>.........] - ETA: 35s - loss: 1.2708 - regression_loss: 1.0758 - classification_loss: 0.1950 361/500 [====================>.........] - ETA: 34s - loss: 1.2714 - regression_loss: 1.0764 - classification_loss: 0.1950 362/500 [====================>.........] - ETA: 34s - loss: 1.2714 - regression_loss: 1.0763 - classification_loss: 0.1950 363/500 [====================>.........] - ETA: 34s - loss: 1.2705 - regression_loss: 1.0755 - classification_loss: 0.1949 364/500 [====================>.........] - ETA: 34s - loss: 1.2700 - regression_loss: 1.0751 - classification_loss: 0.1949 365/500 [====================>.........] - ETA: 33s - loss: 1.2688 - regression_loss: 1.0740 - classification_loss: 0.1948 366/500 [====================>.........] 
- ETA: 33s - loss: 1.2685 - regression_loss: 1.0739 - classification_loss: 0.1946 367/500 [=====================>........] - ETA: 33s - loss: 1.2685 - regression_loss: 1.0738 - classification_loss: 0.1947 368/500 [=====================>........] - ETA: 33s - loss: 1.2677 - regression_loss: 1.0731 - classification_loss: 0.1946 369/500 [=====================>........] - ETA: 32s - loss: 1.2678 - regression_loss: 1.0733 - classification_loss: 0.1945 370/500 [=====================>........] - ETA: 32s - loss: 1.2682 - regression_loss: 1.0737 - classification_loss: 0.1946 371/500 [=====================>........] - ETA: 32s - loss: 1.2691 - regression_loss: 1.0743 - classification_loss: 0.1947 372/500 [=====================>........] - ETA: 32s - loss: 1.2693 - regression_loss: 1.0746 - classification_loss: 0.1947 373/500 [=====================>........] - ETA: 31s - loss: 1.2689 - regression_loss: 1.0745 - classification_loss: 0.1945 374/500 [=====================>........] - ETA: 31s - loss: 1.2686 - regression_loss: 1.0738 - classification_loss: 0.1948 375/500 [=====================>........] - ETA: 31s - loss: 1.2685 - regression_loss: 1.0737 - classification_loss: 0.1948 376/500 [=====================>........] - ETA: 31s - loss: 1.2706 - regression_loss: 1.0756 - classification_loss: 0.1950 377/500 [=====================>........] - ETA: 30s - loss: 1.2712 - regression_loss: 1.0760 - classification_loss: 0.1952 378/500 [=====================>........] - ETA: 30s - loss: 1.2696 - regression_loss: 1.0748 - classification_loss: 0.1948 379/500 [=====================>........] - ETA: 30s - loss: 1.2708 - regression_loss: 1.0758 - classification_loss: 0.1949 380/500 [=====================>........] - ETA: 30s - loss: 1.2706 - regression_loss: 1.0757 - classification_loss: 0.1949 381/500 [=====================>........] - ETA: 29s - loss: 1.2697 - regression_loss: 1.0750 - classification_loss: 0.1946 382/500 [=====================>........] 
- ETA: 29s - loss: 1.2677 - regression_loss: 1.0732 - classification_loss: 0.1945 383/500 [=====================>........] - ETA: 29s - loss: 1.2674 - regression_loss: 1.0730 - classification_loss: 0.1944 384/500 [======================>.......] - ETA: 29s - loss: 1.2657 - regression_loss: 1.0716 - classification_loss: 0.1941 385/500 [======================>.......] - ETA: 28s - loss: 1.2654 - regression_loss: 1.0714 - classification_loss: 0.1940 386/500 [======================>.......] - ETA: 28s - loss: 1.2664 - regression_loss: 1.0723 - classification_loss: 0.1941 387/500 [======================>.......] - ETA: 28s - loss: 1.2663 - regression_loss: 1.0724 - classification_loss: 0.1939 388/500 [======================>.......] - ETA: 28s - loss: 1.2674 - regression_loss: 1.0733 - classification_loss: 0.1942 389/500 [======================>.......] - ETA: 27s - loss: 1.2679 - regression_loss: 1.0738 - classification_loss: 0.1942 390/500 [======================>.......] - ETA: 27s - loss: 1.2668 - regression_loss: 1.0729 - classification_loss: 0.1939 391/500 [======================>.......] - ETA: 27s - loss: 1.2665 - regression_loss: 1.0726 - classification_loss: 0.1938 392/500 [======================>.......] - ETA: 27s - loss: 1.2651 - regression_loss: 1.0716 - classification_loss: 0.1935 393/500 [======================>.......] - ETA: 26s - loss: 1.2650 - regression_loss: 1.0716 - classification_loss: 0.1934 394/500 [======================>.......] - ETA: 26s - loss: 1.2637 - regression_loss: 1.0704 - classification_loss: 0.1933 395/500 [======================>.......] - ETA: 26s - loss: 1.2638 - regression_loss: 1.0705 - classification_loss: 0.1933 396/500 [======================>.......] - ETA: 26s - loss: 1.2642 - regression_loss: 1.0709 - classification_loss: 0.1932 397/500 [======================>.......] - ETA: 25s - loss: 1.2621 - regression_loss: 1.0692 - classification_loss: 0.1928 398/500 [======================>.......] 
[per-batch progress updates for epoch 121, batches 398–499, elided; running loss stayed near 1.26–1.27 throughout]
500/500 [==============================] - 125s 251ms/step - loss: 1.2672 - regression_loss: 1.0745 - classification_loss: 0.1926
1172 instances of class plum with average precision: 0.7060
mAP: 0.7060
Epoch 00121: saving model to ./training/snapshots/resnet50_pascal_121.h5
Epoch 122/150
[per-batch progress updates for epoch 122, batches 1–233, elided; running loss was approximately 1.28 at batch 233, where this excerpt ends]
- ETA: 1:06 - loss: 1.2814 - regression_loss: 1.0837 - classification_loss: 0.1977 234/500 [=============>................] - ETA: 1:06 - loss: 1.2830 - regression_loss: 1.0850 - classification_loss: 0.1980 235/500 [=============>................] - ETA: 1:06 - loss: 1.2838 - regression_loss: 1.0857 - classification_loss: 0.1981 236/500 [=============>................] - ETA: 1:06 - loss: 1.2838 - regression_loss: 1.0855 - classification_loss: 0.1983 237/500 [=============>................] - ETA: 1:05 - loss: 1.2875 - regression_loss: 1.0883 - classification_loss: 0.1991 238/500 [=============>................] - ETA: 1:05 - loss: 1.2869 - regression_loss: 1.0880 - classification_loss: 0.1990 239/500 [=============>................] - ETA: 1:05 - loss: 1.2852 - regression_loss: 1.0865 - classification_loss: 0.1987 240/500 [=============>................] - ETA: 1:05 - loss: 1.2852 - regression_loss: 1.0868 - classification_loss: 0.1984 241/500 [=============>................] - ETA: 1:04 - loss: 1.2845 - regression_loss: 1.0864 - classification_loss: 0.1981 242/500 [=============>................] - ETA: 1:04 - loss: 1.2842 - regression_loss: 1.0859 - classification_loss: 0.1982 243/500 [=============>................] - ETA: 1:04 - loss: 1.2820 - regression_loss: 1.0840 - classification_loss: 0.1980 244/500 [=============>................] - ETA: 1:04 - loss: 1.2791 - regression_loss: 1.0816 - classification_loss: 0.1975 245/500 [=============>................] - ETA: 1:03 - loss: 1.2779 - regression_loss: 1.0805 - classification_loss: 0.1973 246/500 [=============>................] - ETA: 1:03 - loss: 1.2784 - regression_loss: 1.0808 - classification_loss: 0.1976 247/500 [=============>................] - ETA: 1:03 - loss: 1.2766 - regression_loss: 1.0792 - classification_loss: 0.1974 248/500 [=============>................] - ETA: 1:03 - loss: 1.2767 - regression_loss: 1.0792 - classification_loss: 0.1976 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.2768 - regression_loss: 1.0791 - classification_loss: 0.1977 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2788 - regression_loss: 1.0806 - classification_loss: 0.1982 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2788 - regression_loss: 1.0807 - classification_loss: 0.1981 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2799 - regression_loss: 1.0817 - classification_loss: 0.1982 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2815 - regression_loss: 1.0830 - classification_loss: 0.1985 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2820 - regression_loss: 1.0832 - classification_loss: 0.1987 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2805 - regression_loss: 1.0822 - classification_loss: 0.1983 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2825 - regression_loss: 1.0838 - classification_loss: 0.1987 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2802 - regression_loss: 1.0821 - classification_loss: 0.1981 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2782 - regression_loss: 1.0807 - classification_loss: 0.1975 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2788 - regression_loss: 1.0811 - classification_loss: 0.1977 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2796 - regression_loss: 1.0818 - classification_loss: 0.1978 261/500 [==============>...............] - ETA: 59s - loss: 1.2767 - regression_loss: 1.0794 - classification_loss: 0.1973  262/500 [==============>...............] - ETA: 59s - loss: 1.2753 - regression_loss: 1.0783 - classification_loss: 0.1971 263/500 [==============>...............] - ETA: 59s - loss: 1.2767 - regression_loss: 1.0793 - classification_loss: 0.1974 264/500 [==============>...............] - ETA: 59s - loss: 1.2766 - regression_loss: 1.0793 - classification_loss: 0.1973 265/500 [==============>...............] 
- ETA: 58s - loss: 1.2744 - regression_loss: 1.0777 - classification_loss: 0.1966 266/500 [==============>...............] - ETA: 58s - loss: 1.2754 - regression_loss: 1.0786 - classification_loss: 0.1968 267/500 [===============>..............] - ETA: 58s - loss: 1.2731 - regression_loss: 1.0767 - classification_loss: 0.1964 268/500 [===============>..............] - ETA: 58s - loss: 1.2752 - regression_loss: 1.0783 - classification_loss: 0.1969 269/500 [===============>..............] - ETA: 57s - loss: 1.2730 - regression_loss: 1.0765 - classification_loss: 0.1964 270/500 [===============>..............] - ETA: 57s - loss: 1.2721 - regression_loss: 1.0759 - classification_loss: 0.1962 271/500 [===============>..............] - ETA: 57s - loss: 1.2720 - regression_loss: 1.0756 - classification_loss: 0.1964 272/500 [===============>..............] - ETA: 57s - loss: 1.2717 - regression_loss: 1.0754 - classification_loss: 0.1963 273/500 [===============>..............] - ETA: 56s - loss: 1.2727 - regression_loss: 1.0763 - classification_loss: 0.1964 274/500 [===============>..............] - ETA: 56s - loss: 1.2719 - regression_loss: 1.0757 - classification_loss: 0.1962 275/500 [===============>..............] - ETA: 56s - loss: 1.2695 - regression_loss: 1.0738 - classification_loss: 0.1957 276/500 [===============>..............] - ETA: 56s - loss: 1.2698 - regression_loss: 1.0743 - classification_loss: 0.1955 277/500 [===============>..............] - ETA: 55s - loss: 1.2697 - regression_loss: 1.0743 - classification_loss: 0.1954 278/500 [===============>..............] - ETA: 55s - loss: 1.2692 - regression_loss: 1.0740 - classification_loss: 0.1952 279/500 [===============>..............] - ETA: 55s - loss: 1.2662 - regression_loss: 1.0716 - classification_loss: 0.1947 280/500 [===============>..............] - ETA: 55s - loss: 1.2671 - regression_loss: 1.0722 - classification_loss: 0.1949 281/500 [===============>..............] 
- ETA: 54s - loss: 1.2676 - regression_loss: 1.0727 - classification_loss: 0.1949 282/500 [===============>..............] - ETA: 54s - loss: 1.2679 - regression_loss: 1.0728 - classification_loss: 0.1950 283/500 [===============>..............] - ETA: 54s - loss: 1.2667 - regression_loss: 1.0719 - classification_loss: 0.1949 284/500 [================>.............] - ETA: 54s - loss: 1.2674 - regression_loss: 1.0728 - classification_loss: 0.1947 285/500 [================>.............] - ETA: 53s - loss: 1.2667 - regression_loss: 1.0720 - classification_loss: 0.1947 286/500 [================>.............] - ETA: 53s - loss: 1.2652 - regression_loss: 1.0706 - classification_loss: 0.1946 287/500 [================>.............] - ETA: 53s - loss: 1.2637 - regression_loss: 1.0693 - classification_loss: 0.1943 288/500 [================>.............] - ETA: 53s - loss: 1.2627 - regression_loss: 1.0685 - classification_loss: 0.1942 289/500 [================>.............] - ETA: 52s - loss: 1.2655 - regression_loss: 1.0708 - classification_loss: 0.1947 290/500 [================>.............] - ETA: 52s - loss: 1.2632 - regression_loss: 1.0689 - classification_loss: 0.1943 291/500 [================>.............] - ETA: 52s - loss: 1.2653 - regression_loss: 1.0705 - classification_loss: 0.1947 292/500 [================>.............] - ETA: 52s - loss: 1.2670 - regression_loss: 1.0718 - classification_loss: 0.1951 293/500 [================>.............] - ETA: 51s - loss: 1.2668 - regression_loss: 1.0719 - classification_loss: 0.1950 294/500 [================>.............] - ETA: 51s - loss: 1.2661 - regression_loss: 1.0711 - classification_loss: 0.1949 295/500 [================>.............] - ETA: 51s - loss: 1.2644 - regression_loss: 1.0699 - classification_loss: 0.1944 296/500 [================>.............] - ETA: 51s - loss: 1.2659 - regression_loss: 1.0712 - classification_loss: 0.1947 297/500 [================>.............] 
- ETA: 50s - loss: 1.2656 - regression_loss: 1.0710 - classification_loss: 0.1946 298/500 [================>.............] - ETA: 50s - loss: 1.2658 - regression_loss: 1.0714 - classification_loss: 0.1944 299/500 [================>.............] - ETA: 50s - loss: 1.2663 - regression_loss: 1.0719 - classification_loss: 0.1944 300/500 [=================>............] - ETA: 50s - loss: 1.2650 - regression_loss: 1.0708 - classification_loss: 0.1942 301/500 [=================>............] - ETA: 49s - loss: 1.2659 - regression_loss: 1.0716 - classification_loss: 0.1943 302/500 [=================>............] - ETA: 49s - loss: 1.2642 - regression_loss: 1.0702 - classification_loss: 0.1940 303/500 [=================>............] - ETA: 49s - loss: 1.2666 - regression_loss: 1.0719 - classification_loss: 0.1947 304/500 [=================>............] - ETA: 49s - loss: 1.2687 - regression_loss: 1.0736 - classification_loss: 0.1951 305/500 [=================>............] - ETA: 48s - loss: 1.2670 - regression_loss: 1.0722 - classification_loss: 0.1948 306/500 [=================>............] - ETA: 48s - loss: 1.2676 - regression_loss: 1.0729 - classification_loss: 0.1947 307/500 [=================>............] - ETA: 48s - loss: 1.2668 - regression_loss: 1.0723 - classification_loss: 0.1945 308/500 [=================>............] - ETA: 48s - loss: 1.2690 - regression_loss: 1.0742 - classification_loss: 0.1948 309/500 [=================>............] - ETA: 47s - loss: 1.2688 - regression_loss: 1.0743 - classification_loss: 0.1945 310/500 [=================>............] - ETA: 47s - loss: 1.2701 - regression_loss: 1.0754 - classification_loss: 0.1947 311/500 [=================>............] - ETA: 47s - loss: 1.2714 - regression_loss: 1.0766 - classification_loss: 0.1949 312/500 [=================>............] - ETA: 47s - loss: 1.2727 - regression_loss: 1.0778 - classification_loss: 0.1949 313/500 [=================>............] 
- ETA: 46s - loss: 1.2707 - regression_loss: 1.0762 - classification_loss: 0.1945 314/500 [=================>............] - ETA: 46s - loss: 1.2714 - regression_loss: 1.0769 - classification_loss: 0.1945 315/500 [=================>............] - ETA: 46s - loss: 1.2721 - regression_loss: 1.0775 - classification_loss: 0.1946 316/500 [=================>............] - ETA: 46s - loss: 1.2724 - regression_loss: 1.0780 - classification_loss: 0.1945 317/500 [==================>...........] - ETA: 45s - loss: 1.2728 - regression_loss: 1.0783 - classification_loss: 0.1945 318/500 [==================>...........] - ETA: 45s - loss: 1.2741 - regression_loss: 1.0794 - classification_loss: 0.1946 319/500 [==================>...........] - ETA: 45s - loss: 1.2730 - regression_loss: 1.0785 - classification_loss: 0.1945 320/500 [==================>...........] - ETA: 45s - loss: 1.2732 - regression_loss: 1.0788 - classification_loss: 0.1944 321/500 [==================>...........] - ETA: 44s - loss: 1.2721 - regression_loss: 1.0781 - classification_loss: 0.1941 322/500 [==================>...........] - ETA: 44s - loss: 1.2724 - regression_loss: 1.0783 - classification_loss: 0.1941 323/500 [==================>...........] - ETA: 44s - loss: 1.2721 - regression_loss: 1.0779 - classification_loss: 0.1942 324/500 [==================>...........] - ETA: 44s - loss: 1.2730 - regression_loss: 1.0789 - classification_loss: 0.1941 325/500 [==================>...........] - ETA: 43s - loss: 1.2719 - regression_loss: 1.0779 - classification_loss: 0.1940 326/500 [==================>...........] - ETA: 43s - loss: 1.2734 - regression_loss: 1.0790 - classification_loss: 0.1944 327/500 [==================>...........] - ETA: 43s - loss: 1.2728 - regression_loss: 1.0784 - classification_loss: 0.1943 328/500 [==================>...........] - ETA: 43s - loss: 1.2739 - regression_loss: 1.0791 - classification_loss: 0.1947 329/500 [==================>...........] 
- ETA: 42s - loss: 1.2746 - regression_loss: 1.0796 - classification_loss: 0.1950 330/500 [==================>...........] - ETA: 42s - loss: 1.2724 - regression_loss: 1.0778 - classification_loss: 0.1947 331/500 [==================>...........] - ETA: 42s - loss: 1.2737 - regression_loss: 1.0789 - classification_loss: 0.1948 332/500 [==================>...........] - ETA: 42s - loss: 1.2754 - regression_loss: 1.0804 - classification_loss: 0.1950 333/500 [==================>...........] - ETA: 41s - loss: 1.2743 - regression_loss: 1.0793 - classification_loss: 0.1949 334/500 [===================>..........] - ETA: 41s - loss: 1.2731 - regression_loss: 1.0786 - classification_loss: 0.1946 335/500 [===================>..........] - ETA: 41s - loss: 1.2743 - regression_loss: 1.0796 - classification_loss: 0.1947 336/500 [===================>..........] - ETA: 41s - loss: 1.2721 - regression_loss: 1.0778 - classification_loss: 0.1943 337/500 [===================>..........] - ETA: 40s - loss: 1.2714 - regression_loss: 1.0771 - classification_loss: 0.1943 338/500 [===================>..........] - ETA: 40s - loss: 1.2694 - regression_loss: 1.0755 - classification_loss: 0.1939 339/500 [===================>..........] - ETA: 40s - loss: 1.2698 - regression_loss: 1.0758 - classification_loss: 0.1940 340/500 [===================>..........] - ETA: 40s - loss: 1.2708 - regression_loss: 1.0766 - classification_loss: 0.1942 341/500 [===================>..........] - ETA: 39s - loss: 1.2701 - regression_loss: 1.0762 - classification_loss: 0.1940 342/500 [===================>..........] - ETA: 39s - loss: 1.2702 - regression_loss: 1.0761 - classification_loss: 0.1941 343/500 [===================>..........] - ETA: 39s - loss: 1.2694 - regression_loss: 1.0755 - classification_loss: 0.1938 344/500 [===================>..........] - ETA: 39s - loss: 1.2696 - regression_loss: 1.0757 - classification_loss: 0.1939 345/500 [===================>..........] 
- ETA: 38s - loss: 1.2698 - regression_loss: 1.0760 - classification_loss: 0.1938 346/500 [===================>..........] - ETA: 38s - loss: 1.2694 - regression_loss: 1.0757 - classification_loss: 0.1937 347/500 [===================>..........] - ETA: 38s - loss: 1.2704 - regression_loss: 1.0766 - classification_loss: 0.1939 348/500 [===================>..........] - ETA: 38s - loss: 1.2692 - regression_loss: 1.0757 - classification_loss: 0.1935 349/500 [===================>..........] - ETA: 37s - loss: 1.2681 - regression_loss: 1.0749 - classification_loss: 0.1932 350/500 [====================>.........] - ETA: 37s - loss: 1.2667 - regression_loss: 1.0738 - classification_loss: 0.1929 351/500 [====================>.........] - ETA: 37s - loss: 1.2688 - regression_loss: 1.0754 - classification_loss: 0.1934 352/500 [====================>.........] - ETA: 37s - loss: 1.2696 - regression_loss: 1.0761 - classification_loss: 0.1936 353/500 [====================>.........] - ETA: 36s - loss: 1.2701 - regression_loss: 1.0764 - classification_loss: 0.1937 354/500 [====================>.........] - ETA: 36s - loss: 1.2696 - regression_loss: 1.0762 - classification_loss: 0.1934 355/500 [====================>.........] - ETA: 36s - loss: 1.2693 - regression_loss: 1.0756 - classification_loss: 0.1937 356/500 [====================>.........] - ETA: 36s - loss: 1.2693 - regression_loss: 1.0757 - classification_loss: 0.1936 357/500 [====================>.........] - ETA: 35s - loss: 1.2703 - regression_loss: 1.0765 - classification_loss: 0.1938 358/500 [====================>.........] - ETA: 35s - loss: 1.2710 - regression_loss: 1.0768 - classification_loss: 0.1942 359/500 [====================>.........] - ETA: 35s - loss: 1.2725 - regression_loss: 1.0779 - classification_loss: 0.1946 360/500 [====================>.........] - ETA: 35s - loss: 1.2707 - regression_loss: 1.0765 - classification_loss: 0.1943 361/500 [====================>.........] 
- ETA: 34s - loss: 1.2711 - regression_loss: 1.0768 - classification_loss: 0.1942 362/500 [====================>.........] - ETA: 34s - loss: 1.2733 - regression_loss: 1.0782 - classification_loss: 0.1951 363/500 [====================>.........] - ETA: 34s - loss: 1.2758 - regression_loss: 1.0805 - classification_loss: 0.1954 364/500 [====================>.........] - ETA: 34s - loss: 1.2744 - regression_loss: 1.0793 - classification_loss: 0.1951 365/500 [====================>.........] - ETA: 33s - loss: 1.2753 - regression_loss: 1.0800 - classification_loss: 0.1953 366/500 [====================>.........] - ETA: 33s - loss: 1.2764 - regression_loss: 1.0810 - classification_loss: 0.1954 367/500 [=====================>........] - ETA: 33s - loss: 1.2774 - regression_loss: 1.0817 - classification_loss: 0.1958 368/500 [=====================>........] - ETA: 33s - loss: 1.2792 - regression_loss: 1.0831 - classification_loss: 0.1961 369/500 [=====================>........] - ETA: 32s - loss: 1.2793 - regression_loss: 1.0832 - classification_loss: 0.1961 370/500 [=====================>........] - ETA: 32s - loss: 1.2805 - regression_loss: 1.0843 - classification_loss: 0.1962 371/500 [=====================>........] - ETA: 32s - loss: 1.2810 - regression_loss: 1.0848 - classification_loss: 0.1962 372/500 [=====================>........] - ETA: 32s - loss: 1.2808 - regression_loss: 1.0847 - classification_loss: 0.1961 373/500 [=====================>........] - ETA: 31s - loss: 1.2808 - regression_loss: 1.0847 - classification_loss: 0.1961 374/500 [=====================>........] - ETA: 31s - loss: 1.2801 - regression_loss: 1.0842 - classification_loss: 0.1958 375/500 [=====================>........] - ETA: 31s - loss: 1.2810 - regression_loss: 1.0851 - classification_loss: 0.1958 376/500 [=====================>........] - ETA: 31s - loss: 1.2817 - regression_loss: 1.0857 - classification_loss: 0.1960 377/500 [=====================>........] 
- ETA: 30s - loss: 1.2818 - regression_loss: 1.0857 - classification_loss: 0.1961 378/500 [=====================>........] - ETA: 30s - loss: 1.2825 - regression_loss: 1.0863 - classification_loss: 0.1962 379/500 [=====================>........] - ETA: 30s - loss: 1.2803 - regression_loss: 1.0845 - classification_loss: 0.1958 380/500 [=====================>........] - ETA: 30s - loss: 1.2804 - regression_loss: 1.0846 - classification_loss: 0.1959 381/500 [=====================>........] - ETA: 29s - loss: 1.2785 - regression_loss: 1.0830 - classification_loss: 0.1955 382/500 [=====================>........] - ETA: 29s - loss: 1.2783 - regression_loss: 1.0828 - classification_loss: 0.1955 383/500 [=====================>........] - ETA: 29s - loss: 1.2797 - regression_loss: 1.0839 - classification_loss: 0.1958 384/500 [======================>.......] - ETA: 29s - loss: 1.2794 - regression_loss: 1.0838 - classification_loss: 0.1957 385/500 [======================>.......] - ETA: 28s - loss: 1.2796 - regression_loss: 1.0839 - classification_loss: 0.1957 386/500 [======================>.......] - ETA: 28s - loss: 1.2809 - regression_loss: 1.0849 - classification_loss: 0.1960 387/500 [======================>.......] - ETA: 28s - loss: 1.2812 - regression_loss: 1.0853 - classification_loss: 0.1960 388/500 [======================>.......] - ETA: 28s - loss: 1.2815 - regression_loss: 1.0856 - classification_loss: 0.1960 389/500 [======================>.......] - ETA: 27s - loss: 1.2820 - regression_loss: 1.0858 - classification_loss: 0.1962 390/500 [======================>.......] - ETA: 27s - loss: 1.2825 - regression_loss: 1.0862 - classification_loss: 0.1963 391/500 [======================>.......] - ETA: 27s - loss: 1.2846 - regression_loss: 1.0878 - classification_loss: 0.1967 392/500 [======================>.......] - ETA: 27s - loss: 1.2841 - regression_loss: 1.0875 - classification_loss: 0.1966 393/500 [======================>.......] 
- ETA: 26s - loss: 1.2828 - regression_loss: 1.0865 - classification_loss: 0.1963 394/500 [======================>.......] - ETA: 26s - loss: 1.2807 - regression_loss: 1.0847 - classification_loss: 0.1960 395/500 [======================>.......] - ETA: 26s - loss: 1.2807 - regression_loss: 1.0847 - classification_loss: 0.1960 396/500 [======================>.......] - ETA: 26s - loss: 1.2803 - regression_loss: 1.0844 - classification_loss: 0.1959 397/500 [======================>.......] - ETA: 25s - loss: 1.2784 - regression_loss: 1.0828 - classification_loss: 0.1955 398/500 [======================>.......] - ETA: 25s - loss: 1.2797 - regression_loss: 1.0840 - classification_loss: 0.1958 399/500 [======================>.......] - ETA: 25s - loss: 1.2798 - regression_loss: 1.0841 - classification_loss: 0.1957 400/500 [=======================>......] - ETA: 25s - loss: 1.2801 - regression_loss: 1.0845 - classification_loss: 0.1956 401/500 [=======================>......] - ETA: 24s - loss: 1.2780 - regression_loss: 1.0827 - classification_loss: 0.1953 402/500 [=======================>......] - ETA: 24s - loss: 1.2783 - regression_loss: 1.0831 - classification_loss: 0.1952 403/500 [=======================>......] - ETA: 24s - loss: 1.2764 - regression_loss: 1.0815 - classification_loss: 0.1949 404/500 [=======================>......] - ETA: 24s - loss: 1.2775 - regression_loss: 1.0823 - classification_loss: 0.1952 405/500 [=======================>......] - ETA: 23s - loss: 1.2783 - regression_loss: 1.0831 - classification_loss: 0.1952 406/500 [=======================>......] - ETA: 23s - loss: 1.2782 - regression_loss: 1.0831 - classification_loss: 0.1951 407/500 [=======================>......] - ETA: 23s - loss: 1.2785 - regression_loss: 1.0833 - classification_loss: 0.1952 408/500 [=======================>......] - ETA: 23s - loss: 1.2795 - regression_loss: 1.0841 - classification_loss: 0.1953 409/500 [=======================>......] 
- ETA: 22s - loss: 1.2803 - regression_loss: 1.0846 - classification_loss: 0.1957 410/500 [=======================>......] - ETA: 22s - loss: 1.2812 - regression_loss: 1.0853 - classification_loss: 0.1959 411/500 [=======================>......] - ETA: 22s - loss: 1.2821 - regression_loss: 1.0859 - classification_loss: 0.1962 412/500 [=======================>......] - ETA: 22s - loss: 1.2827 - regression_loss: 1.0865 - classification_loss: 0.1962 413/500 [=======================>......] - ETA: 21s - loss: 1.2815 - regression_loss: 1.0854 - classification_loss: 0.1961 414/500 [=======================>......] - ETA: 21s - loss: 1.2805 - regression_loss: 1.0846 - classification_loss: 0.1959 415/500 [=======================>......] - ETA: 21s - loss: 1.2784 - regression_loss: 1.0829 - classification_loss: 0.1955 416/500 [=======================>......] - ETA: 21s - loss: 1.2767 - regression_loss: 1.0815 - classification_loss: 0.1952 417/500 [========================>.....] - ETA: 20s - loss: 1.2770 - regression_loss: 1.0818 - classification_loss: 0.1952 418/500 [========================>.....] - ETA: 20s - loss: 1.2783 - regression_loss: 1.0830 - classification_loss: 0.1954 419/500 [========================>.....] - ETA: 20s - loss: 1.2784 - regression_loss: 1.0829 - classification_loss: 0.1954 420/500 [========================>.....] - ETA: 20s - loss: 1.2784 - regression_loss: 1.0828 - classification_loss: 0.1956 421/500 [========================>.....] - ETA: 19s - loss: 1.2763 - regression_loss: 1.0811 - classification_loss: 0.1952 422/500 [========================>.....] - ETA: 19s - loss: 1.2768 - regression_loss: 1.0816 - classification_loss: 0.1953 423/500 [========================>.....] - ETA: 19s - loss: 1.2772 - regression_loss: 1.0819 - classification_loss: 0.1953 424/500 [========================>.....] - ETA: 19s - loss: 1.2765 - regression_loss: 1.0813 - classification_loss: 0.1952 425/500 [========================>.....] 
- ETA: 18s - loss: 1.2771 - regression_loss: 1.0817 - classification_loss: 0.1953 426/500 [========================>.....] - ETA: 18s - loss: 1.2768 - regression_loss: 1.0813 - classification_loss: 0.1955 427/500 [========================>.....] - ETA: 18s - loss: 1.2767 - regression_loss: 1.0812 - classification_loss: 0.1955 428/500 [========================>.....] - ETA: 18s - loss: 1.2782 - regression_loss: 1.0824 - classification_loss: 0.1958 429/500 [========================>.....] - ETA: 17s - loss: 1.2792 - regression_loss: 1.0834 - classification_loss: 0.1958 430/500 [========================>.....] - ETA: 17s - loss: 1.2800 - regression_loss: 1.0839 - classification_loss: 0.1960 431/500 [========================>.....] - ETA: 17s - loss: 1.2811 - regression_loss: 1.0847 - classification_loss: 0.1964 432/500 [========================>.....] - ETA: 17s - loss: 1.2817 - regression_loss: 1.0852 - classification_loss: 0.1965 433/500 [========================>.....] - ETA: 16s - loss: 1.2823 - regression_loss: 1.0858 - classification_loss: 0.1965 434/500 [=========================>....] - ETA: 16s - loss: 1.2825 - regression_loss: 1.0860 - classification_loss: 0.1965 435/500 [=========================>....] - ETA: 16s - loss: 1.2828 - regression_loss: 1.0863 - classification_loss: 0.1964 436/500 [=========================>....] - ETA: 16s - loss: 1.2822 - regression_loss: 1.0858 - classification_loss: 0.1963 437/500 [=========================>....] - ETA: 15s - loss: 1.2828 - regression_loss: 1.0864 - classification_loss: 0.1964 438/500 [=========================>....] - ETA: 15s - loss: 1.2819 - regression_loss: 1.0857 - classification_loss: 0.1962 439/500 [=========================>....] - ETA: 15s - loss: 1.2832 - regression_loss: 1.0868 - classification_loss: 0.1964 440/500 [=========================>....] - ETA: 15s - loss: 1.2831 - regression_loss: 1.0868 - classification_loss: 0.1964 441/500 [=========================>....] 
[per-batch progress output truncated: steps 441–499 of epoch 122, loss ≈ 1.27–1.28]
500/500 [==============================] - 125s 251ms/step - loss: 1.2746 - regression_loss: 1.0799 - classification_loss: 0.1946
1172 instances of class plum with average precision: 0.6991
mAP: 0.6991
Epoch 00122: saving model to ./training/snapshots/resnet50_pascal_122.h5
Epoch 123/150
[per-batch progress output truncated: steps 1–276 of epoch 123, loss ≈ 1.18–1.35]
- ETA: 56s - loss: 1.2558 - regression_loss: 1.0655 - classification_loss: 0.1903 277/500 [===============>..............] - ETA: 55s - loss: 1.2564 - regression_loss: 1.0657 - classification_loss: 0.1907 278/500 [===============>..............] - ETA: 55s - loss: 1.2571 - regression_loss: 1.0664 - classification_loss: 0.1907 279/500 [===============>..............] - ETA: 55s - loss: 1.2546 - regression_loss: 1.0643 - classification_loss: 0.1903 280/500 [===============>..............] - ETA: 55s - loss: 1.2518 - regression_loss: 1.0621 - classification_loss: 0.1898 281/500 [===============>..............] - ETA: 54s - loss: 1.2509 - regression_loss: 1.0613 - classification_loss: 0.1896 282/500 [===============>..............] - ETA: 54s - loss: 1.2506 - regression_loss: 1.0611 - classification_loss: 0.1895 283/500 [===============>..............] - ETA: 54s - loss: 1.2515 - regression_loss: 1.0619 - classification_loss: 0.1896 284/500 [================>.............] - ETA: 54s - loss: 1.2504 - regression_loss: 1.0611 - classification_loss: 0.1893 285/500 [================>.............] - ETA: 53s - loss: 1.2512 - regression_loss: 1.0619 - classification_loss: 0.1893 286/500 [================>.............] - ETA: 53s - loss: 1.2510 - regression_loss: 1.0616 - classification_loss: 0.1895 287/500 [================>.............] - ETA: 53s - loss: 1.2504 - regression_loss: 1.0611 - classification_loss: 0.1894 288/500 [================>.............] - ETA: 53s - loss: 1.2518 - regression_loss: 1.0621 - classification_loss: 0.1897 289/500 [================>.............] - ETA: 52s - loss: 1.2535 - regression_loss: 1.0636 - classification_loss: 0.1900 290/500 [================>.............] - ETA: 52s - loss: 1.2549 - regression_loss: 1.0647 - classification_loss: 0.1902 291/500 [================>.............] - ETA: 52s - loss: 1.2547 - regression_loss: 1.0645 - classification_loss: 0.1902 292/500 [================>.............] 
- ETA: 52s - loss: 1.2546 - regression_loss: 1.0644 - classification_loss: 0.1902 293/500 [================>.............] - ETA: 51s - loss: 1.2556 - regression_loss: 1.0652 - classification_loss: 0.1904 294/500 [================>.............] - ETA: 51s - loss: 1.2574 - regression_loss: 1.0664 - classification_loss: 0.1910 295/500 [================>.............] - ETA: 51s - loss: 1.2562 - regression_loss: 1.0653 - classification_loss: 0.1909 296/500 [================>.............] - ETA: 51s - loss: 1.2566 - regression_loss: 1.0656 - classification_loss: 0.1910 297/500 [================>.............] - ETA: 50s - loss: 1.2568 - regression_loss: 1.0658 - classification_loss: 0.1910 298/500 [================>.............] - ETA: 50s - loss: 1.2571 - regression_loss: 1.0661 - classification_loss: 0.1911 299/500 [================>.............] - ETA: 50s - loss: 1.2585 - regression_loss: 1.0673 - classification_loss: 0.1912 300/500 [=================>............] - ETA: 50s - loss: 1.2579 - regression_loss: 1.0666 - classification_loss: 0.1913 301/500 [=================>............] - ETA: 49s - loss: 1.2572 - regression_loss: 1.0661 - classification_loss: 0.1910 302/500 [=================>............] - ETA: 49s - loss: 1.2561 - regression_loss: 1.0653 - classification_loss: 0.1909 303/500 [=================>............] - ETA: 49s - loss: 1.2555 - regression_loss: 1.0648 - classification_loss: 0.1907 304/500 [=================>............] - ETA: 49s - loss: 1.2561 - regression_loss: 1.0654 - classification_loss: 0.1907 305/500 [=================>............] - ETA: 48s - loss: 1.2556 - regression_loss: 1.0652 - classification_loss: 0.1905 306/500 [=================>............] - ETA: 48s - loss: 1.2538 - regression_loss: 1.0638 - classification_loss: 0.1900 307/500 [=================>............] - ETA: 48s - loss: 1.2533 - regression_loss: 1.0635 - classification_loss: 0.1898 308/500 [=================>............] 
- ETA: 48s - loss: 1.2547 - regression_loss: 1.0646 - classification_loss: 0.1901 309/500 [=================>............] - ETA: 47s - loss: 1.2553 - regression_loss: 1.0650 - classification_loss: 0.1903 310/500 [=================>............] - ETA: 47s - loss: 1.2548 - regression_loss: 1.0647 - classification_loss: 0.1901 311/500 [=================>............] - ETA: 47s - loss: 1.2529 - regression_loss: 1.0632 - classification_loss: 0.1898 312/500 [=================>............] - ETA: 47s - loss: 1.2518 - regression_loss: 1.0624 - classification_loss: 0.1894 313/500 [=================>............] - ETA: 46s - loss: 1.2492 - regression_loss: 1.0603 - classification_loss: 0.1889 314/500 [=================>............] - ETA: 46s - loss: 1.2474 - regression_loss: 1.0589 - classification_loss: 0.1885 315/500 [=================>............] - ETA: 46s - loss: 1.2479 - regression_loss: 1.0594 - classification_loss: 0.1885 316/500 [=================>............] - ETA: 46s - loss: 1.2486 - regression_loss: 1.0601 - classification_loss: 0.1885 317/500 [==================>...........] - ETA: 45s - loss: 1.2463 - regression_loss: 1.0583 - classification_loss: 0.1881 318/500 [==================>...........] - ETA: 45s - loss: 1.2457 - regression_loss: 1.0579 - classification_loss: 0.1879 319/500 [==================>...........] - ETA: 45s - loss: 1.2465 - regression_loss: 1.0585 - classification_loss: 0.1880 320/500 [==================>...........] - ETA: 45s - loss: 1.2453 - regression_loss: 1.0576 - classification_loss: 0.1877 321/500 [==================>...........] - ETA: 44s - loss: 1.2428 - regression_loss: 1.0555 - classification_loss: 0.1873 322/500 [==================>...........] - ETA: 44s - loss: 1.2437 - regression_loss: 1.0564 - classification_loss: 0.1873 323/500 [==================>...........] - ETA: 44s - loss: 1.2460 - regression_loss: 1.0582 - classification_loss: 0.1877 324/500 [==================>...........] 
- ETA: 44s - loss: 1.2468 - regression_loss: 1.0589 - classification_loss: 0.1879 325/500 [==================>...........] - ETA: 43s - loss: 1.2459 - regression_loss: 1.0580 - classification_loss: 0.1878 326/500 [==================>...........] - ETA: 43s - loss: 1.2470 - regression_loss: 1.0590 - classification_loss: 0.1880 327/500 [==================>...........] - ETA: 43s - loss: 1.2456 - regression_loss: 1.0579 - classification_loss: 0.1877 328/500 [==================>...........] - ETA: 43s - loss: 1.2430 - regression_loss: 1.0557 - classification_loss: 0.1873 329/500 [==================>...........] - ETA: 42s - loss: 1.2421 - regression_loss: 1.0550 - classification_loss: 0.1871 330/500 [==================>...........] - ETA: 42s - loss: 1.2430 - regression_loss: 1.0557 - classification_loss: 0.1873 331/500 [==================>...........] - ETA: 42s - loss: 1.2442 - regression_loss: 1.0567 - classification_loss: 0.1875 332/500 [==================>...........] - ETA: 42s - loss: 1.2443 - regression_loss: 1.0568 - classification_loss: 0.1875 333/500 [==================>...........] - ETA: 41s - loss: 1.2445 - regression_loss: 1.0568 - classification_loss: 0.1876 334/500 [===================>..........] - ETA: 41s - loss: 1.2439 - regression_loss: 1.0565 - classification_loss: 0.1875 335/500 [===================>..........] - ETA: 41s - loss: 1.2431 - regression_loss: 1.0557 - classification_loss: 0.1874 336/500 [===================>..........] - ETA: 41s - loss: 1.2430 - regression_loss: 1.0560 - classification_loss: 0.1871 337/500 [===================>..........] - ETA: 40s - loss: 1.2420 - regression_loss: 1.0554 - classification_loss: 0.1866 338/500 [===================>..........] - ETA: 40s - loss: 1.2437 - regression_loss: 1.0567 - classification_loss: 0.1870 339/500 [===================>..........] - ETA: 40s - loss: 1.2440 - regression_loss: 1.0568 - classification_loss: 0.1872 340/500 [===================>..........] 
- ETA: 40s - loss: 1.2449 - regression_loss: 1.0577 - classification_loss: 0.1872 341/500 [===================>..........] - ETA: 39s - loss: 1.2433 - regression_loss: 1.0562 - classification_loss: 0.1871 342/500 [===================>..........] - ETA: 39s - loss: 1.2419 - regression_loss: 1.0551 - classification_loss: 0.1868 343/500 [===================>..........] - ETA: 39s - loss: 1.2424 - regression_loss: 1.0556 - classification_loss: 0.1868 344/500 [===================>..........] - ETA: 39s - loss: 1.2429 - regression_loss: 1.0560 - classification_loss: 0.1869 345/500 [===================>..........] - ETA: 38s - loss: 1.2425 - regression_loss: 1.0555 - classification_loss: 0.1870 346/500 [===================>..........] - ETA: 38s - loss: 1.2433 - regression_loss: 1.0561 - classification_loss: 0.1871 347/500 [===================>..........] - ETA: 38s - loss: 1.2427 - regression_loss: 1.0555 - classification_loss: 0.1872 348/500 [===================>..........] - ETA: 38s - loss: 1.2427 - regression_loss: 1.0554 - classification_loss: 0.1873 349/500 [===================>..........] - ETA: 37s - loss: 1.2422 - regression_loss: 1.0551 - classification_loss: 0.1871 350/500 [====================>.........] - ETA: 37s - loss: 1.2431 - regression_loss: 1.0558 - classification_loss: 0.1873 351/500 [====================>.........] - ETA: 37s - loss: 1.2429 - regression_loss: 1.0557 - classification_loss: 0.1872 352/500 [====================>.........] - ETA: 37s - loss: 1.2434 - regression_loss: 1.0562 - classification_loss: 0.1872 353/500 [====================>.........] - ETA: 36s - loss: 1.2439 - regression_loss: 1.0566 - classification_loss: 0.1873 354/500 [====================>.........] - ETA: 36s - loss: 1.2428 - regression_loss: 1.0558 - classification_loss: 0.1870 355/500 [====================>.........] - ETA: 36s - loss: 1.2434 - regression_loss: 1.0562 - classification_loss: 0.1871 356/500 [====================>.........] 
- ETA: 36s - loss: 1.2451 - regression_loss: 1.0576 - classification_loss: 0.1875 357/500 [====================>.........] - ETA: 35s - loss: 1.2450 - regression_loss: 1.0576 - classification_loss: 0.1874 358/500 [====================>.........] - ETA: 35s - loss: 1.2422 - regression_loss: 1.0553 - classification_loss: 0.1869 359/500 [====================>.........] - ETA: 35s - loss: 1.2437 - regression_loss: 1.0564 - classification_loss: 0.1873 360/500 [====================>.........] - ETA: 35s - loss: 1.2459 - regression_loss: 1.0581 - classification_loss: 0.1878 361/500 [====================>.........] - ETA: 34s - loss: 1.2443 - regression_loss: 1.0567 - classification_loss: 0.1877 362/500 [====================>.........] - ETA: 34s - loss: 1.2433 - regression_loss: 1.0558 - classification_loss: 0.1875 363/500 [====================>.........] - ETA: 34s - loss: 1.2427 - regression_loss: 1.0554 - classification_loss: 0.1872 364/500 [====================>.........] - ETA: 34s - loss: 1.2440 - regression_loss: 1.0565 - classification_loss: 0.1875 365/500 [====================>.........] - ETA: 33s - loss: 1.2456 - regression_loss: 1.0582 - classification_loss: 0.1874 366/500 [====================>.........] - ETA: 33s - loss: 1.2431 - regression_loss: 1.0561 - classification_loss: 0.1870 367/500 [=====================>........] - ETA: 33s - loss: 1.2440 - regression_loss: 1.0567 - classification_loss: 0.1873 368/500 [=====================>........] - ETA: 33s - loss: 1.2444 - regression_loss: 1.0570 - classification_loss: 0.1874 369/500 [=====================>........] - ETA: 32s - loss: 1.2431 - regression_loss: 1.0559 - classification_loss: 0.1871 370/500 [=====================>........] - ETA: 32s - loss: 1.2436 - regression_loss: 1.0563 - classification_loss: 0.1872 371/500 [=====================>........] - ETA: 32s - loss: 1.2443 - regression_loss: 1.0570 - classification_loss: 0.1873 372/500 [=====================>........] 
- ETA: 32s - loss: 1.2428 - regression_loss: 1.0558 - classification_loss: 0.1870 373/500 [=====================>........] - ETA: 31s - loss: 1.2417 - regression_loss: 1.0549 - classification_loss: 0.1868 374/500 [=====================>........] - ETA: 31s - loss: 1.2428 - regression_loss: 1.0557 - classification_loss: 0.1871 375/500 [=====================>........] - ETA: 31s - loss: 1.2429 - regression_loss: 1.0558 - classification_loss: 0.1871 376/500 [=====================>........] - ETA: 31s - loss: 1.2427 - regression_loss: 1.0557 - classification_loss: 0.1870 377/500 [=====================>........] - ETA: 30s - loss: 1.2437 - regression_loss: 1.0566 - classification_loss: 0.1871 378/500 [=====================>........] - ETA: 30s - loss: 1.2414 - regression_loss: 1.0547 - classification_loss: 0.1867 379/500 [=====================>........] - ETA: 30s - loss: 1.2416 - regression_loss: 1.0548 - classification_loss: 0.1867 380/500 [=====================>........] - ETA: 30s - loss: 1.2404 - regression_loss: 1.0538 - classification_loss: 0.1866 381/500 [=====================>........] - ETA: 29s - loss: 1.2406 - regression_loss: 1.0539 - classification_loss: 0.1866 382/500 [=====================>........] - ETA: 29s - loss: 1.2412 - regression_loss: 1.0546 - classification_loss: 0.1866 383/500 [=====================>........] - ETA: 29s - loss: 1.2406 - regression_loss: 1.0541 - classification_loss: 0.1865 384/500 [======================>.......] - ETA: 29s - loss: 1.2405 - regression_loss: 1.0541 - classification_loss: 0.1864 385/500 [======================>.......] - ETA: 28s - loss: 1.2392 - regression_loss: 1.0530 - classification_loss: 0.1861 386/500 [======================>.......] - ETA: 28s - loss: 1.2403 - regression_loss: 1.0539 - classification_loss: 0.1864 387/500 [======================>.......] - ETA: 28s - loss: 1.2407 - regression_loss: 1.0543 - classification_loss: 0.1864 388/500 [======================>.......] 
- ETA: 28s - loss: 1.2406 - regression_loss: 1.0542 - classification_loss: 0.1864 389/500 [======================>.......] - ETA: 27s - loss: 1.2394 - regression_loss: 1.0530 - classification_loss: 0.1863 390/500 [======================>.......] - ETA: 27s - loss: 1.2379 - regression_loss: 1.0519 - classification_loss: 0.1860 391/500 [======================>.......] - ETA: 27s - loss: 1.2388 - regression_loss: 1.0526 - classification_loss: 0.1862 392/500 [======================>.......] - ETA: 27s - loss: 1.2389 - regression_loss: 1.0527 - classification_loss: 0.1862 393/500 [======================>.......] - ETA: 26s - loss: 1.2389 - regression_loss: 1.0528 - classification_loss: 0.1861 394/500 [======================>.......] - ETA: 26s - loss: 1.2394 - regression_loss: 1.0534 - classification_loss: 0.1860 395/500 [======================>.......] - ETA: 26s - loss: 1.2383 - regression_loss: 1.0524 - classification_loss: 0.1859 396/500 [======================>.......] - ETA: 26s - loss: 1.2390 - regression_loss: 1.0530 - classification_loss: 0.1860 397/500 [======================>.......] - ETA: 25s - loss: 1.2392 - regression_loss: 1.0532 - classification_loss: 0.1860 398/500 [======================>.......] - ETA: 25s - loss: 1.2395 - regression_loss: 1.0534 - classification_loss: 0.1861 399/500 [======================>.......] - ETA: 25s - loss: 1.2391 - regression_loss: 1.0530 - classification_loss: 0.1861 400/500 [=======================>......] - ETA: 25s - loss: 1.2424 - regression_loss: 1.0556 - classification_loss: 0.1868 401/500 [=======================>......] - ETA: 24s - loss: 1.2433 - regression_loss: 1.0563 - classification_loss: 0.1870 402/500 [=======================>......] - ETA: 24s - loss: 1.2448 - regression_loss: 1.0575 - classification_loss: 0.1873 403/500 [=======================>......] - ETA: 24s - loss: 1.2436 - regression_loss: 1.0564 - classification_loss: 0.1872 404/500 [=======================>......] 
- ETA: 24s - loss: 1.2451 - regression_loss: 1.0577 - classification_loss: 0.1873 405/500 [=======================>......] - ETA: 23s - loss: 1.2463 - regression_loss: 1.0587 - classification_loss: 0.1876 406/500 [=======================>......] - ETA: 23s - loss: 1.2465 - regression_loss: 1.0589 - classification_loss: 0.1876 407/500 [=======================>......] - ETA: 23s - loss: 1.2471 - regression_loss: 1.0593 - classification_loss: 0.1878 408/500 [=======================>......] - ETA: 23s - loss: 1.2481 - regression_loss: 1.0603 - classification_loss: 0.1879 409/500 [=======================>......] - ETA: 22s - loss: 1.2471 - regression_loss: 1.0594 - classification_loss: 0.1877 410/500 [=======================>......] - ETA: 22s - loss: 1.2481 - regression_loss: 1.0600 - classification_loss: 0.1881 411/500 [=======================>......] - ETA: 22s - loss: 1.2488 - regression_loss: 1.0606 - classification_loss: 0.1882 412/500 [=======================>......] - ETA: 22s - loss: 1.2500 - regression_loss: 1.0616 - classification_loss: 0.1885 413/500 [=======================>......] - ETA: 21s - loss: 1.2506 - regression_loss: 1.0620 - classification_loss: 0.1887 414/500 [=======================>......] - ETA: 21s - loss: 1.2499 - regression_loss: 1.0614 - classification_loss: 0.1884 415/500 [=======================>......] - ETA: 21s - loss: 1.2500 - regression_loss: 1.0612 - classification_loss: 0.1888 416/500 [=======================>......] - ETA: 21s - loss: 1.2514 - regression_loss: 1.0624 - classification_loss: 0.1891 417/500 [========================>.....] - ETA: 20s - loss: 1.2500 - regression_loss: 1.0612 - classification_loss: 0.1888 418/500 [========================>.....] - ETA: 20s - loss: 1.2508 - regression_loss: 1.0617 - classification_loss: 0.1891 419/500 [========================>.....] - ETA: 20s - loss: 1.2507 - regression_loss: 1.0616 - classification_loss: 0.1891 420/500 [========================>.....] 
- ETA: 20s - loss: 1.2515 - regression_loss: 1.0623 - classification_loss: 0.1892 421/500 [========================>.....] - ETA: 19s - loss: 1.2524 - regression_loss: 1.0631 - classification_loss: 0.1893 422/500 [========================>.....] - ETA: 19s - loss: 1.2517 - regression_loss: 1.0626 - classification_loss: 0.1891 423/500 [========================>.....] - ETA: 19s - loss: 1.2524 - regression_loss: 1.0632 - classification_loss: 0.1892 424/500 [========================>.....] - ETA: 19s - loss: 1.2530 - regression_loss: 1.0637 - classification_loss: 0.1893 425/500 [========================>.....] - ETA: 18s - loss: 1.2523 - regression_loss: 1.0632 - classification_loss: 0.1891 426/500 [========================>.....] - ETA: 18s - loss: 1.2524 - regression_loss: 1.0633 - classification_loss: 0.1891 427/500 [========================>.....] - ETA: 18s - loss: 1.2511 - regression_loss: 1.0623 - classification_loss: 0.1888 428/500 [========================>.....] - ETA: 18s - loss: 1.2518 - regression_loss: 1.0629 - classification_loss: 0.1889 429/500 [========================>.....] - ETA: 17s - loss: 1.2499 - regression_loss: 1.0614 - classification_loss: 0.1886 430/500 [========================>.....] - ETA: 17s - loss: 1.2493 - regression_loss: 1.0609 - classification_loss: 0.1884 431/500 [========================>.....] - ETA: 17s - loss: 1.2496 - regression_loss: 1.0611 - classification_loss: 0.1884 432/500 [========================>.....] - ETA: 17s - loss: 1.2497 - regression_loss: 1.0611 - classification_loss: 0.1886 433/500 [========================>.....] - ETA: 16s - loss: 1.2481 - regression_loss: 1.0597 - classification_loss: 0.1884 434/500 [=========================>....] - ETA: 16s - loss: 1.2487 - regression_loss: 1.0603 - classification_loss: 0.1884 435/500 [=========================>....] - ETA: 16s - loss: 1.2482 - regression_loss: 1.0600 - classification_loss: 0.1882 436/500 [=========================>....] 
- ETA: 15s - loss: 1.2482 - regression_loss: 1.0601 - classification_loss: 0.1881 437/500 [=========================>....] - ETA: 15s - loss: 1.2504 - regression_loss: 1.0620 - classification_loss: 0.1885 438/500 [=========================>....] - ETA: 15s - loss: 1.2510 - regression_loss: 1.0625 - classification_loss: 0.1886 439/500 [=========================>....] - ETA: 15s - loss: 1.2505 - regression_loss: 1.0620 - classification_loss: 0.1885 440/500 [=========================>....] - ETA: 15s - loss: 1.2515 - regression_loss: 1.0628 - classification_loss: 0.1887 441/500 [=========================>....] - ETA: 14s - loss: 1.2514 - regression_loss: 1.0626 - classification_loss: 0.1888 442/500 [=========================>....] - ETA: 14s - loss: 1.2499 - regression_loss: 1.0614 - classification_loss: 0.1885 443/500 [=========================>....] - ETA: 14s - loss: 1.2508 - regression_loss: 1.0622 - classification_loss: 0.1886 444/500 [=========================>....] - ETA: 13s - loss: 1.2506 - regression_loss: 1.0620 - classification_loss: 0.1886 445/500 [=========================>....] - ETA: 13s - loss: 1.2506 - regression_loss: 1.0620 - classification_loss: 0.1886 446/500 [=========================>....] - ETA: 13s - loss: 1.2510 - regression_loss: 1.0623 - classification_loss: 0.1887 447/500 [=========================>....] - ETA: 13s - loss: 1.2513 - regression_loss: 1.0628 - classification_loss: 0.1886 448/500 [=========================>....] - ETA: 13s - loss: 1.2516 - regression_loss: 1.0628 - classification_loss: 0.1887 449/500 [=========================>....] - ETA: 12s - loss: 1.2534 - regression_loss: 1.0641 - classification_loss: 0.1892 450/500 [==========================>...] - ETA: 12s - loss: 1.2535 - regression_loss: 1.0644 - classification_loss: 0.1892 451/500 [==========================>...] - ETA: 12s - loss: 1.2553 - regression_loss: 1.0659 - classification_loss: 0.1894 452/500 [==========================>...] 
- ETA: 12s - loss: 1.2561 - regression_loss: 1.0666 - classification_loss: 0.1895 453/500 [==========================>...] - ETA: 11s - loss: 1.2567 - regression_loss: 1.0671 - classification_loss: 0.1896 454/500 [==========================>...] - ETA: 11s - loss: 1.2573 - regression_loss: 1.0678 - classification_loss: 0.1896 455/500 [==========================>...] - ETA: 11s - loss: 1.2557 - regression_loss: 1.0664 - classification_loss: 0.1893 456/500 [==========================>...] - ETA: 11s - loss: 1.2570 - regression_loss: 1.0674 - classification_loss: 0.1896 457/500 [==========================>...] - ETA: 10s - loss: 1.2576 - regression_loss: 1.0681 - classification_loss: 0.1895 458/500 [==========================>...] - ETA: 10s - loss: 1.2582 - regression_loss: 1.0686 - classification_loss: 0.1896 459/500 [==========================>...] - ETA: 10s - loss: 1.2565 - regression_loss: 1.0673 - classification_loss: 0.1893 460/500 [==========================>...] - ETA: 10s - loss: 1.2560 - regression_loss: 1.0669 - classification_loss: 0.1891 461/500 [==========================>...] - ETA: 9s - loss: 1.2568 - regression_loss: 1.0675 - classification_loss: 0.1892  462/500 [==========================>...] - ETA: 9s - loss: 1.2576 - regression_loss: 1.0682 - classification_loss: 0.1894 463/500 [==========================>...] - ETA: 9s - loss: 1.2579 - regression_loss: 1.0685 - classification_loss: 0.1894 464/500 [==========================>...] - ETA: 9s - loss: 1.2573 - regression_loss: 1.0681 - classification_loss: 0.1892 465/500 [==========================>...] - ETA: 8s - loss: 1.2556 - regression_loss: 1.0667 - classification_loss: 0.1889 466/500 [==========================>...] - ETA: 8s - loss: 1.2542 - regression_loss: 1.0655 - classification_loss: 0.1887 467/500 [===========================>..] - ETA: 8s - loss: 1.2546 - regression_loss: 1.0658 - classification_loss: 0.1887 468/500 [===========================>..] 
- ETA: 8s - loss: 1.2557 - regression_loss: 1.0668 - classification_loss: 0.1889 469/500 [===========================>..] - ETA: 7s - loss: 1.2562 - regression_loss: 1.0673 - classification_loss: 0.1890 470/500 [===========================>..] - ETA: 7s - loss: 1.2553 - regression_loss: 1.0666 - classification_loss: 0.1886 471/500 [===========================>..] - ETA: 7s - loss: 1.2541 - regression_loss: 1.0657 - classification_loss: 0.1883 472/500 [===========================>..] - ETA: 7s - loss: 1.2523 - regression_loss: 1.0643 - classification_loss: 0.1881 473/500 [===========================>..] - ETA: 6s - loss: 1.2532 - regression_loss: 1.0649 - classification_loss: 0.1883 474/500 [===========================>..] - ETA: 6s - loss: 1.2535 - regression_loss: 1.0652 - classification_loss: 0.1883 475/500 [===========================>..] - ETA: 6s - loss: 1.2539 - regression_loss: 1.0656 - classification_loss: 0.1884 476/500 [===========================>..] - ETA: 6s - loss: 1.2557 - regression_loss: 1.0670 - classification_loss: 0.1888 477/500 [===========================>..] - ETA: 5s - loss: 1.2569 - regression_loss: 1.0677 - classification_loss: 0.1892 478/500 [===========================>..] - ETA: 5s - loss: 1.2570 - regression_loss: 1.0678 - classification_loss: 0.1892 479/500 [===========================>..] - ETA: 5s - loss: 1.2560 - regression_loss: 1.0669 - classification_loss: 0.1891 480/500 [===========================>..] - ETA: 5s - loss: 1.2551 - regression_loss: 1.0662 - classification_loss: 0.1889 481/500 [===========================>..] - ETA: 4s - loss: 1.2543 - regression_loss: 1.0655 - classification_loss: 0.1888 482/500 [===========================>..] - ETA: 4s - loss: 1.2541 - regression_loss: 1.0654 - classification_loss: 0.1887 483/500 [===========================>..] - ETA: 4s - loss: 1.2548 - regression_loss: 1.0659 - classification_loss: 0.1889 484/500 [============================>.] 
- ETA: 4s - loss: 1.2550 - regression_loss: 1.0660 - classification_loss: 0.1890 485/500 [============================>.] - ETA: 3s - loss: 1.2546 - regression_loss: 1.0656 - classification_loss: 0.1889 486/500 [============================>.] - ETA: 3s - loss: 1.2546 - regression_loss: 1.0656 - classification_loss: 0.1890 487/500 [============================>.] - ETA: 3s - loss: 1.2546 - regression_loss: 1.0657 - classification_loss: 0.1890 488/500 [============================>.] - ETA: 3s - loss: 1.2547 - regression_loss: 1.0658 - classification_loss: 0.1889 489/500 [============================>.] - ETA: 2s - loss: 1.2542 - regression_loss: 1.0652 - classification_loss: 0.1890 490/500 [============================>.] - ETA: 2s - loss: 1.2549 - regression_loss: 1.0657 - classification_loss: 0.1892 491/500 [============================>.] - ETA: 2s - loss: 1.2550 - regression_loss: 1.0658 - classification_loss: 0.1892 492/500 [============================>.] - ETA: 2s - loss: 1.2548 - regression_loss: 1.0656 - classification_loss: 0.1892 493/500 [============================>.] - ETA: 1s - loss: 1.2541 - regression_loss: 1.0651 - classification_loss: 0.1890 494/500 [============================>.] - ETA: 1s - loss: 1.2529 - regression_loss: 1.0641 - classification_loss: 0.1888 495/500 [============================>.] - ETA: 1s - loss: 1.2520 - regression_loss: 1.0633 - classification_loss: 0.1887 496/500 [============================>.] - ETA: 1s - loss: 1.2511 - regression_loss: 1.0626 - classification_loss: 0.1885 497/500 [============================>.] - ETA: 0s - loss: 1.2504 - regression_loss: 1.0621 - classification_loss: 0.1883 498/500 [============================>.] - ETA: 0s - loss: 1.2514 - regression_loss: 1.0629 - classification_loss: 0.1885 499/500 [============================>.] 
500/500 [==============================] - 125s 250ms/step - loss: 1.2520 - regression_loss: 1.0633 - classification_loss: 0.1887
1172 instances of class plum with average precision: 0.6991
mAP: 0.6991
Epoch 00123: saving model to ./training/snapshots/resnet50_pascal_123.h5
Epoch 124/150
[... per-batch progress redraws for steps 1-45 of epoch 124 elided; after early fluctuation (loss ~1.03-1.28) the running loss settled around 1.26 ...]
- ETA: 1:54 - loss: 1.2783 - regression_loss: 1.0873 - classification_loss: 0.1910 47/500 [=>............................] - ETA: 1:54 - loss: 1.2825 - regression_loss: 1.0900 - classification_loss: 0.1925 48/500 [=>............................] - ETA: 1:54 - loss: 1.2745 - regression_loss: 1.0826 - classification_loss: 0.1920 49/500 [=>............................] - ETA: 1:53 - loss: 1.2783 - regression_loss: 1.0847 - classification_loss: 0.1935 50/500 [==>...........................] - ETA: 1:53 - loss: 1.2777 - regression_loss: 1.0842 - classification_loss: 0.1935 51/500 [==>...........................] - ETA: 1:53 - loss: 1.2818 - regression_loss: 1.0871 - classification_loss: 0.1947 52/500 [==>...........................] - ETA: 1:53 - loss: 1.2821 - regression_loss: 1.0874 - classification_loss: 0.1947 53/500 [==>...........................] - ETA: 1:52 - loss: 1.2849 - regression_loss: 1.0893 - classification_loss: 0.1956 54/500 [==>...........................] - ETA: 1:52 - loss: 1.2728 - regression_loss: 1.0795 - classification_loss: 0.1933 55/500 [==>...........................] - ETA: 1:52 - loss: 1.2858 - regression_loss: 1.0908 - classification_loss: 0.1950 56/500 [==>...........................] - ETA: 1:51 - loss: 1.2696 - regression_loss: 1.0774 - classification_loss: 0.1922 57/500 [==>...........................] - ETA: 1:51 - loss: 1.2742 - regression_loss: 1.0820 - classification_loss: 0.1922 58/500 [==>...........................] - ETA: 1:51 - loss: 1.2780 - regression_loss: 1.0868 - classification_loss: 0.1912 59/500 [==>...........................] - ETA: 1:51 - loss: 1.2878 - regression_loss: 1.0945 - classification_loss: 0.1932 60/500 [==>...........................] - ETA: 1:50 - loss: 1.2858 - regression_loss: 1.0926 - classification_loss: 0.1933 61/500 [==>...........................] - ETA: 1:50 - loss: 1.2727 - regression_loss: 1.0818 - classification_loss: 0.1909 62/500 [==>...........................] 
- ETA: 1:50 - loss: 1.2809 - regression_loss: 1.0881 - classification_loss: 0.1928 63/500 [==>...........................] - ETA: 1:50 - loss: 1.2826 - regression_loss: 1.0891 - classification_loss: 0.1935 64/500 [==>...........................] - ETA: 1:49 - loss: 1.2706 - regression_loss: 1.0788 - classification_loss: 0.1918 65/500 [==>...........................] - ETA: 1:49 - loss: 1.2583 - regression_loss: 1.0689 - classification_loss: 0.1894 66/500 [==>...........................] - ETA: 1:49 - loss: 1.2503 - regression_loss: 1.0625 - classification_loss: 0.1878 67/500 [===>..........................] - ETA: 1:49 - loss: 1.2533 - regression_loss: 1.0648 - classification_loss: 0.1886 68/500 [===>..........................] - ETA: 1:48 - loss: 1.2541 - regression_loss: 1.0651 - classification_loss: 0.1890 69/500 [===>..........................] - ETA: 1:48 - loss: 1.2558 - regression_loss: 1.0653 - classification_loss: 0.1905 70/500 [===>..........................] - ETA: 1:48 - loss: 1.2527 - regression_loss: 1.0628 - classification_loss: 0.1899 71/500 [===>..........................] - ETA: 1:48 - loss: 1.2567 - regression_loss: 1.0662 - classification_loss: 0.1905 72/500 [===>..........................] - ETA: 1:48 - loss: 1.2572 - regression_loss: 1.0667 - classification_loss: 0.1905 73/500 [===>..........................] - ETA: 1:47 - loss: 1.2621 - regression_loss: 1.0709 - classification_loss: 0.1912 74/500 [===>..........................] - ETA: 1:47 - loss: 1.2706 - regression_loss: 1.0777 - classification_loss: 0.1928 75/500 [===>..........................] - ETA: 1:47 - loss: 1.2591 - regression_loss: 1.0684 - classification_loss: 0.1906 76/500 [===>..........................] - ETA: 1:47 - loss: 1.2564 - regression_loss: 1.0665 - classification_loss: 0.1899 77/500 [===>..........................] - ETA: 1:46 - loss: 1.2601 - regression_loss: 1.0697 - classification_loss: 0.1904 78/500 [===>..........................] 
- ETA: 1:46 - loss: 1.2636 - regression_loss: 1.0729 - classification_loss: 0.1907 79/500 [===>..........................] - ETA: 1:46 - loss: 1.2651 - regression_loss: 1.0743 - classification_loss: 0.1908 80/500 [===>..........................] - ETA: 1:46 - loss: 1.2705 - regression_loss: 1.0792 - classification_loss: 0.1914 81/500 [===>..........................] - ETA: 1:45 - loss: 1.2721 - regression_loss: 1.0804 - classification_loss: 0.1917 82/500 [===>..........................] - ETA: 1:45 - loss: 1.2776 - regression_loss: 1.0844 - classification_loss: 0.1932 83/500 [===>..........................] - ETA: 1:45 - loss: 1.2740 - regression_loss: 1.0807 - classification_loss: 0.1933 84/500 [====>.........................] - ETA: 1:45 - loss: 1.2750 - regression_loss: 1.0823 - classification_loss: 0.1927 85/500 [====>.........................] - ETA: 1:44 - loss: 1.2708 - regression_loss: 1.0790 - classification_loss: 0.1918 86/500 [====>.........................] - ETA: 1:44 - loss: 1.2675 - regression_loss: 1.0761 - classification_loss: 0.1914 87/500 [====>.........................] - ETA: 1:44 - loss: 1.2630 - regression_loss: 1.0725 - classification_loss: 0.1905 88/500 [====>.........................] - ETA: 1:44 - loss: 1.2555 - regression_loss: 1.0667 - classification_loss: 0.1888 89/500 [====>.........................] - ETA: 1:44 - loss: 1.2577 - regression_loss: 1.0682 - classification_loss: 0.1896 90/500 [====>.........................] - ETA: 1:43 - loss: 1.2640 - regression_loss: 1.0729 - classification_loss: 0.1911 91/500 [====>.........................] - ETA: 1:43 - loss: 1.2672 - regression_loss: 1.0753 - classification_loss: 0.1919 92/500 [====>.........................] - ETA: 1:43 - loss: 1.2662 - regression_loss: 1.0741 - classification_loss: 0.1921 93/500 [====>.........................] - ETA: 1:43 - loss: 1.2651 - regression_loss: 1.0739 - classification_loss: 0.1912 94/500 [====>.........................] 
- ETA: 1:42 - loss: 1.2717 - regression_loss: 1.0791 - classification_loss: 0.1926 95/500 [====>.........................] - ETA: 1:42 - loss: 1.2723 - regression_loss: 1.0791 - classification_loss: 0.1931 96/500 [====>.........................] - ETA: 1:42 - loss: 1.2744 - regression_loss: 1.0812 - classification_loss: 0.1932 97/500 [====>.........................] - ETA: 1:41 - loss: 1.2685 - regression_loss: 1.0766 - classification_loss: 0.1919 98/500 [====>.........................] - ETA: 1:41 - loss: 1.2669 - regression_loss: 1.0754 - classification_loss: 0.1914 99/500 [====>.........................] - ETA: 1:41 - loss: 1.2662 - regression_loss: 1.0748 - classification_loss: 0.1914 100/500 [=====>........................] - ETA: 1:41 - loss: 1.2611 - regression_loss: 1.0707 - classification_loss: 0.1904 101/500 [=====>........................] - ETA: 1:40 - loss: 1.2606 - regression_loss: 1.0703 - classification_loss: 0.1903 102/500 [=====>........................] - ETA: 1:40 - loss: 1.2624 - regression_loss: 1.0717 - classification_loss: 0.1907 103/500 [=====>........................] - ETA: 1:39 - loss: 1.2540 - regression_loss: 1.0645 - classification_loss: 0.1895 104/500 [=====>........................] - ETA: 1:39 - loss: 1.2569 - regression_loss: 1.0670 - classification_loss: 0.1899 105/500 [=====>........................] - ETA: 1:39 - loss: 1.2607 - regression_loss: 1.0702 - classification_loss: 0.1905 106/500 [=====>........................] - ETA: 1:39 - loss: 1.2571 - regression_loss: 1.0673 - classification_loss: 0.1898 107/500 [=====>........................] - ETA: 1:38 - loss: 1.2517 - regression_loss: 1.0627 - classification_loss: 0.1890 108/500 [=====>........................] - ETA: 1:38 - loss: 1.2544 - regression_loss: 1.0646 - classification_loss: 0.1898 109/500 [=====>........................] - ETA: 1:38 - loss: 1.2587 - regression_loss: 1.0683 - classification_loss: 0.1905 110/500 [=====>........................] 
- ETA: 1:38 - loss: 1.2552 - regression_loss: 1.0654 - classification_loss: 0.1897 111/500 [=====>........................] - ETA: 1:37 - loss: 1.2554 - regression_loss: 1.0659 - classification_loss: 0.1895 112/500 [=====>........................] - ETA: 1:37 - loss: 1.2568 - regression_loss: 1.0662 - classification_loss: 0.1906 113/500 [=====>........................] - ETA: 1:37 - loss: 1.2601 - regression_loss: 1.0687 - classification_loss: 0.1914 114/500 [=====>........................] - ETA: 1:37 - loss: 1.2582 - regression_loss: 1.0677 - classification_loss: 0.1906 115/500 [=====>........................] - ETA: 1:36 - loss: 1.2608 - regression_loss: 1.0700 - classification_loss: 0.1907 116/500 [=====>........................] - ETA: 1:36 - loss: 1.2609 - regression_loss: 1.0708 - classification_loss: 0.1901 117/500 [======>.......................] - ETA: 1:36 - loss: 1.2617 - regression_loss: 1.0718 - classification_loss: 0.1899 118/500 [======>.......................] - ETA: 1:36 - loss: 1.2604 - regression_loss: 1.0709 - classification_loss: 0.1895 119/500 [======>.......................] - ETA: 1:35 - loss: 1.2626 - regression_loss: 1.0727 - classification_loss: 0.1899 120/500 [======>.......................] - ETA: 1:35 - loss: 1.2649 - regression_loss: 1.0744 - classification_loss: 0.1904 121/500 [======>.......................] - ETA: 1:35 - loss: 1.2778 - regression_loss: 1.0849 - classification_loss: 0.1929 122/500 [======>.......................] - ETA: 1:35 - loss: 1.2792 - regression_loss: 1.0861 - classification_loss: 0.1931 123/500 [======>.......................] - ETA: 1:34 - loss: 1.2806 - regression_loss: 1.0875 - classification_loss: 0.1931 124/500 [======>.......................] - ETA: 1:34 - loss: 1.2802 - regression_loss: 1.0871 - classification_loss: 0.1931 125/500 [======>.......................] - ETA: 1:34 - loss: 1.2760 - regression_loss: 1.0837 - classification_loss: 0.1922 126/500 [======>.......................] 
- ETA: 1:34 - loss: 1.2705 - regression_loss: 1.0793 - classification_loss: 0.1912 127/500 [======>.......................] - ETA: 1:33 - loss: 1.2665 - regression_loss: 1.0762 - classification_loss: 0.1903 128/500 [======>.......................] - ETA: 1:33 - loss: 1.2665 - regression_loss: 1.0763 - classification_loss: 0.1903 129/500 [======>.......................] - ETA: 1:33 - loss: 1.2638 - regression_loss: 1.0737 - classification_loss: 0.1901 130/500 [======>.......................] - ETA: 1:33 - loss: 1.2559 - regression_loss: 1.0671 - classification_loss: 0.1888 131/500 [======>.......................] - ETA: 1:32 - loss: 1.2570 - regression_loss: 1.0681 - classification_loss: 0.1889 132/500 [======>.......................] - ETA: 1:32 - loss: 1.2576 - regression_loss: 1.0686 - classification_loss: 0.1890 133/500 [======>.......................] - ETA: 1:32 - loss: 1.2556 - regression_loss: 1.0672 - classification_loss: 0.1884 134/500 [=======>......................] - ETA: 1:32 - loss: 1.2549 - regression_loss: 1.0664 - classification_loss: 0.1885 135/500 [=======>......................] - ETA: 1:31 - loss: 1.2591 - regression_loss: 1.0699 - classification_loss: 0.1892 136/500 [=======>......................] - ETA: 1:31 - loss: 1.2604 - regression_loss: 1.0712 - classification_loss: 0.1892 137/500 [=======>......................] - ETA: 1:31 - loss: 1.2569 - regression_loss: 1.0685 - classification_loss: 0.1884 138/500 [=======>......................] - ETA: 1:31 - loss: 1.2582 - regression_loss: 1.0697 - classification_loss: 0.1885 139/500 [=======>......................] - ETA: 1:30 - loss: 1.2564 - regression_loss: 1.0684 - classification_loss: 0.1880 140/500 [=======>......................] - ETA: 1:30 - loss: 1.2589 - regression_loss: 1.0706 - classification_loss: 0.1883 141/500 [=======>......................] - ETA: 1:30 - loss: 1.2620 - regression_loss: 1.0733 - classification_loss: 0.1887 142/500 [=======>......................] 
- ETA: 1:30 - loss: 1.2583 - regression_loss: 1.0703 - classification_loss: 0.1880 143/500 [=======>......................] - ETA: 1:29 - loss: 1.2610 - regression_loss: 1.0729 - classification_loss: 0.1881 144/500 [=======>......................] - ETA: 1:29 - loss: 1.2607 - regression_loss: 1.0728 - classification_loss: 0.1879 145/500 [=======>......................] - ETA: 1:29 - loss: 1.2590 - regression_loss: 1.0718 - classification_loss: 0.1872 146/500 [=======>......................] - ETA: 1:29 - loss: 1.2578 - regression_loss: 1.0711 - classification_loss: 0.1867 147/500 [=======>......................] - ETA: 1:28 - loss: 1.2583 - regression_loss: 1.0715 - classification_loss: 0.1868 148/500 [=======>......................] - ETA: 1:28 - loss: 1.2629 - regression_loss: 1.0749 - classification_loss: 0.1879 149/500 [=======>......................] - ETA: 1:28 - loss: 1.2624 - regression_loss: 1.0743 - classification_loss: 0.1881 150/500 [========>.....................] - ETA: 1:28 - loss: 1.2595 - regression_loss: 1.0721 - classification_loss: 0.1874 151/500 [========>.....................] - ETA: 1:27 - loss: 1.2615 - regression_loss: 1.0738 - classification_loss: 0.1877 152/500 [========>.....................] - ETA: 1:27 - loss: 1.2656 - regression_loss: 1.0771 - classification_loss: 0.1885 153/500 [========>.....................] - ETA: 1:27 - loss: 1.2626 - regression_loss: 1.0748 - classification_loss: 0.1879 154/500 [========>.....................] - ETA: 1:27 - loss: 1.2596 - regression_loss: 1.0725 - classification_loss: 0.1871 155/500 [========>.....................] - ETA: 1:26 - loss: 1.2598 - regression_loss: 1.0724 - classification_loss: 0.1874 156/500 [========>.....................] - ETA: 1:26 - loss: 1.2596 - regression_loss: 1.0721 - classification_loss: 0.1874 157/500 [========>.....................] - ETA: 1:26 - loss: 1.2628 - regression_loss: 1.0746 - classification_loss: 0.1882 158/500 [========>.....................] 
- ETA: 1:26 - loss: 1.2653 - regression_loss: 1.0766 - classification_loss: 0.1887 159/500 [========>.....................] - ETA: 1:25 - loss: 1.2675 - regression_loss: 1.0783 - classification_loss: 0.1892 160/500 [========>.....................] - ETA: 1:25 - loss: 1.2681 - regression_loss: 1.0786 - classification_loss: 0.1895 161/500 [========>.....................] - ETA: 1:25 - loss: 1.2644 - regression_loss: 1.0755 - classification_loss: 0.1889 162/500 [========>.....................] - ETA: 1:25 - loss: 1.2620 - regression_loss: 1.0734 - classification_loss: 0.1886 163/500 [========>.....................] - ETA: 1:24 - loss: 1.2610 - regression_loss: 1.0728 - classification_loss: 0.1882 164/500 [========>.....................] - ETA: 1:24 - loss: 1.2603 - regression_loss: 1.0720 - classification_loss: 0.1883 165/500 [========>.....................] - ETA: 1:24 - loss: 1.2583 - regression_loss: 1.0698 - classification_loss: 0.1885 166/500 [========>.....................] - ETA: 1:24 - loss: 1.2590 - regression_loss: 1.0703 - classification_loss: 0.1887 167/500 [=========>....................] - ETA: 1:23 - loss: 1.2562 - regression_loss: 1.0682 - classification_loss: 0.1880 168/500 [=========>....................] - ETA: 1:23 - loss: 1.2543 - regression_loss: 1.0669 - classification_loss: 0.1875 169/500 [=========>....................] - ETA: 1:23 - loss: 1.2525 - regression_loss: 1.0655 - classification_loss: 0.1870 170/500 [=========>....................] - ETA: 1:23 - loss: 1.2554 - regression_loss: 1.0681 - classification_loss: 0.1873 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2523 - regression_loss: 1.0656 - classification_loss: 0.1867 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2518 - regression_loss: 1.0652 - classification_loss: 0.1866 173/500 [=========>....................] - ETA: 1:22 - loss: 1.2540 - regression_loss: 1.0670 - classification_loss: 0.1870 174/500 [=========>....................] 
- ETA: 1:22 - loss: 1.2540 - regression_loss: 1.0670 - classification_loss: 0.1870 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2571 - regression_loss: 1.0695 - classification_loss: 0.1876 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2607 - regression_loss: 1.0722 - classification_loss: 0.1885 177/500 [=========>....................] - ETA: 1:21 - loss: 1.2639 - regression_loss: 1.0749 - classification_loss: 0.1890 178/500 [=========>....................] - ETA: 1:21 - loss: 1.2655 - regression_loss: 1.0764 - classification_loss: 0.1892 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2658 - regression_loss: 1.0763 - classification_loss: 0.1895 180/500 [=========>....................] - ETA: 1:20 - loss: 1.2607 - regression_loss: 1.0718 - classification_loss: 0.1890 181/500 [=========>....................] - ETA: 1:20 - loss: 1.2605 - regression_loss: 1.0717 - classification_loss: 0.1888 182/500 [=========>....................] - ETA: 1:20 - loss: 1.2578 - regression_loss: 1.0693 - classification_loss: 0.1885 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2576 - regression_loss: 1.0692 - classification_loss: 0.1884 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2571 - regression_loss: 1.0687 - classification_loss: 0.1884 185/500 [==========>...................] - ETA: 1:19 - loss: 1.2582 - regression_loss: 1.0698 - classification_loss: 0.1884 186/500 [==========>...................] - ETA: 1:19 - loss: 1.2591 - regression_loss: 1.0701 - classification_loss: 0.1890 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2615 - regression_loss: 1.0719 - classification_loss: 0.1896 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2642 - regression_loss: 1.0743 - classification_loss: 0.1900 189/500 [==========>...................] - ETA: 1:18 - loss: 1.2629 - regression_loss: 1.0733 - classification_loss: 0.1896 190/500 [==========>...................] 
- ETA: 1:18 - loss: 1.2628 - regression_loss: 1.0734 - classification_loss: 0.1894 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2665 - regression_loss: 1.0765 - classification_loss: 0.1900 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2671 - regression_loss: 1.0771 - classification_loss: 0.1900 193/500 [==========>...................] - ETA: 1:17 - loss: 1.2631 - regression_loss: 1.0736 - classification_loss: 0.1895 194/500 [==========>...................] - ETA: 1:17 - loss: 1.2596 - regression_loss: 1.0708 - classification_loss: 0.1889 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2597 - regression_loss: 1.0711 - classification_loss: 0.1886 196/500 [==========>...................] - ETA: 1:16 - loss: 1.2609 - regression_loss: 1.0718 - classification_loss: 0.1891 197/500 [==========>...................] - ETA: 1:16 - loss: 1.2627 - regression_loss: 1.0734 - classification_loss: 0.1893 198/500 [==========>...................] - ETA: 1:16 - loss: 1.2638 - regression_loss: 1.0744 - classification_loss: 0.1894 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2650 - regression_loss: 1.0754 - classification_loss: 0.1895 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2658 - regression_loss: 1.0759 - classification_loss: 0.1899 201/500 [===========>..................] - ETA: 1:15 - loss: 1.2625 - regression_loss: 1.0734 - classification_loss: 0.1891 202/500 [===========>..................] - ETA: 1:15 - loss: 1.2618 - regression_loss: 1.0729 - classification_loss: 0.1889 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2610 - regression_loss: 1.0723 - classification_loss: 0.1886 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2610 - regression_loss: 1.0725 - classification_loss: 0.1885 205/500 [===========>..................] - ETA: 1:14 - loss: 1.2620 - regression_loss: 1.0732 - classification_loss: 0.1888 206/500 [===========>..................] 
- ETA: 1:14 - loss: 1.2643 - regression_loss: 1.0751 - classification_loss: 0.1893 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2601 - regression_loss: 1.0715 - classification_loss: 0.1886 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2610 - regression_loss: 1.0721 - classification_loss: 0.1889 209/500 [===========>..................] - ETA: 1:13 - loss: 1.2595 - regression_loss: 1.0704 - classification_loss: 0.1891 210/500 [===========>..................] - ETA: 1:13 - loss: 1.2596 - regression_loss: 1.0705 - classification_loss: 0.1891 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2610 - regression_loss: 1.0715 - classification_loss: 0.1895 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2620 - regression_loss: 1.0723 - classification_loss: 0.1897 213/500 [===========>..................] - ETA: 1:12 - loss: 1.2617 - regression_loss: 1.0721 - classification_loss: 0.1896 214/500 [===========>..................] - ETA: 1:12 - loss: 1.2590 - regression_loss: 1.0700 - classification_loss: 0.1890 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2602 - regression_loss: 1.0711 - classification_loss: 0.1891 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2621 - regression_loss: 1.0728 - classification_loss: 0.1893 217/500 [============>.................] - ETA: 1:11 - loss: 1.2579 - regression_loss: 1.0693 - classification_loss: 0.1886 218/500 [============>.................] - ETA: 1:11 - loss: 1.2556 - regression_loss: 1.0676 - classification_loss: 0.1880 219/500 [============>.................] - ETA: 1:10 - loss: 1.2539 - regression_loss: 1.0662 - classification_loss: 0.1877 220/500 [============>.................] - ETA: 1:10 - loss: 1.2559 - regression_loss: 1.0678 - classification_loss: 0.1881 221/500 [============>.................] - ETA: 1:10 - loss: 1.2570 - regression_loss: 1.0687 - classification_loss: 0.1882 222/500 [============>.................] 
- ETA: 1:10 - loss: 1.2553 - regression_loss: 1.0673 - classification_loss: 0.1879 223/500 [============>.................] - ETA: 1:09 - loss: 1.2568 - regression_loss: 1.0686 - classification_loss: 0.1882 224/500 [============>.................] - ETA: 1:09 - loss: 1.2579 - regression_loss: 1.0695 - classification_loss: 0.1883 225/500 [============>.................] - ETA: 1:09 - loss: 1.2561 - regression_loss: 1.0681 - classification_loss: 0.1881 226/500 [============>.................] - ETA: 1:09 - loss: 1.2587 - regression_loss: 1.0702 - classification_loss: 0.1886 227/500 [============>.................] - ETA: 1:08 - loss: 1.2569 - regression_loss: 1.0687 - classification_loss: 0.1883 228/500 [============>.................] - ETA: 1:08 - loss: 1.2558 - regression_loss: 1.0680 - classification_loss: 0.1879 229/500 [============>.................] - ETA: 1:08 - loss: 1.2565 - regression_loss: 1.0684 - classification_loss: 0.1881 230/500 [============>.................] - ETA: 1:08 - loss: 1.2573 - regression_loss: 1.0688 - classification_loss: 0.1885 231/500 [============>.................] - ETA: 1:07 - loss: 1.2605 - regression_loss: 1.0712 - classification_loss: 0.1893 232/500 [============>.................] - ETA: 1:07 - loss: 1.2629 - regression_loss: 1.0730 - classification_loss: 0.1900 233/500 [============>.................] - ETA: 1:07 - loss: 1.2628 - regression_loss: 1.0729 - classification_loss: 0.1899 234/500 [=============>................] - ETA: 1:07 - loss: 1.2641 - regression_loss: 1.0738 - classification_loss: 0.1904 235/500 [=============>................] - ETA: 1:06 - loss: 1.2641 - regression_loss: 1.0739 - classification_loss: 0.1902 236/500 [=============>................] - ETA: 1:06 - loss: 1.2661 - regression_loss: 1.0758 - classification_loss: 0.1903 237/500 [=============>................] - ETA: 1:06 - loss: 1.2668 - regression_loss: 1.0764 - classification_loss: 0.1904 238/500 [=============>................] 
- ETA: 1:06 - loss: 1.2657 - regression_loss: 1.0754 - classification_loss: 0.1903 239/500 [=============>................] - ETA: 1:05 - loss: 1.2632 - regression_loss: 1.0734 - classification_loss: 0.1898 240/500 [=============>................] - ETA: 1:05 - loss: 1.2641 - regression_loss: 1.0740 - classification_loss: 0.1901 241/500 [=============>................] - ETA: 1:05 - loss: 1.2656 - regression_loss: 1.0754 - classification_loss: 0.1903 242/500 [=============>................] - ETA: 1:05 - loss: 1.2650 - regression_loss: 1.0749 - classification_loss: 0.1900 243/500 [=============>................] - ETA: 1:04 - loss: 1.2659 - regression_loss: 1.0759 - classification_loss: 0.1901 244/500 [=============>................] - ETA: 1:04 - loss: 1.2627 - regression_loss: 1.0732 - classification_loss: 0.1895 245/500 [=============>................] - ETA: 1:04 - loss: 1.2628 - regression_loss: 1.0736 - classification_loss: 0.1892 246/500 [=============>................] - ETA: 1:04 - loss: 1.2628 - regression_loss: 1.0736 - classification_loss: 0.1891 247/500 [=============>................] - ETA: 1:03 - loss: 1.2646 - regression_loss: 1.0752 - classification_loss: 0.1894 248/500 [=============>................] - ETA: 1:03 - loss: 1.2618 - regression_loss: 1.0730 - classification_loss: 0.1888 249/500 [=============>................] - ETA: 1:03 - loss: 1.2633 - regression_loss: 1.0744 - classification_loss: 0.1889 250/500 [==============>...............] - ETA: 1:03 - loss: 1.2635 - regression_loss: 1.0744 - classification_loss: 0.1891 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2659 - regression_loss: 1.0766 - classification_loss: 0.1894 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2632 - regression_loss: 1.0743 - classification_loss: 0.1889 253/500 [==============>...............] - ETA: 1:02 - loss: 1.2625 - regression_loss: 1.0735 - classification_loss: 0.1889 254/500 [==============>...............] 
- ETA: 1:02 - loss: 1.2645 - regression_loss: 1.0750 - classification_loss: 0.1895 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2653 - regression_loss: 1.0756 - classification_loss: 0.1897 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2658 - regression_loss: 1.0760 - classification_loss: 0.1898 257/500 [==============>...............] - ETA: 1:01 - loss: 1.2654 - regression_loss: 1.0755 - classification_loss: 0.1899 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2648 - regression_loss: 1.0748 - classification_loss: 0.1900 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2666 - regression_loss: 1.0763 - classification_loss: 0.1903 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2680 - regression_loss: 1.0773 - classification_loss: 0.1907 261/500 [==============>...............] - ETA: 1:00 - loss: 1.2687 - regression_loss: 1.0781 - classification_loss: 0.1906 262/500 [==============>...............] - ETA: 59s - loss: 1.2697 - regression_loss: 1.0790 - classification_loss: 0.1908  263/500 [==============>...............] - ETA: 59s - loss: 1.2708 - regression_loss: 1.0798 - classification_loss: 0.1911 264/500 [==============>...............] - ETA: 59s - loss: 1.2684 - regression_loss: 1.0778 - classification_loss: 0.1906 265/500 [==============>...............] - ETA: 59s - loss: 1.2697 - regression_loss: 1.0792 - classification_loss: 0.1905 266/500 [==============>...............] - ETA: 58s - loss: 1.2670 - regression_loss: 1.0771 - classification_loss: 0.1899 267/500 [===============>..............] - ETA: 58s - loss: 1.2659 - regression_loss: 1.0763 - classification_loss: 0.1896 268/500 [===============>..............] - ETA: 58s - loss: 1.2653 - regression_loss: 1.0759 - classification_loss: 0.1894 269/500 [===============>..............] - ETA: 58s - loss: 1.2637 - regression_loss: 1.0745 - classification_loss: 0.1892 270/500 [===============>..............] 
- ETA: 57s - loss: 1.2614 - regression_loss: 1.0725 - classification_loss: 0.1888 271/500 [===============>..............] - ETA: 57s - loss: 1.2618 - regression_loss: 1.0729 - classification_loss: 0.1889 272/500 [===============>..............] - ETA: 57s - loss: 1.2633 - regression_loss: 1.0744 - classification_loss: 0.1889 273/500 [===============>..............] - ETA: 57s - loss: 1.2656 - regression_loss: 1.0762 - classification_loss: 0.1894 274/500 [===============>..............] - ETA: 56s - loss: 1.2662 - regression_loss: 1.0767 - classification_loss: 0.1895 275/500 [===============>..............] - ETA: 56s - loss: 1.2671 - regression_loss: 1.0774 - classification_loss: 0.1897 276/500 [===============>..............] - ETA: 56s - loss: 1.2690 - regression_loss: 1.0789 - classification_loss: 0.1901 277/500 [===============>..............] - ETA: 56s - loss: 1.2666 - regression_loss: 1.0769 - classification_loss: 0.1896 278/500 [===============>..............] - ETA: 55s - loss: 1.2679 - regression_loss: 1.0781 - classification_loss: 0.1898 279/500 [===============>..............] - ETA: 55s - loss: 1.2682 - regression_loss: 1.0783 - classification_loss: 0.1899 280/500 [===============>..............] - ETA: 55s - loss: 1.2675 - regression_loss: 1.0779 - classification_loss: 0.1896 281/500 [===============>..............] - ETA: 55s - loss: 1.2682 - regression_loss: 1.0783 - classification_loss: 0.1899 282/500 [===============>..............] - ETA: 54s - loss: 1.2676 - regression_loss: 1.0775 - classification_loss: 0.1901 283/500 [===============>..............] - ETA: 54s - loss: 1.2649 - regression_loss: 1.0753 - classification_loss: 0.1896 284/500 [================>.............] - ETA: 54s - loss: 1.2642 - regression_loss: 1.0748 - classification_loss: 0.1895 285/500 [================>.............] - ETA: 54s - loss: 1.2617 - regression_loss: 1.0727 - classification_loss: 0.1890 286/500 [================>.............] 
- ETA: 53s - loss: 1.2622 - regression_loss: 1.0732 - classification_loss: 0.1890 287/500 [================>.............] - ETA: 53s - loss: 1.2626 - regression_loss: 1.0737 - classification_loss: 0.1890 288/500 [================>.............] - ETA: 53s - loss: 1.2621 - regression_loss: 1.0733 - classification_loss: 0.1888 289/500 [================>.............] - ETA: 53s - loss: 1.2611 - regression_loss: 1.0723 - classification_loss: 0.1888 290/500 [================>.............] - ETA: 52s - loss: 1.2621 - regression_loss: 1.0732 - classification_loss: 0.1889 291/500 [================>.............] - ETA: 52s - loss: 1.2621 - regression_loss: 1.0732 - classification_loss: 0.1889 292/500 [================>.............] - ETA: 52s - loss: 1.2615 - regression_loss: 1.0727 - classification_loss: 0.1888 293/500 [================>.............] - ETA: 52s - loss: 1.2631 - regression_loss: 1.0740 - classification_loss: 0.1891 294/500 [================>.............] - ETA: 51s - loss: 1.2634 - regression_loss: 1.0744 - classification_loss: 0.1890 295/500 [================>.............] - ETA: 51s - loss: 1.2633 - regression_loss: 1.0739 - classification_loss: 0.1894 296/500 [================>.............] - ETA: 51s - loss: 1.2657 - regression_loss: 1.0763 - classification_loss: 0.1894 297/500 [================>.............] - ETA: 51s - loss: 1.2664 - regression_loss: 1.0768 - classification_loss: 0.1895 298/500 [================>.............] - ETA: 50s - loss: 1.2672 - regression_loss: 1.0775 - classification_loss: 0.1897 299/500 [================>.............] - ETA: 50s - loss: 1.2685 - regression_loss: 1.0783 - classification_loss: 0.1902 300/500 [=================>............] - ETA: 50s - loss: 1.2660 - regression_loss: 1.0763 - classification_loss: 0.1897 301/500 [=================>............] - ETA: 50s - loss: 1.2669 - regression_loss: 1.0770 - classification_loss: 0.1898 302/500 [=================>............] 
- ETA: 49s - loss: 1.2669 - regression_loss: 1.0769 - classification_loss: 0.1900 303/500 [=================>............] - ETA: 49s - loss: 1.2684 - regression_loss: 1.0783 - classification_loss: 0.1901 304/500 [=================>............] - ETA: 49s - loss: 1.2684 - regression_loss: 1.0783 - classification_loss: 0.1901 305/500 [=================>............] - ETA: 49s - loss: 1.2696 - regression_loss: 1.0796 - classification_loss: 0.1900 306/500 [=================>............] - ETA: 48s - loss: 1.2716 - regression_loss: 1.0810 - classification_loss: 0.1906 307/500 [=================>............] - ETA: 48s - loss: 1.2728 - regression_loss: 1.0821 - classification_loss: 0.1907 308/500 [=================>............] - ETA: 48s - loss: 1.2715 - regression_loss: 1.0811 - classification_loss: 0.1905 309/500 [=================>............] - ETA: 48s - loss: 1.2736 - regression_loss: 1.0827 - classification_loss: 0.1910 310/500 [=================>............] - ETA: 47s - loss: 1.2730 - regression_loss: 1.0823 - classification_loss: 0.1907 311/500 [=================>............] - ETA: 47s - loss: 1.2730 - regression_loss: 1.0823 - classification_loss: 0.1907 312/500 [=================>............] - ETA: 47s - loss: 1.2737 - regression_loss: 1.0820 - classification_loss: 0.1917 313/500 [=================>............] - ETA: 47s - loss: 1.2722 - regression_loss: 1.0809 - classification_loss: 0.1913 314/500 [=================>............] - ETA: 46s - loss: 1.2727 - regression_loss: 1.0814 - classification_loss: 0.1913 315/500 [=================>............] - ETA: 46s - loss: 1.2739 - regression_loss: 1.0822 - classification_loss: 0.1917 316/500 [=================>............] - ETA: 46s - loss: 1.2737 - regression_loss: 1.0821 - classification_loss: 0.1916 317/500 [==================>...........] - ETA: 46s - loss: 1.2753 - regression_loss: 1.0834 - classification_loss: 0.1919 318/500 [==================>...........] 
- ETA: 45s - loss: 1.2735 - regression_loss: 1.0820 - classification_loss: 0.1914 319/500 [==================>...........] - ETA: 45s - loss: 1.2734 - regression_loss: 1.0820 - classification_loss: 0.1913 320/500 [==================>...........] - ETA: 45s - loss: 1.2731 - regression_loss: 1.0818 - classification_loss: 0.1913 321/500 [==================>...........] - ETA: 45s - loss: 1.2725 - regression_loss: 1.0814 - classification_loss: 0.1911 322/500 [==================>...........] - ETA: 44s - loss: 1.2742 - regression_loss: 1.0826 - classification_loss: 0.1916 323/500 [==================>...........] - ETA: 44s - loss: 1.2720 - regression_loss: 1.0809 - classification_loss: 0.1911 324/500 [==================>...........] - ETA: 44s - loss: 1.2731 - regression_loss: 1.0816 - classification_loss: 0.1914 325/500 [==================>...........] - ETA: 44s - loss: 1.2735 - regression_loss: 1.0820 - classification_loss: 0.1915 326/500 [==================>...........] - ETA: 43s - loss: 1.2722 - regression_loss: 1.0809 - classification_loss: 0.1913 327/500 [==================>...........] - ETA: 43s - loss: 1.2737 - regression_loss: 1.0821 - classification_loss: 0.1915 328/500 [==================>...........] - ETA: 43s - loss: 1.2736 - regression_loss: 1.0822 - classification_loss: 0.1914 329/500 [==================>...........] - ETA: 43s - loss: 1.2728 - regression_loss: 1.0815 - classification_loss: 0.1912 330/500 [==================>...........] - ETA: 42s - loss: 1.2725 - regression_loss: 1.0813 - classification_loss: 0.1912 331/500 [==================>...........] - ETA: 42s - loss: 1.2714 - regression_loss: 1.0805 - classification_loss: 0.1909 332/500 [==================>...........] - ETA: 42s - loss: 1.2720 - regression_loss: 1.0809 - classification_loss: 0.1911 333/500 [==================>...........] - ETA: 42s - loss: 1.2708 - regression_loss: 1.0799 - classification_loss: 0.1909 334/500 [===================>..........] 
- ETA: 41s - loss: 1.2715 - regression_loss: 1.0807 - classification_loss: 0.1908 335/500 [===================>..........] - ETA: 41s - loss: 1.2732 - regression_loss: 1.0820 - classification_loss: 0.1912 336/500 [===================>..........] - ETA: 41s - loss: 1.2729 - regression_loss: 1.0818 - classification_loss: 0.1911 337/500 [===================>..........] - ETA: 41s - loss: 1.2746 - regression_loss: 1.0831 - classification_loss: 0.1915 338/500 [===================>..........] - ETA: 40s - loss: 1.2769 - regression_loss: 1.0854 - classification_loss: 0.1915 339/500 [===================>..........] - ETA: 40s - loss: 1.2780 - regression_loss: 1.0862 - classification_loss: 0.1918 340/500 [===================>..........] - ETA: 40s - loss: 1.2785 - regression_loss: 1.0865 - classification_loss: 0.1920 341/500 [===================>..........] - ETA: 40s - loss: 1.2770 - regression_loss: 1.0853 - classification_loss: 0.1917 342/500 [===================>..........] - ETA: 39s - loss: 1.2752 - regression_loss: 1.0839 - classification_loss: 0.1913 343/500 [===================>..........] - ETA: 39s - loss: 1.2731 - regression_loss: 1.0819 - classification_loss: 0.1912 344/500 [===================>..........] - ETA: 39s - loss: 1.2742 - regression_loss: 1.0828 - classification_loss: 0.1914 345/500 [===================>..........] - ETA: 39s - loss: 1.2744 - regression_loss: 1.0830 - classification_loss: 0.1914 346/500 [===================>..........] - ETA: 38s - loss: 1.2751 - regression_loss: 1.0836 - classification_loss: 0.1914 347/500 [===================>..........] - ETA: 38s - loss: 1.2753 - regression_loss: 1.0835 - classification_loss: 0.1919 348/500 [===================>..........] - ETA: 38s - loss: 1.2742 - regression_loss: 1.0825 - classification_loss: 0.1916 349/500 [===================>..........] - ETA: 38s - loss: 1.2725 - regression_loss: 1.0813 - classification_loss: 0.1912 350/500 [====================>.........] 
- ETA: 37s - loss: 1.2705 - regression_loss: 1.0797 - classification_loss: 0.1909 351/500 [====================>.........] - ETA: 37s - loss: 1.2713 - regression_loss: 1.0804 - classification_loss: 0.1909 352/500 [====================>.........] - ETA: 37s - loss: 1.2725 - regression_loss: 1.0814 - classification_loss: 0.1911 353/500 [====================>.........] - ETA: 37s - loss: 1.2723 - regression_loss: 1.0812 - classification_loss: 0.1911 354/500 [====================>.........] - ETA: 36s - loss: 1.2747 - regression_loss: 1.0831 - classification_loss: 0.1916 355/500 [====================>.........] - ETA: 36s - loss: 1.2725 - regression_loss: 1.0813 - classification_loss: 0.1911 356/500 [====================>.........] - ETA: 36s - loss: 1.2705 - regression_loss: 1.0798 - classification_loss: 0.1908 357/500 [====================>.........] - ETA: 36s - loss: 1.2701 - regression_loss: 1.0794 - classification_loss: 0.1907 358/500 [====================>.........] - ETA: 35s - loss: 1.2683 - regression_loss: 1.0779 - classification_loss: 0.1904 359/500 [====================>.........] - ETA: 35s - loss: 1.2670 - regression_loss: 1.0768 - classification_loss: 0.1902 360/500 [====================>.........] - ETA: 35s - loss: 1.2654 - regression_loss: 1.0754 - classification_loss: 0.1900 361/500 [====================>.........] - ETA: 35s - loss: 1.2656 - regression_loss: 1.0756 - classification_loss: 0.1900 362/500 [====================>.........] - ETA: 34s - loss: 1.2636 - regression_loss: 1.0740 - classification_loss: 0.1896 363/500 [====================>.........] - ETA: 34s - loss: 1.2641 - regression_loss: 1.0744 - classification_loss: 0.1897 364/500 [====================>.........] - ETA: 34s - loss: 1.2652 - regression_loss: 1.0754 - classification_loss: 0.1898 365/500 [====================>.........] - ETA: 34s - loss: 1.2631 - regression_loss: 1.0737 - classification_loss: 0.1894 366/500 [====================>.........] 
- ETA: 33s - loss: 1.2638 - regression_loss: 1.0742 - classification_loss: 0.1895 367/500 [=====================>........] - ETA: 33s - loss: 1.2662 - regression_loss: 1.0763 - classification_loss: 0.1899 368/500 [=====================>........] - ETA: 33s - loss: 1.2652 - regression_loss: 1.0756 - classification_loss: 0.1896 369/500 [=====================>........] - ETA: 33s - loss: 1.2664 - regression_loss: 1.0764 - classification_loss: 0.1900 370/500 [=====================>........] - ETA: 32s - loss: 1.2665 - regression_loss: 1.0766 - classification_loss: 0.1899 371/500 [=====================>........] - ETA: 32s - loss: 1.2677 - regression_loss: 1.0776 - classification_loss: 0.1902 372/500 [=====================>........] - ETA: 32s - loss: 1.2667 - regression_loss: 1.0768 - classification_loss: 0.1900 373/500 [=====================>........] - ETA: 31s - loss: 1.2676 - regression_loss: 1.0774 - classification_loss: 0.1902 374/500 [=====================>........] - ETA: 31s - loss: 1.2673 - regression_loss: 1.0772 - classification_loss: 0.1901 375/500 [=====================>........] - ETA: 31s - loss: 1.2663 - regression_loss: 1.0763 - classification_loss: 0.1900 376/500 [=====================>........] - ETA: 31s - loss: 1.2667 - regression_loss: 1.0767 - classification_loss: 0.1900 377/500 [=====================>........] - ETA: 30s - loss: 1.2644 - regression_loss: 1.0747 - classification_loss: 0.1898 378/500 [=====================>........] - ETA: 30s - loss: 1.2639 - regression_loss: 1.0742 - classification_loss: 0.1897 379/500 [=====================>........] - ETA: 30s - loss: 1.2655 - regression_loss: 1.0755 - classification_loss: 0.1900 380/500 [=====================>........] - ETA: 30s - loss: 1.2680 - regression_loss: 1.0775 - classification_loss: 0.1905 381/500 [=====================>........] - ETA: 29s - loss: 1.2685 - regression_loss: 1.0780 - classification_loss: 0.1905 382/500 [=====================>........] 
- ETA: 29s - loss: 1.2699 - regression_loss: 1.0792 - classification_loss: 0.1907 383/500 [=====================>........] - ETA: 29s - loss: 1.2699 - regression_loss: 1.0792 - classification_loss: 0.1907 384/500 [======================>.......] - ETA: 29s - loss: 1.2688 - regression_loss: 1.0780 - classification_loss: 0.1908 385/500 [======================>.......] - ETA: 28s - loss: 1.2700 - regression_loss: 1.0790 - classification_loss: 0.1910 386/500 [======================>.......] - ETA: 28s - loss: 1.2679 - regression_loss: 1.0773 - classification_loss: 0.1906 387/500 [======================>.......] - ETA: 28s - loss: 1.2681 - regression_loss: 1.0774 - classification_loss: 0.1907 388/500 [======================>.......] - ETA: 28s - loss: 1.2658 - regression_loss: 1.0755 - classification_loss: 0.1903 389/500 [======================>.......] - ETA: 27s - loss: 1.2648 - regression_loss: 1.0747 - classification_loss: 0.1901 390/500 [======================>.......] - ETA: 27s - loss: 1.2657 - regression_loss: 1.0753 - classification_loss: 0.1904 391/500 [======================>.......] - ETA: 27s - loss: 1.2637 - regression_loss: 1.0736 - classification_loss: 0.1901 392/500 [======================>.......] - ETA: 27s - loss: 1.2631 - regression_loss: 1.0730 - classification_loss: 0.1901 393/500 [======================>.......] - ETA: 26s - loss: 1.2633 - regression_loss: 1.0732 - classification_loss: 0.1901 394/500 [======================>.......] - ETA: 26s - loss: 1.2638 - regression_loss: 1.0736 - classification_loss: 0.1903 395/500 [======================>.......] - ETA: 26s - loss: 1.2644 - regression_loss: 1.0740 - classification_loss: 0.1904 396/500 [======================>.......] - ETA: 26s - loss: 1.2689 - regression_loss: 1.0758 - classification_loss: 0.1931 397/500 [======================>.......] - ETA: 25s - loss: 1.2702 - regression_loss: 1.0765 - classification_loss: 0.1937 398/500 [======================>.......] 
- ETA: 25s - loss: 1.2710 - regression_loss: 1.0772 - classification_loss: 0.1937 399/500 [======================>.......] - ETA: 25s - loss: 1.2716 - regression_loss: 1.0778 - classification_loss: 0.1938 400/500 [=======================>......] - ETA: 25s - loss: 1.2718 - regression_loss: 1.0779 - classification_loss: 0.1939 401/500 [=======================>......] - ETA: 24s - loss: 1.2721 - regression_loss: 1.0782 - classification_loss: 0.1939 402/500 [=======================>......] - ETA: 24s - loss: 1.2708 - regression_loss: 1.0770 - classification_loss: 0.1938 403/500 [=======================>......] - ETA: 24s - loss: 1.2703 - regression_loss: 1.0766 - classification_loss: 0.1938 404/500 [=======================>......] - ETA: 24s - loss: 1.2710 - regression_loss: 1.0771 - classification_loss: 0.1939 405/500 [=======================>......] - ETA: 23s - loss: 1.2722 - regression_loss: 1.0780 - classification_loss: 0.1941 406/500 [=======================>......] - ETA: 23s - loss: 1.2722 - regression_loss: 1.0782 - classification_loss: 0.1941 407/500 [=======================>......] - ETA: 23s - loss: 1.2704 - regression_loss: 1.0767 - classification_loss: 0.1937 408/500 [=======================>......] - ETA: 23s - loss: 1.2682 - regression_loss: 1.0749 - classification_loss: 0.1933 409/500 [=======================>......] - ETA: 22s - loss: 1.2678 - regression_loss: 1.0746 - classification_loss: 0.1932 410/500 [=======================>......] - ETA: 22s - loss: 1.2696 - regression_loss: 1.0764 - classification_loss: 0.1932 411/500 [=======================>......] - ETA: 22s - loss: 1.2699 - regression_loss: 1.0767 - classification_loss: 0.1932 412/500 [=======================>......] - ETA: 22s - loss: 1.2721 - regression_loss: 1.0786 - classification_loss: 0.1935 413/500 [=======================>......] - ETA: 21s - loss: 1.2717 - regression_loss: 1.0782 - classification_loss: 0.1935 414/500 [=======================>......] 
- ETA: 21s - loss: 1.2712 - regression_loss: 1.0778 - classification_loss: 0.1934 415/500 [=======================>......] - ETA: 21s - loss: 1.2724 - regression_loss: 1.0788 - classification_loss: 0.1936 416/500 [=======================>......] - ETA: 21s - loss: 1.2728 - regression_loss: 1.0791 - classification_loss: 0.1937 417/500 [========================>.....] - ETA: 20s - loss: 1.2722 - regression_loss: 1.0787 - classification_loss: 0.1936 418/500 [========================>.....] - ETA: 20s - loss: 1.2723 - regression_loss: 1.0787 - classification_loss: 0.1936 419/500 [========================>.....] - ETA: 20s - loss: 1.2723 - regression_loss: 1.0781 - classification_loss: 0.1942 420/500 [========================>.....] - ETA: 20s - loss: 1.2722 - regression_loss: 1.0780 - classification_loss: 0.1942 421/500 [========================>.....] - ETA: 19s - loss: 1.2727 - regression_loss: 1.0784 - classification_loss: 0.1943 422/500 [========================>.....] - ETA: 19s - loss: 1.2713 - regression_loss: 1.0774 - classification_loss: 0.1940 423/500 [========================>.....] - ETA: 19s - loss: 1.2700 - regression_loss: 1.0764 - classification_loss: 0.1936 424/500 [========================>.....] - ETA: 19s - loss: 1.2711 - regression_loss: 1.0772 - classification_loss: 0.1938 425/500 [========================>.....] - ETA: 18s - loss: 1.2722 - regression_loss: 1.0782 - classification_loss: 0.1941 426/500 [========================>.....] - ETA: 18s - loss: 1.2729 - regression_loss: 1.0786 - classification_loss: 0.1943 427/500 [========================>.....] - ETA: 18s - loss: 1.2735 - regression_loss: 1.0792 - classification_loss: 0.1943 428/500 [========================>.....] - ETA: 18s - loss: 1.2736 - regression_loss: 1.0794 - classification_loss: 0.1942 429/500 [========================>.....] - ETA: 17s - loss: 1.2750 - regression_loss: 1.0806 - classification_loss: 0.1944 430/500 [========================>.....] 
- ETA: 17s - loss: 1.2747 - regression_loss: 1.0803 - classification_loss: 0.1944 431/500 [========================>.....] - ETA: 17s - loss: 1.2748 - regression_loss: 1.0804 - classification_loss: 0.1944 432/500 [========================>.....] - ETA: 17s - loss: 1.2749 - regression_loss: 1.0805 - classification_loss: 0.1944 433/500 [========================>.....] - ETA: 16s - loss: 1.2745 - regression_loss: 1.0801 - classification_loss: 0.1944 434/500 [=========================>....] - ETA: 16s - loss: 1.2738 - regression_loss: 1.0796 - classification_loss: 0.1942 435/500 [=========================>....] - ETA: 16s - loss: 1.2736 - regression_loss: 1.0795 - classification_loss: 0.1942 436/500 [=========================>....] - ETA: 16s - loss: 1.2742 - regression_loss: 1.0800 - classification_loss: 0.1942 437/500 [=========================>....] - ETA: 15s - loss: 1.2758 - regression_loss: 1.0810 - classification_loss: 0.1948 438/500 [=========================>....] - ETA: 15s - loss: 1.2762 - regression_loss: 1.0814 - classification_loss: 0.1948 439/500 [=========================>....] - ETA: 15s - loss: 1.2754 - regression_loss: 1.0808 - classification_loss: 0.1947 440/500 [=========================>....] - ETA: 15s - loss: 1.2736 - regression_loss: 1.0792 - classification_loss: 0.1943 441/500 [=========================>....] - ETA: 14s - loss: 1.2742 - regression_loss: 1.0796 - classification_loss: 0.1946 442/500 [=========================>....] - ETA: 14s - loss: 1.2737 - regression_loss: 1.0792 - classification_loss: 0.1945 443/500 [=========================>....] - ETA: 14s - loss: 1.2732 - regression_loss: 1.0788 - classification_loss: 0.1944 444/500 [=========================>....] - ETA: 14s - loss: 1.2725 - regression_loss: 1.0782 - classification_loss: 0.1943 445/500 [=========================>....] - ETA: 13s - loss: 1.2729 - regression_loss: 1.0785 - classification_loss: 0.1943 446/500 [=========================>....] 
- ETA: 13s - loss: 1.2727 - regression_loss: 1.0784 - classification_loss: 0.1943 447/500 [=========================>....] - ETA: 13s - loss: 1.2728 - regression_loss: 1.0785 - classification_loss: 0.1943 448/500 [=========================>....] - ETA: 13s - loss: 1.2723 - regression_loss: 1.0780 - classification_loss: 0.1943 449/500 [=========================>....] - ETA: 12s - loss: 1.2729 - regression_loss: 1.0786 - classification_loss: 0.1943 450/500 [==========================>...] - ETA: 12s - loss: 1.2726 - regression_loss: 1.0784 - classification_loss: 0.1942 451/500 [==========================>...] - ETA: 12s - loss: 1.2734 - regression_loss: 1.0791 - classification_loss: 0.1943 452/500 [==========================>...] - ETA: 12s - loss: 1.2736 - regression_loss: 1.0794 - classification_loss: 0.1943 453/500 [==========================>...] - ETA: 11s - loss: 1.2727 - regression_loss: 1.0785 - classification_loss: 0.1942 454/500 [==========================>...] - ETA: 11s - loss: 1.2738 - regression_loss: 1.0794 - classification_loss: 0.1944 455/500 [==========================>...] - ETA: 11s - loss: 1.2739 - regression_loss: 1.0795 - classification_loss: 0.1944 456/500 [==========================>...] - ETA: 11s - loss: 1.2735 - regression_loss: 1.0792 - classification_loss: 0.1944 457/500 [==========================>...] - ETA: 10s - loss: 1.2748 - regression_loss: 1.0804 - classification_loss: 0.1945 458/500 [==========================>...] - ETA: 10s - loss: 1.2746 - regression_loss: 1.0802 - classification_loss: 0.1944 459/500 [==========================>...] - ETA: 10s - loss: 1.2745 - regression_loss: 1.0801 - classification_loss: 0.1943 460/500 [==========================>...] - ETA: 10s - loss: 1.2746 - regression_loss: 1.0803 - classification_loss: 0.1943 461/500 [==========================>...] - ETA: 9s - loss: 1.2756 - regression_loss: 1.0811 - classification_loss: 0.1945  462/500 [==========================>...] 
- ETA: 9s - loss: 1.2749 - regression_loss: 1.0806 - classification_loss: 0.1943 463/500 [==========================>...] - ETA: 9s - loss: 1.2752 - regression_loss: 1.0808 - classification_loss: 0.1943 464/500 [==========================>...] - ETA: 9s - loss: 1.2748 - regression_loss: 1.0805 - classification_loss: 0.1943 465/500 [==========================>...] - ETA: 8s - loss: 1.2746 - regression_loss: 1.0804 - classification_loss: 0.1943 466/500 [==========================>...] - ETA: 8s - loss: 1.2746 - regression_loss: 1.0801 - classification_loss: 0.1945 467/500 [===========================>..] - ETA: 8s - loss: 1.2745 - regression_loss: 1.0802 - classification_loss: 0.1944 468/500 [===========================>..] - ETA: 8s - loss: 1.2741 - regression_loss: 1.0799 - classification_loss: 0.1941 469/500 [===========================>..] - ETA: 7s - loss: 1.2731 - regression_loss: 1.0792 - classification_loss: 0.1939 470/500 [===========================>..] - ETA: 7s - loss: 1.2747 - regression_loss: 1.0805 - classification_loss: 0.1941 471/500 [===========================>..] - ETA: 7s - loss: 1.2752 - regression_loss: 1.0811 - classification_loss: 0.1941 472/500 [===========================>..] - ETA: 7s - loss: 1.2750 - regression_loss: 1.0811 - classification_loss: 0.1940 473/500 [===========================>..] - ETA: 6s - loss: 1.2765 - regression_loss: 1.0824 - classification_loss: 0.1941 474/500 [===========================>..] - ETA: 6s - loss: 1.2770 - regression_loss: 1.0829 - classification_loss: 0.1941 475/500 [===========================>..] - ETA: 6s - loss: 1.2779 - regression_loss: 1.0836 - classification_loss: 0.1942 476/500 [===========================>..] - ETA: 6s - loss: 1.2772 - regression_loss: 1.0831 - classification_loss: 0.1941 477/500 [===========================>..] - ETA: 5s - loss: 1.2782 - regression_loss: 1.0838 - classification_loss: 0.1943 478/500 [===========================>..] 
- ETA: 5s - loss: 1.2774 - regression_loss: 1.0833 - classification_loss: 0.1942 479/500 [===========================>..] - ETA: 5s - loss: 1.2780 - regression_loss: 1.0837 - classification_loss: 0.1943 480/500 [===========================>..] - ETA: 5s - loss: 1.2777 - regression_loss: 1.0834 - classification_loss: 0.1943 481/500 [===========================>..] - ETA: 4s - loss: 1.2777 - regression_loss: 1.0833 - classification_loss: 0.1944 482/500 [===========================>..] - ETA: 4s - loss: 1.2770 - regression_loss: 1.0828 - classification_loss: 0.1942 483/500 [===========================>..] - ETA: 4s - loss: 1.2765 - regression_loss: 1.0824 - classification_loss: 0.1941 484/500 [============================>.] - ETA: 4s - loss: 1.2785 - regression_loss: 1.0838 - classification_loss: 0.1947 485/500 [============================>.] - ETA: 3s - loss: 1.2782 - regression_loss: 1.0836 - classification_loss: 0.1946 486/500 [============================>.] - ETA: 3s - loss: 1.2790 - regression_loss: 1.0844 - classification_loss: 0.1946 487/500 [============================>.] - ETA: 3s - loss: 1.2789 - regression_loss: 1.0843 - classification_loss: 0.1946 488/500 [============================>.] - ETA: 3s - loss: 1.2794 - regression_loss: 1.0848 - classification_loss: 0.1946 489/500 [============================>.] - ETA: 2s - loss: 1.2798 - regression_loss: 1.0851 - classification_loss: 0.1946 490/500 [============================>.] - ETA: 2s - loss: 1.2803 - regression_loss: 1.0856 - classification_loss: 0.1947 491/500 [============================>.] - ETA: 2s - loss: 1.2790 - regression_loss: 1.0845 - classification_loss: 0.1945 492/500 [============================>.] - ETA: 2s - loss: 1.2776 - regression_loss: 1.0833 - classification_loss: 0.1942 493/500 [============================>.] - ETA: 1s - loss: 1.2788 - regression_loss: 1.0843 - classification_loss: 0.1945 494/500 [============================>.] 
- ETA: 1s - loss: 1.2787 - regression_loss: 1.0842 - classification_loss: 0.1945 495/500 [============================>.] - ETA: 1s - loss: 1.2790 - regression_loss: 1.0845 - classification_loss: 0.1945 496/500 [============================>.] - ETA: 1s - loss: 1.2791 - regression_loss: 1.0847 - classification_loss: 0.1944 497/500 [============================>.] - ETA: 0s - loss: 1.2792 - regression_loss: 1.0847 - classification_loss: 0.1944 498/500 [============================>.] - ETA: 0s - loss: 1.2780 - regression_loss: 1.0838 - classification_loss: 0.1942 499/500 [============================>.] - ETA: 0s - loss: 1.2778 - regression_loss: 1.0835 - classification_loss: 0.1943 500/500 [==============================] - 126s 251ms/step - loss: 1.2775 - regression_loss: 1.0833 - classification_loss: 0.1942 1172 instances of class plum with average precision: 0.7007 mAP: 0.7007 Epoch 00124: saving model to ./training/snapshots/resnet50_pascal_124.h5 Epoch 125/150 1/500 [..............................] - ETA: 1:58 - loss: 1.4768 - regression_loss: 1.2936 - classification_loss: 0.1832 2/500 [..............................] - ETA: 2:04 - loss: 1.5501 - regression_loss: 1.3304 - classification_loss: 0.2197 3/500 [..............................] - ETA: 2:03 - loss: 1.1918 - regression_loss: 1.0268 - classification_loss: 0.1650 4/500 [..............................] - ETA: 2:03 - loss: 1.1498 - regression_loss: 0.9941 - classification_loss: 0.1558 5/500 [..............................] - ETA: 2:03 - loss: 1.2939 - regression_loss: 1.1007 - classification_loss: 0.1932 6/500 [..............................] - ETA: 2:03 - loss: 1.1668 - regression_loss: 1.0003 - classification_loss: 0.1665 7/500 [..............................] - ETA: 2:02 - loss: 1.2555 - regression_loss: 1.0628 - classification_loss: 0.1926 8/500 [..............................] - ETA: 2:02 - loss: 1.3265 - regression_loss: 1.1197 - classification_loss: 0.2068 9/500 [..............................] 
- ETA: 2:02 - loss: 1.3264 - regression_loss: 1.1234 - classification_loss: 0.2029 10/500 [..............................] - ETA: 2:02 - loss: 1.3670 - regression_loss: 1.1686 - classification_loss: 0.1985 11/500 [..............................] - ETA: 2:02 - loss: 1.3981 - regression_loss: 1.1898 - classification_loss: 0.2083 12/500 [..............................] - ETA: 2:02 - loss: 1.3943 - regression_loss: 1.1847 - classification_loss: 0.2096 13/500 [..............................] - ETA: 2:02 - loss: 1.4166 - regression_loss: 1.2001 - classification_loss: 0.2165 14/500 [..............................] - ETA: 2:02 - loss: 1.4353 - regression_loss: 1.2144 - classification_loss: 0.2210 15/500 [..............................] - ETA: 2:01 - loss: 1.4357 - regression_loss: 1.2147 - classification_loss: 0.2209 16/500 [..............................] - ETA: 2:01 - loss: 1.4399 - regression_loss: 1.2204 - classification_loss: 0.2195 17/500 [>.............................] - ETA: 2:01 - loss: 1.3899 - regression_loss: 1.1783 - classification_loss: 0.2116 18/500 [>.............................] - ETA: 2:01 - loss: 1.3960 - regression_loss: 1.1825 - classification_loss: 0.2136 19/500 [>.............................] - ETA: 2:00 - loss: 1.3833 - regression_loss: 1.1764 - classification_loss: 0.2069 20/500 [>.............................] - ETA: 2:00 - loss: 1.3410 - regression_loss: 1.1408 - classification_loss: 0.2001 21/500 [>.............................] - ETA: 2:00 - loss: 1.3059 - regression_loss: 1.1131 - classification_loss: 0.1928 22/500 [>.............................] - ETA: 2:00 - loss: 1.3344 - regression_loss: 1.1332 - classification_loss: 0.2012 23/500 [>.............................] - ETA: 2:00 - loss: 1.3186 - regression_loss: 1.1207 - classification_loss: 0.1979 24/500 [>.............................] - ETA: 1:59 - loss: 1.2850 - regression_loss: 1.0924 - classification_loss: 0.1926 25/500 [>.............................] 
- ETA: 1:59 - loss: 1.2656 - regression_loss: 1.0749 - classification_loss: 0.1908
[per-batch progress updates for steps 26-360 condensed; representative checkpoints:]
 50/500 [==>...........................] - ETA: 1:53 - loss: 1.2321 - regression_loss: 1.0446 - classification_loss: 0.1876
100/500 [=====>........................] - ETA: 1:40 - loss: 1.2773 - regression_loss: 1.0811 - classification_loss: 0.1962
150/500 [========>.....................] - ETA: 1:27 - loss: 1.2509 - regression_loss: 1.0594 - classification_loss: 0.1915
200/500 [===========>..................] - ETA: 1:15 - loss: 1.2328 - regression_loss: 1.0443 - classification_loss: 0.1885
250/500 [==============>...............] - ETA: 1:03 - loss: 1.2370 - regression_loss: 1.0477 - classification_loss: 0.1894
300/500 [=================>............] - ETA: 50s - loss: 1.2401 - regression_loss: 1.0507 - classification_loss: 0.1894
350/500 [====================>.........] - ETA: 37s - loss: 1.2352 - regression_loss: 1.0460 - classification_loss: 0.1892
361/500 [====================>.........] 
- ETA: 34s - loss: 1.2376 - regression_loss: 1.0479 - classification_loss: 0.1897 362/500 [====================>.........] - ETA: 34s - loss: 1.2396 - regression_loss: 1.0494 - classification_loss: 0.1903 363/500 [====================>.........] - ETA: 34s - loss: 1.2406 - regression_loss: 1.0502 - classification_loss: 0.1905 364/500 [====================>.........] - ETA: 34s - loss: 1.2414 - regression_loss: 1.0509 - classification_loss: 0.1905 365/500 [====================>.........] - ETA: 33s - loss: 1.2409 - regression_loss: 1.0504 - classification_loss: 0.1904 366/500 [====================>.........] - ETA: 33s - loss: 1.2442 - regression_loss: 1.0533 - classification_loss: 0.1909 367/500 [=====================>........] - ETA: 33s - loss: 1.2455 - regression_loss: 1.0544 - classification_loss: 0.1910 368/500 [=====================>........] - ETA: 33s - loss: 1.2449 - regression_loss: 1.0539 - classification_loss: 0.1910 369/500 [=====================>........] - ETA: 32s - loss: 1.2454 - regression_loss: 1.0544 - classification_loss: 0.1910 370/500 [=====================>........] - ETA: 32s - loss: 1.2448 - regression_loss: 1.0539 - classification_loss: 0.1909 371/500 [=====================>........] - ETA: 32s - loss: 1.2443 - regression_loss: 1.0535 - classification_loss: 0.1908 372/500 [=====================>........] - ETA: 32s - loss: 1.2452 - regression_loss: 1.0543 - classification_loss: 0.1909 373/500 [=====================>........] - ETA: 31s - loss: 1.2429 - regression_loss: 1.0524 - classification_loss: 0.1905 374/500 [=====================>........] - ETA: 31s - loss: 1.2437 - regression_loss: 1.0530 - classification_loss: 0.1907 375/500 [=====================>........] - ETA: 31s - loss: 1.2460 - regression_loss: 1.0549 - classification_loss: 0.1911 376/500 [=====================>........] - ETA: 31s - loss: 1.2466 - regression_loss: 1.0555 - classification_loss: 0.1910 377/500 [=====================>........] 
- ETA: 30s - loss: 1.2454 - regression_loss: 1.0547 - classification_loss: 0.1907 378/500 [=====================>........] - ETA: 30s - loss: 1.2442 - regression_loss: 1.0535 - classification_loss: 0.1906 379/500 [=====================>........] - ETA: 30s - loss: 1.2425 - regression_loss: 1.0522 - classification_loss: 0.1903 380/500 [=====================>........] - ETA: 30s - loss: 1.2422 - regression_loss: 1.0519 - classification_loss: 0.1903 381/500 [=====================>........] - ETA: 29s - loss: 1.2420 - regression_loss: 1.0518 - classification_loss: 0.1902 382/500 [=====================>........] - ETA: 29s - loss: 1.2429 - regression_loss: 1.0525 - classification_loss: 0.1903 383/500 [=====================>........] - ETA: 29s - loss: 1.2425 - regression_loss: 1.0523 - classification_loss: 0.1902 384/500 [======================>.......] - ETA: 29s - loss: 1.2440 - regression_loss: 1.0535 - classification_loss: 0.1905 385/500 [======================>.......] - ETA: 28s - loss: 1.2460 - regression_loss: 1.0550 - classification_loss: 0.1910 386/500 [======================>.......] - ETA: 28s - loss: 1.2444 - regression_loss: 1.0534 - classification_loss: 0.1910 387/500 [======================>.......] - ETA: 28s - loss: 1.2456 - regression_loss: 1.0545 - classification_loss: 0.1911 388/500 [======================>.......] - ETA: 28s - loss: 1.2458 - regression_loss: 1.0546 - classification_loss: 0.1912 389/500 [======================>.......] - ETA: 27s - loss: 1.2456 - regression_loss: 1.0545 - classification_loss: 0.1911 390/500 [======================>.......] - ETA: 27s - loss: 1.2467 - regression_loss: 1.0555 - classification_loss: 0.1912 391/500 [======================>.......] - ETA: 27s - loss: 1.2482 - regression_loss: 1.0568 - classification_loss: 0.1914 392/500 [======================>.......] - ETA: 27s - loss: 1.2474 - regression_loss: 1.0563 - classification_loss: 0.1912 393/500 [======================>.......] 
- ETA: 26s - loss: 1.2475 - regression_loss: 1.0562 - classification_loss: 0.1913 394/500 [======================>.......] - ETA: 26s - loss: 1.2470 - regression_loss: 1.0556 - classification_loss: 0.1913 395/500 [======================>.......] - ETA: 26s - loss: 1.2474 - regression_loss: 1.0559 - classification_loss: 0.1915 396/500 [======================>.......] - ETA: 26s - loss: 1.2479 - regression_loss: 1.0563 - classification_loss: 0.1915 397/500 [======================>.......] - ETA: 25s - loss: 1.2473 - regression_loss: 1.0558 - classification_loss: 0.1915 398/500 [======================>.......] - ETA: 25s - loss: 1.2478 - regression_loss: 1.0563 - classification_loss: 0.1915 399/500 [======================>.......] - ETA: 25s - loss: 1.2467 - regression_loss: 1.0555 - classification_loss: 0.1912 400/500 [=======================>......] - ETA: 25s - loss: 1.2475 - regression_loss: 1.0563 - classification_loss: 0.1912 401/500 [=======================>......] - ETA: 24s - loss: 1.2484 - regression_loss: 1.0571 - classification_loss: 0.1913 402/500 [=======================>......] - ETA: 24s - loss: 1.2497 - regression_loss: 1.0582 - classification_loss: 0.1915 403/500 [=======================>......] - ETA: 24s - loss: 1.2503 - regression_loss: 1.0588 - classification_loss: 0.1915 404/500 [=======================>......] - ETA: 24s - loss: 1.2500 - regression_loss: 1.0585 - classification_loss: 0.1915 405/500 [=======================>......] - ETA: 23s - loss: 1.2505 - regression_loss: 1.0590 - classification_loss: 0.1915 406/500 [=======================>......] - ETA: 23s - loss: 1.2499 - regression_loss: 1.0586 - classification_loss: 0.1913 407/500 [=======================>......] - ETA: 23s - loss: 1.2492 - regression_loss: 1.0580 - classification_loss: 0.1911 408/500 [=======================>......] - ETA: 23s - loss: 1.2489 - regression_loss: 1.0579 - classification_loss: 0.1909 409/500 [=======================>......] 
- ETA: 22s - loss: 1.2498 - regression_loss: 1.0587 - classification_loss: 0.1911 410/500 [=======================>......] - ETA: 22s - loss: 1.2499 - regression_loss: 1.0588 - classification_loss: 0.1912 411/500 [=======================>......] - ETA: 22s - loss: 1.2499 - regression_loss: 1.0587 - classification_loss: 0.1911 412/500 [=======================>......] - ETA: 22s - loss: 1.2505 - regression_loss: 1.0593 - classification_loss: 0.1912 413/500 [=======================>......] - ETA: 21s - loss: 1.2513 - regression_loss: 1.0601 - classification_loss: 0.1912 414/500 [=======================>......] - ETA: 21s - loss: 1.2529 - regression_loss: 1.0614 - classification_loss: 0.1915 415/500 [=======================>......] - ETA: 21s - loss: 1.2528 - regression_loss: 1.0614 - classification_loss: 0.1914 416/500 [=======================>......] - ETA: 21s - loss: 1.2540 - regression_loss: 1.0624 - classification_loss: 0.1916 417/500 [========================>.....] - ETA: 20s - loss: 1.2527 - regression_loss: 1.0614 - classification_loss: 0.1913 418/500 [========================>.....] - ETA: 20s - loss: 1.2509 - regression_loss: 1.0599 - classification_loss: 0.1910 419/500 [========================>.....] - ETA: 20s - loss: 1.2490 - regression_loss: 1.0583 - classification_loss: 0.1907 420/500 [========================>.....] - ETA: 20s - loss: 1.2496 - regression_loss: 1.0588 - classification_loss: 0.1908 421/500 [========================>.....] - ETA: 19s - loss: 1.2499 - regression_loss: 1.0590 - classification_loss: 0.1909 422/500 [========================>.....] - ETA: 19s - loss: 1.2497 - regression_loss: 1.0587 - classification_loss: 0.1910 423/500 [========================>.....] - ETA: 19s - loss: 1.2491 - regression_loss: 1.0583 - classification_loss: 0.1908 424/500 [========================>.....] - ETA: 19s - loss: 1.2489 - regression_loss: 1.0583 - classification_loss: 0.1907 425/500 [========================>.....] 
- ETA: 18s - loss: 1.2495 - regression_loss: 1.0588 - classification_loss: 0.1908 426/500 [========================>.....] - ETA: 18s - loss: 1.2513 - regression_loss: 1.0601 - classification_loss: 0.1912 427/500 [========================>.....] - ETA: 18s - loss: 1.2499 - regression_loss: 1.0589 - classification_loss: 0.1910 428/500 [========================>.....] - ETA: 18s - loss: 1.2501 - regression_loss: 1.0591 - classification_loss: 0.1910 429/500 [========================>.....] - ETA: 17s - loss: 1.2491 - regression_loss: 1.0583 - classification_loss: 0.1908 430/500 [========================>.....] - ETA: 17s - loss: 1.2473 - regression_loss: 1.0568 - classification_loss: 0.1905 431/500 [========================>.....] - ETA: 17s - loss: 1.2467 - regression_loss: 1.0563 - classification_loss: 0.1903 432/500 [========================>.....] - ETA: 17s - loss: 1.2473 - regression_loss: 1.0570 - classification_loss: 0.1904 433/500 [========================>.....] - ETA: 16s - loss: 1.2488 - regression_loss: 1.0582 - classification_loss: 0.1906 434/500 [=========================>....] - ETA: 16s - loss: 1.2505 - regression_loss: 1.0595 - classification_loss: 0.1910 435/500 [=========================>....] - ETA: 16s - loss: 1.2521 - regression_loss: 1.0607 - classification_loss: 0.1913 436/500 [=========================>....] - ETA: 16s - loss: 1.2522 - regression_loss: 1.0608 - classification_loss: 0.1914 437/500 [=========================>....] - ETA: 15s - loss: 1.2536 - regression_loss: 1.0619 - classification_loss: 0.1917 438/500 [=========================>....] - ETA: 15s - loss: 1.2540 - regression_loss: 1.0623 - classification_loss: 0.1917 439/500 [=========================>....] - ETA: 15s - loss: 1.2527 - regression_loss: 1.0613 - classification_loss: 0.1914 440/500 [=========================>....] - ETA: 15s - loss: 1.2540 - regression_loss: 1.0625 - classification_loss: 0.1915 441/500 [=========================>....] 
- ETA: 14s - loss: 1.2544 - regression_loss: 1.0630 - classification_loss: 0.1914 442/500 [=========================>....] - ETA: 14s - loss: 1.2522 - regression_loss: 1.0611 - classification_loss: 0.1911 443/500 [=========================>....] - ETA: 14s - loss: 1.2519 - regression_loss: 1.0608 - classification_loss: 0.1911 444/500 [=========================>....] - ETA: 14s - loss: 1.2517 - regression_loss: 1.0607 - classification_loss: 0.1910 445/500 [=========================>....] - ETA: 13s - loss: 1.2505 - regression_loss: 1.0596 - classification_loss: 0.1909 446/500 [=========================>....] - ETA: 13s - loss: 1.2509 - regression_loss: 1.0599 - classification_loss: 0.1910 447/500 [=========================>....] - ETA: 13s - loss: 1.2510 - regression_loss: 1.0599 - classification_loss: 0.1910 448/500 [=========================>....] - ETA: 13s - loss: 1.2511 - regression_loss: 1.0601 - classification_loss: 0.1910 449/500 [=========================>....] - ETA: 12s - loss: 1.2524 - regression_loss: 1.0611 - classification_loss: 0.1913 450/500 [==========================>...] - ETA: 12s - loss: 1.2513 - regression_loss: 1.0601 - classification_loss: 0.1912 451/500 [==========================>...] - ETA: 12s - loss: 1.2519 - regression_loss: 1.0606 - classification_loss: 0.1913 452/500 [==========================>...] - ETA: 12s - loss: 1.2533 - regression_loss: 1.0615 - classification_loss: 0.1918 453/500 [==========================>...] - ETA: 11s - loss: 1.2520 - regression_loss: 1.0605 - classification_loss: 0.1915 454/500 [==========================>...] - ETA: 11s - loss: 1.2508 - regression_loss: 1.0594 - classification_loss: 0.1913 455/500 [==========================>...] - ETA: 11s - loss: 1.2508 - regression_loss: 1.0594 - classification_loss: 0.1914 456/500 [==========================>...] - ETA: 11s - loss: 1.2497 - regression_loss: 1.0586 - classification_loss: 0.1912 457/500 [==========================>...] 
- ETA: 10s - loss: 1.2481 - regression_loss: 1.0573 - classification_loss: 0.1908 458/500 [==========================>...] - ETA: 10s - loss: 1.2489 - regression_loss: 1.0580 - classification_loss: 0.1909 459/500 [==========================>...] - ETA: 10s - loss: 1.2503 - regression_loss: 1.0592 - classification_loss: 0.1911 460/500 [==========================>...] - ETA: 10s - loss: 1.2484 - regression_loss: 1.0576 - classification_loss: 0.1908 461/500 [==========================>...] - ETA: 9s - loss: 1.2487 - regression_loss: 1.0579 - classification_loss: 0.1908  462/500 [==========================>...] - ETA: 9s - loss: 1.2492 - regression_loss: 1.0582 - classification_loss: 0.1909 463/500 [==========================>...] - ETA: 9s - loss: 1.2483 - regression_loss: 1.0574 - classification_loss: 0.1909 464/500 [==========================>...] - ETA: 9s - loss: 1.2483 - regression_loss: 1.0575 - classification_loss: 0.1908 465/500 [==========================>...] - ETA: 8s - loss: 1.2486 - regression_loss: 1.0576 - classification_loss: 0.1909 466/500 [==========================>...] - ETA: 8s - loss: 1.2475 - regression_loss: 1.0568 - classification_loss: 0.1907 467/500 [===========================>..] - ETA: 8s - loss: 1.2463 - regression_loss: 1.0558 - classification_loss: 0.1904 468/500 [===========================>..] - ETA: 8s - loss: 1.2459 - regression_loss: 1.0556 - classification_loss: 0.1903 469/500 [===========================>..] - ETA: 7s - loss: 1.2468 - regression_loss: 1.0564 - classification_loss: 0.1904 470/500 [===========================>..] - ETA: 7s - loss: 1.2479 - regression_loss: 1.0573 - classification_loss: 0.1906 471/500 [===========================>..] - ETA: 7s - loss: 1.2482 - regression_loss: 1.0575 - classification_loss: 0.1907 472/500 [===========================>..] - ETA: 7s - loss: 1.2484 - regression_loss: 1.0578 - classification_loss: 0.1907 473/500 [===========================>..] 
- ETA: 6s - loss: 1.2482 - regression_loss: 1.0577 - classification_loss: 0.1904 474/500 [===========================>..] - ETA: 6s - loss: 1.2471 - regression_loss: 1.0568 - classification_loss: 0.1903 475/500 [===========================>..] - ETA: 6s - loss: 1.2464 - regression_loss: 1.0563 - classification_loss: 0.1901 476/500 [===========================>..] - ETA: 6s - loss: 1.2472 - regression_loss: 1.0569 - classification_loss: 0.1903 477/500 [===========================>..] - ETA: 5s - loss: 1.2462 - regression_loss: 1.0560 - classification_loss: 0.1902 478/500 [===========================>..] - ETA: 5s - loss: 1.2464 - regression_loss: 1.0562 - classification_loss: 0.1902 479/500 [===========================>..] - ETA: 5s - loss: 1.2466 - regression_loss: 1.0560 - classification_loss: 0.1906 480/500 [===========================>..] - ETA: 5s - loss: 1.2467 - regression_loss: 1.0559 - classification_loss: 0.1908 481/500 [===========================>..] - ETA: 4s - loss: 1.2467 - regression_loss: 1.0560 - classification_loss: 0.1908 482/500 [===========================>..] - ETA: 4s - loss: 1.2475 - regression_loss: 1.0566 - classification_loss: 0.1910 483/500 [===========================>..] - ETA: 4s - loss: 1.2470 - regression_loss: 1.0560 - classification_loss: 0.1910 484/500 [============================>.] - ETA: 4s - loss: 1.2467 - regression_loss: 1.0558 - classification_loss: 0.1909 485/500 [============================>.] - ETA: 3s - loss: 1.2470 - regression_loss: 1.0559 - classification_loss: 0.1911 486/500 [============================>.] - ETA: 3s - loss: 1.2469 - regression_loss: 1.0558 - classification_loss: 0.1911 487/500 [============================>.] - ETA: 3s - loss: 1.2467 - regression_loss: 1.0557 - classification_loss: 0.1910 488/500 [============================>.] - ETA: 3s - loss: 1.2467 - regression_loss: 1.0557 - classification_loss: 0.1910 489/500 [============================>.] 
[... per-batch progress-bar updates for steps 490-499 of epoch 125 elided ...]
500/500 [==============================] - 126s 251ms/step - loss: 1.2462 - regression_loss: 1.0556 - classification_loss: 0.1906
1172 instances of class plum with average precision: 0.7031
mAP: 0.7031
Epoch 00125: saving model to ./training/snapshots/resnet50_pascal_125.h5
Epoch 126/150
[... per-batch progress-bar updates for steps 1-4 of epoch 126 elided ...]
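As a sanity check on the epoch-125 summary line: the total `loss` reported by keras-retinanet is the sum of the two sub-losses it logs (smooth-L1 box regression plus focal classification). A minimal sketch, using the values from the log above; the helper name `total_loss` is illustrative, not part of the library:

```python
def total_loss(regression_loss, classification_loss):
    """Combined RetinaNet training loss: box-regression term + classification term."""
    return regression_loss + classification_loss

# Values copied from the epoch-125 summary line in the log.
epoch_125 = total_loss(regression_loss=1.0556, classification_loss=0.1906)
assert abs(epoch_125 - 1.2462) < 1e-6  # matches the reported total loss
```

If the two components ever stop summing to the reported total, the log was likely truncated or mis-copied.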
[... per-batch progress-bar updates for steps 5-132 of epoch 126 elided; total loss fluctuated between roughly 1.06 and 1.35 before settling around 1.21-1.22 ...]
- ETA: 1:32 - loss: 1.2094 - regression_loss: 1.0278 - classification_loss: 0.1816 133/500 [======>.......................] - ETA: 1:32 - loss: 1.2091 - regression_loss: 1.0277 - classification_loss: 0.1814 134/500 [=======>......................] - ETA: 1:32 - loss: 1.2147 - regression_loss: 1.0319 - classification_loss: 0.1828 135/500 [=======>......................] - ETA: 1:32 - loss: 1.2111 - regression_loss: 1.0294 - classification_loss: 0.1818 136/500 [=======>......................] - ETA: 1:31 - loss: 1.2097 - regression_loss: 1.0283 - classification_loss: 0.1814 137/500 [=======>......................] - ETA: 1:31 - loss: 1.2126 - regression_loss: 1.0305 - classification_loss: 0.1821 138/500 [=======>......................] - ETA: 1:31 - loss: 1.2172 - regression_loss: 1.0343 - classification_loss: 0.1829 139/500 [=======>......................] - ETA: 1:31 - loss: 1.2161 - regression_loss: 1.0330 - classification_loss: 0.1831 140/500 [=======>......................] - ETA: 1:30 - loss: 1.2152 - regression_loss: 1.0324 - classification_loss: 0.1828 141/500 [=======>......................] - ETA: 1:30 - loss: 1.2172 - regression_loss: 1.0343 - classification_loss: 0.1829 142/500 [=======>......................] - ETA: 1:30 - loss: 1.2184 - regression_loss: 1.0355 - classification_loss: 0.1829 143/500 [=======>......................] - ETA: 1:30 - loss: 1.2174 - regression_loss: 1.0341 - classification_loss: 0.1833 144/500 [=======>......................] - ETA: 1:29 - loss: 1.2224 - regression_loss: 1.0382 - classification_loss: 0.1842 145/500 [=======>......................] - ETA: 1:29 - loss: 1.2219 - regression_loss: 1.0383 - classification_loss: 0.1836 146/500 [=======>......................] - ETA: 1:29 - loss: 1.2196 - regression_loss: 1.0365 - classification_loss: 0.1831 147/500 [=======>......................] - ETA: 1:29 - loss: 1.2196 - regression_loss: 1.0366 - classification_loss: 0.1830 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.2222 - regression_loss: 1.0389 - classification_loss: 0.1833 149/500 [=======>......................] - ETA: 1:28 - loss: 1.2231 - regression_loss: 1.0395 - classification_loss: 0.1837 150/500 [========>.....................] - ETA: 1:28 - loss: 1.2245 - regression_loss: 1.0404 - classification_loss: 0.1840 151/500 [========>.....................] - ETA: 1:28 - loss: 1.2271 - regression_loss: 1.0423 - classification_loss: 0.1848 152/500 [========>.....................] - ETA: 1:27 - loss: 1.2300 - regression_loss: 1.0450 - classification_loss: 0.1849 153/500 [========>.....................] - ETA: 1:27 - loss: 1.2288 - regression_loss: 1.0441 - classification_loss: 0.1847 154/500 [========>.....................] - ETA: 1:27 - loss: 1.2287 - regression_loss: 1.0441 - classification_loss: 0.1845 155/500 [========>.....................] - ETA: 1:27 - loss: 1.2283 - regression_loss: 1.0435 - classification_loss: 0.1847 156/500 [========>.....................] - ETA: 1:26 - loss: 1.2306 - regression_loss: 1.0454 - classification_loss: 0.1853 157/500 [========>.....................] - ETA: 1:26 - loss: 1.2319 - regression_loss: 1.0462 - classification_loss: 0.1857 158/500 [========>.....................] - ETA: 1:26 - loss: 1.2279 - regression_loss: 1.0429 - classification_loss: 0.1850 159/500 [========>.....................] - ETA: 1:25 - loss: 1.2287 - regression_loss: 1.0440 - classification_loss: 0.1848 160/500 [========>.....................] - ETA: 1:25 - loss: 1.2293 - regression_loss: 1.0446 - classification_loss: 0.1847 161/500 [========>.....................] - ETA: 1:25 - loss: 1.2304 - regression_loss: 1.0456 - classification_loss: 0.1848 162/500 [========>.....................] - ETA: 1:24 - loss: 1.2276 - regression_loss: 1.0432 - classification_loss: 0.1844 163/500 [========>.....................] - ETA: 1:24 - loss: 1.2257 - regression_loss: 1.0414 - classification_loss: 0.1844 164/500 [========>.....................] 
- ETA: 1:24 - loss: 1.2219 - regression_loss: 1.0382 - classification_loss: 0.1837 165/500 [========>.....................] - ETA: 1:24 - loss: 1.2203 - regression_loss: 1.0371 - classification_loss: 0.1832 166/500 [========>.....................] - ETA: 1:23 - loss: 1.2192 - regression_loss: 1.0362 - classification_loss: 0.1830 167/500 [=========>....................] - ETA: 1:23 - loss: 1.2209 - regression_loss: 1.0377 - classification_loss: 0.1832 168/500 [=========>....................] - ETA: 1:23 - loss: 1.2210 - regression_loss: 1.0379 - classification_loss: 0.1831 169/500 [=========>....................] - ETA: 1:23 - loss: 1.2189 - regression_loss: 1.0364 - classification_loss: 0.1825 170/500 [=========>....................] - ETA: 1:23 - loss: 1.2240 - regression_loss: 1.0404 - classification_loss: 0.1836 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2218 - regression_loss: 1.0388 - classification_loss: 0.1831 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2193 - regression_loss: 1.0364 - classification_loss: 0.1829 173/500 [=========>....................] - ETA: 1:22 - loss: 1.2172 - regression_loss: 1.0349 - classification_loss: 0.1823 174/500 [=========>....................] - ETA: 1:22 - loss: 1.2171 - regression_loss: 1.0342 - classification_loss: 0.1829 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2187 - regression_loss: 1.0356 - classification_loss: 0.1831 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2159 - regression_loss: 1.0330 - classification_loss: 0.1829 177/500 [=========>....................] - ETA: 1:21 - loss: 1.2194 - regression_loss: 1.0358 - classification_loss: 0.1836 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2182 - regression_loss: 1.0350 - classification_loss: 0.1832 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2205 - regression_loss: 1.0370 - classification_loss: 0.1835 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.2221 - regression_loss: 1.0383 - classification_loss: 0.1838 181/500 [=========>....................] - ETA: 1:20 - loss: 1.2215 - regression_loss: 1.0378 - classification_loss: 0.1837 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2224 - regression_loss: 1.0387 - classification_loss: 0.1837 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2261 - regression_loss: 1.0419 - classification_loss: 0.1842 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2285 - regression_loss: 1.0440 - classification_loss: 0.1845 185/500 [==========>...................] - ETA: 1:19 - loss: 1.2234 - regression_loss: 1.0397 - classification_loss: 0.1837 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2228 - regression_loss: 1.0392 - classification_loss: 0.1836 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2255 - regression_loss: 1.0416 - classification_loss: 0.1839 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2232 - regression_loss: 1.0398 - classification_loss: 0.1834 189/500 [==========>...................] - ETA: 1:18 - loss: 1.2232 - regression_loss: 1.0399 - classification_loss: 0.1833 190/500 [==========>...................] - ETA: 1:17 - loss: 1.2198 - regression_loss: 1.0371 - classification_loss: 0.1827 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2193 - regression_loss: 1.0368 - classification_loss: 0.1825 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2218 - regression_loss: 1.0387 - classification_loss: 0.1831 193/500 [==========>...................] - ETA: 1:17 - loss: 1.2237 - regression_loss: 1.0402 - classification_loss: 0.1835 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2247 - regression_loss: 1.0413 - classification_loss: 0.1834 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2271 - regression_loss: 1.0433 - classification_loss: 0.1838 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.2260 - regression_loss: 1.0420 - classification_loss: 0.1840 197/500 [==========>...................] - ETA: 1:16 - loss: 1.2277 - regression_loss: 1.0432 - classification_loss: 0.1845 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2286 - regression_loss: 1.0438 - classification_loss: 0.1848 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2292 - regression_loss: 1.0443 - classification_loss: 0.1849 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2301 - regression_loss: 1.0449 - classification_loss: 0.1852 201/500 [===========>..................] - ETA: 1:15 - loss: 1.2277 - regression_loss: 1.0429 - classification_loss: 0.1848 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2268 - regression_loss: 1.0422 - classification_loss: 0.1846 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2264 - regression_loss: 1.0417 - classification_loss: 0.1847 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2304 - regression_loss: 1.0447 - classification_loss: 0.1857 205/500 [===========>..................] - ETA: 1:14 - loss: 1.2280 - regression_loss: 1.0426 - classification_loss: 0.1854 206/500 [===========>..................] - ETA: 1:13 - loss: 1.2248 - regression_loss: 1.0398 - classification_loss: 0.1850 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2265 - regression_loss: 1.0403 - classification_loss: 0.1862 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2278 - regression_loss: 1.0412 - classification_loss: 0.1865 209/500 [===========>..................] - ETA: 1:13 - loss: 1.2319 - regression_loss: 1.0444 - classification_loss: 0.1875 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2332 - regression_loss: 1.0458 - classification_loss: 0.1874 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2350 - regression_loss: 1.0474 - classification_loss: 0.1877 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.2377 - regression_loss: 1.0494 - classification_loss: 0.1883 213/500 [===========>..................] - ETA: 1:12 - loss: 1.2400 - regression_loss: 1.0513 - classification_loss: 0.1886 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2375 - regression_loss: 1.0495 - classification_loss: 0.1880 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2394 - regression_loss: 1.0509 - classification_loss: 0.1885 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2396 - regression_loss: 1.0511 - classification_loss: 0.1885 217/500 [============>.................] - ETA: 1:11 - loss: 1.2380 - regression_loss: 1.0498 - classification_loss: 0.1882 218/500 [============>.................] - ETA: 1:10 - loss: 1.2352 - regression_loss: 1.0476 - classification_loss: 0.1875 219/500 [============>.................] - ETA: 1:10 - loss: 1.2362 - regression_loss: 1.0485 - classification_loss: 0.1877 220/500 [============>.................] - ETA: 1:10 - loss: 1.2365 - regression_loss: 1.0488 - classification_loss: 0.1877 221/500 [============>.................] - ETA: 1:10 - loss: 1.2381 - regression_loss: 1.0501 - classification_loss: 0.1880 222/500 [============>.................] - ETA: 1:09 - loss: 1.2387 - regression_loss: 1.0507 - classification_loss: 0.1880 223/500 [============>.................] - ETA: 1:09 - loss: 1.2414 - regression_loss: 1.0530 - classification_loss: 0.1884 224/500 [============>.................] - ETA: 1:09 - loss: 1.2421 - regression_loss: 1.0537 - classification_loss: 0.1884 225/500 [============>.................] - ETA: 1:09 - loss: 1.2398 - regression_loss: 1.0518 - classification_loss: 0.1879 226/500 [============>.................] - ETA: 1:08 - loss: 1.2405 - regression_loss: 1.0525 - classification_loss: 0.1880 227/500 [============>.................] - ETA: 1:08 - loss: 1.2377 - regression_loss: 1.0504 - classification_loss: 0.1874 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.2394 - regression_loss: 1.0520 - classification_loss: 0.1874 229/500 [============>.................] - ETA: 1:08 - loss: 1.2373 - regression_loss: 1.0503 - classification_loss: 0.1870 230/500 [============>.................] - ETA: 1:07 - loss: 1.2358 - regression_loss: 1.0492 - classification_loss: 0.1866 231/500 [============>.................] - ETA: 1:07 - loss: 1.2372 - regression_loss: 1.0504 - classification_loss: 0.1869 232/500 [============>.................] - ETA: 1:07 - loss: 1.2382 - regression_loss: 1.0508 - classification_loss: 0.1874 233/500 [============>.................] - ETA: 1:07 - loss: 1.2395 - regression_loss: 1.0518 - classification_loss: 0.1877 234/500 [=============>................] - ETA: 1:06 - loss: 1.2396 - regression_loss: 1.0520 - classification_loss: 0.1876 235/500 [=============>................] - ETA: 1:06 - loss: 1.2408 - regression_loss: 1.0527 - classification_loss: 0.1882 236/500 [=============>................] - ETA: 1:06 - loss: 1.2372 - regression_loss: 1.0498 - classification_loss: 0.1875 237/500 [=============>................] - ETA: 1:06 - loss: 1.2358 - regression_loss: 1.0487 - classification_loss: 0.1871 238/500 [=============>................] - ETA: 1:05 - loss: 1.2359 - regression_loss: 1.0484 - classification_loss: 0.1875 239/500 [=============>................] - ETA: 1:05 - loss: 1.2336 - regression_loss: 1.0465 - classification_loss: 0.1871 240/500 [=============>................] - ETA: 1:05 - loss: 1.2315 - regression_loss: 1.0447 - classification_loss: 0.1868 241/500 [=============>................] - ETA: 1:05 - loss: 1.2308 - regression_loss: 1.0444 - classification_loss: 0.1865 242/500 [=============>................] - ETA: 1:04 - loss: 1.2314 - regression_loss: 1.0448 - classification_loss: 0.1867 243/500 [=============>................] - ETA: 1:04 - loss: 1.2317 - regression_loss: 1.0448 - classification_loss: 0.1869 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.2324 - regression_loss: 1.0454 - classification_loss: 0.1870 245/500 [=============>................] - ETA: 1:04 - loss: 1.2330 - regression_loss: 1.0459 - classification_loss: 0.1871 246/500 [=============>................] - ETA: 1:03 - loss: 1.2306 - regression_loss: 1.0442 - classification_loss: 0.1865 247/500 [=============>................] - ETA: 1:03 - loss: 1.2299 - regression_loss: 1.0436 - classification_loss: 0.1864 248/500 [=============>................] - ETA: 1:03 - loss: 1.2329 - regression_loss: 1.0459 - classification_loss: 0.1869 249/500 [=============>................] - ETA: 1:03 - loss: 1.2337 - regression_loss: 1.0466 - classification_loss: 0.1872 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2349 - regression_loss: 1.0477 - classification_loss: 0.1872 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2343 - regression_loss: 1.0472 - classification_loss: 0.1871 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2373 - regression_loss: 1.0497 - classification_loss: 0.1876 253/500 [==============>...............] - ETA: 1:02 - loss: 1.2350 - regression_loss: 1.0480 - classification_loss: 0.1870 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2375 - regression_loss: 1.0500 - classification_loss: 0.1876 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2362 - regression_loss: 1.0490 - classification_loss: 0.1872 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2338 - regression_loss: 1.0469 - classification_loss: 0.1868 257/500 [==============>...............] - ETA: 1:01 - loss: 1.2344 - regression_loss: 1.0476 - classification_loss: 0.1868 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2348 - regression_loss: 1.0480 - classification_loss: 0.1868 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2368 - regression_loss: 1.0497 - classification_loss: 0.1871 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.2340 - regression_loss: 1.0474 - classification_loss: 0.1866 261/500 [==============>...............] - ETA: 1:00 - loss: 1.2338 - regression_loss: 1.0472 - classification_loss: 0.1865 262/500 [==============>...............] - ETA: 59s - loss: 1.2334 - regression_loss: 1.0473 - classification_loss: 0.1861  263/500 [==============>...............] - ETA: 59s - loss: 1.2344 - regression_loss: 1.0481 - classification_loss: 0.1863 264/500 [==============>...............] - ETA: 59s - loss: 1.2308 - regression_loss: 1.0451 - classification_loss: 0.1857 265/500 [==============>...............] - ETA: 59s - loss: 1.2311 - regression_loss: 1.0455 - classification_loss: 0.1856 266/500 [==============>...............] - ETA: 58s - loss: 1.2314 - regression_loss: 1.0458 - classification_loss: 0.1855 267/500 [===============>..............] - ETA: 58s - loss: 1.2331 - regression_loss: 1.0474 - classification_loss: 0.1857 268/500 [===============>..............] - ETA: 58s - loss: 1.2342 - regression_loss: 1.0483 - classification_loss: 0.1859 269/500 [===============>..............] - ETA: 58s - loss: 1.2343 - regression_loss: 1.0483 - classification_loss: 0.1861 270/500 [===============>..............] - ETA: 57s - loss: 1.2348 - regression_loss: 1.0487 - classification_loss: 0.1860 271/500 [===============>..............] - ETA: 57s - loss: 1.2332 - regression_loss: 1.0475 - classification_loss: 0.1856 272/500 [===============>..............] - ETA: 57s - loss: 1.2339 - regression_loss: 1.0481 - classification_loss: 0.1858 273/500 [===============>..............] - ETA: 57s - loss: 1.2347 - regression_loss: 1.0488 - classification_loss: 0.1860 274/500 [===============>..............] - ETA: 56s - loss: 1.2330 - regression_loss: 1.0474 - classification_loss: 0.1856 275/500 [===============>..............] - ETA: 56s - loss: 1.2335 - regression_loss: 1.0479 - classification_loss: 0.1856 276/500 [===============>..............] 
- ETA: 56s - loss: 1.2338 - regression_loss: 1.0482 - classification_loss: 0.1856 277/500 [===============>..............] - ETA: 56s - loss: 1.2332 - regression_loss: 1.0475 - classification_loss: 0.1857 278/500 [===============>..............] - ETA: 55s - loss: 1.2339 - regression_loss: 1.0481 - classification_loss: 0.1858 279/500 [===============>..............] - ETA: 55s - loss: 1.2345 - regression_loss: 1.0486 - classification_loss: 0.1859 280/500 [===============>..............] - ETA: 55s - loss: 1.2367 - regression_loss: 1.0502 - classification_loss: 0.1865 281/500 [===============>..............] - ETA: 55s - loss: 1.2359 - regression_loss: 1.0493 - classification_loss: 0.1865 282/500 [===============>..............] - ETA: 54s - loss: 1.2361 - regression_loss: 1.0496 - classification_loss: 0.1865 283/500 [===============>..............] - ETA: 54s - loss: 1.2366 - regression_loss: 1.0500 - classification_loss: 0.1866 284/500 [================>.............] - ETA: 54s - loss: 1.2375 - regression_loss: 1.0508 - classification_loss: 0.1867 285/500 [================>.............] - ETA: 54s - loss: 1.2391 - regression_loss: 1.0522 - classification_loss: 0.1869 286/500 [================>.............] - ETA: 53s - loss: 1.2368 - regression_loss: 1.0504 - classification_loss: 0.1864 287/500 [================>.............] - ETA: 53s - loss: 1.2369 - regression_loss: 1.0505 - classification_loss: 0.1864 288/500 [================>.............] - ETA: 53s - loss: 1.2346 - regression_loss: 1.0487 - classification_loss: 0.1859 289/500 [================>.............] - ETA: 53s - loss: 1.2334 - regression_loss: 1.0477 - classification_loss: 0.1857 290/500 [================>.............] - ETA: 52s - loss: 1.2339 - regression_loss: 1.0480 - classification_loss: 0.1859 291/500 [================>.............] - ETA: 52s - loss: 1.2341 - regression_loss: 1.0480 - classification_loss: 0.1862 292/500 [================>.............] 
- ETA: 52s - loss: 1.2314 - regression_loss: 1.0457 - classification_loss: 0.1857 293/500 [================>.............] - ETA: 52s - loss: 1.2331 - regression_loss: 1.0469 - classification_loss: 0.1862 294/500 [================>.............] - ETA: 51s - loss: 1.2326 - regression_loss: 1.0464 - classification_loss: 0.1862 295/500 [================>.............] - ETA: 51s - loss: 1.2320 - regression_loss: 1.0458 - classification_loss: 0.1862 296/500 [================>.............] - ETA: 51s - loss: 1.2335 - regression_loss: 1.0471 - classification_loss: 0.1864 297/500 [================>.............] - ETA: 51s - loss: 1.2341 - regression_loss: 1.0477 - classification_loss: 0.1865 298/500 [================>.............] - ETA: 50s - loss: 1.2360 - regression_loss: 1.0493 - classification_loss: 0.1867 299/500 [================>.............] - ETA: 50s - loss: 1.2360 - regression_loss: 1.0493 - classification_loss: 0.1867 300/500 [=================>............] - ETA: 50s - loss: 1.2358 - regression_loss: 1.0493 - classification_loss: 0.1865 301/500 [=================>............] - ETA: 50s - loss: 1.2371 - regression_loss: 1.0503 - classification_loss: 0.1868 302/500 [=================>............] - ETA: 49s - loss: 1.2368 - regression_loss: 1.0500 - classification_loss: 0.1868 303/500 [=================>............] - ETA: 49s - loss: 1.2371 - regression_loss: 1.0502 - classification_loss: 0.1868 304/500 [=================>............] - ETA: 49s - loss: 1.2386 - regression_loss: 1.0515 - classification_loss: 0.1871 305/500 [=================>............] - ETA: 49s - loss: 1.2372 - regression_loss: 1.0502 - classification_loss: 0.1870 306/500 [=================>............] - ETA: 48s - loss: 1.2373 - regression_loss: 1.0502 - classification_loss: 0.1871 307/500 [=================>............] - ETA: 48s - loss: 1.2375 - regression_loss: 1.0505 - classification_loss: 0.1870 308/500 [=================>............] 
- ETA: 48s - loss: 1.2373 - regression_loss: 1.0504 - classification_loss: 0.1870 309/500 [=================>............] - ETA: 48s - loss: 1.2376 - regression_loss: 1.0509 - classification_loss: 0.1867 310/500 [=================>............] - ETA: 47s - loss: 1.2359 - regression_loss: 1.0496 - classification_loss: 0.1863 311/500 [=================>............] - ETA: 47s - loss: 1.2369 - regression_loss: 1.0506 - classification_loss: 0.1863 312/500 [=================>............] - ETA: 47s - loss: 1.2386 - regression_loss: 1.0522 - classification_loss: 0.1865 313/500 [=================>............] - ETA: 47s - loss: 1.2395 - regression_loss: 1.0529 - classification_loss: 0.1867 314/500 [=================>............] - ETA: 46s - loss: 1.2384 - regression_loss: 1.0520 - classification_loss: 0.1864 315/500 [=================>............] - ETA: 46s - loss: 1.2393 - regression_loss: 1.0529 - classification_loss: 0.1865 316/500 [=================>............] - ETA: 46s - loss: 1.2397 - regression_loss: 1.0532 - classification_loss: 0.1865 317/500 [==================>...........] - ETA: 46s - loss: 1.2380 - regression_loss: 1.0518 - classification_loss: 0.1861 318/500 [==================>...........] - ETA: 45s - loss: 1.2372 - regression_loss: 1.0513 - classification_loss: 0.1859 319/500 [==================>...........] - ETA: 45s - loss: 1.2364 - regression_loss: 1.0507 - classification_loss: 0.1857 320/500 [==================>...........] - ETA: 45s - loss: 1.2375 - regression_loss: 1.0516 - classification_loss: 0.1859 321/500 [==================>...........] - ETA: 45s - loss: 1.2395 - regression_loss: 1.0532 - classification_loss: 0.1862 322/500 [==================>...........] - ETA: 44s - loss: 1.2416 - regression_loss: 1.0550 - classification_loss: 0.1867 323/500 [==================>...........] - ETA: 44s - loss: 1.2422 - regression_loss: 1.0557 - classification_loss: 0.1865 324/500 [==================>...........] 
- ETA: 44s - loss: 1.2427 - regression_loss: 1.0563 - classification_loss: 0.1864 325/500 [==================>...........] - ETA: 44s - loss: 1.2424 - regression_loss: 1.0559 - classification_loss: 0.1865 326/500 [==================>...........] - ETA: 43s - loss: 1.2405 - regression_loss: 1.0545 - classification_loss: 0.1859 327/500 [==================>...........] - ETA: 43s - loss: 1.2417 - regression_loss: 1.0555 - classification_loss: 0.1862 328/500 [==================>...........] - ETA: 43s - loss: 1.2426 - regression_loss: 1.0564 - classification_loss: 0.1862 329/500 [==================>...........] - ETA: 43s - loss: 1.2426 - regression_loss: 1.0564 - classification_loss: 0.1862 330/500 [==================>...........] - ETA: 42s - loss: 1.2429 - regression_loss: 1.0568 - classification_loss: 0.1862 331/500 [==================>...........] - ETA: 42s - loss: 1.2421 - regression_loss: 1.0561 - classification_loss: 0.1860 332/500 [==================>...........] - ETA: 42s - loss: 1.2435 - regression_loss: 1.0573 - classification_loss: 0.1862 333/500 [==================>...........] - ETA: 41s - loss: 1.2408 - regression_loss: 1.0551 - classification_loss: 0.1857 334/500 [===================>..........] - ETA: 41s - loss: 1.2414 - regression_loss: 1.0556 - classification_loss: 0.1857 335/500 [===================>..........] - ETA: 41s - loss: 1.2391 - regression_loss: 1.0537 - classification_loss: 0.1855 336/500 [===================>..........] - ETA: 41s - loss: 1.2388 - regression_loss: 1.0534 - classification_loss: 0.1854 337/500 [===================>..........] - ETA: 40s - loss: 1.2407 - regression_loss: 1.0549 - classification_loss: 0.1857 338/500 [===================>..........] - ETA: 40s - loss: 1.2412 - regression_loss: 1.0552 - classification_loss: 0.1861 339/500 [===================>..........] - ETA: 40s - loss: 1.2415 - regression_loss: 1.0554 - classification_loss: 0.1861 340/500 [===================>..........] 
[Epoch 126/150: per-batch progress output for steps 341–499 elided; the running loss held steady around 1.24–1.25 (regression_loss ≈ 1.06, classification_loss ≈ 0.186–0.189).]
500/500 [==============================] - 126s 251ms/step - loss: 1.2474 - regression_loss: 1.0607 - classification_loss: 0.1867
1172 instances of class plum with average precision: 0.6994
mAP: 0.6994
Epoch 00126: saving model to ./training/snapshots/resnet50_pascal_126.h5
Epoch 127/150
[Per-batch progress output for steps 1–14 elided; the early-epoch loss fluctuated between roughly 1.18 and 1.58 before settling near 1.34.]
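The epoch-end summary line above (the completed `500/500 [=====…]` bar with its final loss values) is the only line per epoch worth keeping from this console output. As a minimal sketch, the snippet below pulls those per-epoch totals out of a captured log; the regex is an assumption based solely on the line format visible here, and `epoch_summaries` is a hypothetical helper name, not part of keras-retinanet.

```python
import re

# Match only a COMPLETED progress bar (all '=' characters, no '>' or '.'),
# i.e. the once-per-epoch summary line, then capture the three loss values.
SUMMARY_RE = re.compile(
    r"(?P<step>\d+)/(?P<total>\d+) \[=+\]"
    r".*?loss: (?P<loss>\d+\.\d+)"
    r".*?regression_loss: (?P<reg>\d+\.\d+)"
    r".*?classification_loss: (?P<cls>\d+\.\d+)"
)

def epoch_summaries(log_text):
    """Yield (total_loss, regression_loss, classification_loss) per epoch."""
    for match in SUMMARY_RE.finditer(log_text):
        # The '=+' bar pattern already excludes partial bars; the step check
        # is a belt-and-braces guard that the epoch actually finished.
        if match.group("step") == match.group("total"):
            yield (
                float(match.group("loss")),
                float(match.group("reg")),
                float(match.group("cls")),
            )

sample = (
    "500/500 [==============================] - 126s 251ms/step "
    "- loss: 1.2474 - regression_loss: 1.0607 - classification_loss: 0.1867"
)
print(list(epoch_summaries(sample)))
```

Feeding the whole log through this yields one tuple per epoch, which is usually all that is needed to plot the training curve; intermediate `N/500` updates are skipped because their bars contain `>` and `.` characters.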
[Per-batch progress output for steps 15–174 elided; the loss rose to about 1.39 around step 33, then trended down to roughly 1.24 (regression_loss ≈ 1.05, classification_loss ≈ 0.188) by step 173.]
- ETA: 1:21 - loss: 1.2323 - regression_loss: 1.0452 - classification_loss: 0.1871 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2339 - regression_loss: 1.0463 - classification_loss: 0.1876 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2297 - regression_loss: 1.0429 - classification_loss: 0.1869 177/500 [=========>....................] - ETA: 1:20 - loss: 1.2346 - regression_loss: 1.0449 - classification_loss: 0.1897 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2377 - regression_loss: 1.0471 - classification_loss: 0.1906 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2345 - regression_loss: 1.0446 - classification_loss: 0.1900 180/500 [=========>....................] - ETA: 1:20 - loss: 1.2357 - regression_loss: 1.0453 - classification_loss: 0.1904 181/500 [=========>....................] - ETA: 1:19 - loss: 1.2355 - regression_loss: 1.0453 - classification_loss: 0.1902 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2401 - regression_loss: 1.0489 - classification_loss: 0.1912 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2403 - regression_loss: 1.0488 - classification_loss: 0.1914 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2429 - regression_loss: 1.0508 - classification_loss: 0.1921 185/500 [==========>...................] - ETA: 1:18 - loss: 1.2412 - regression_loss: 1.0495 - classification_loss: 0.1917 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2399 - regression_loss: 1.0484 - classification_loss: 0.1915 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2435 - regression_loss: 1.0515 - classification_loss: 0.1921 188/500 [==========>...................] - ETA: 1:17 - loss: 1.2427 - regression_loss: 1.0505 - classification_loss: 0.1922 189/500 [==========>...................] - ETA: 1:17 - loss: 1.2430 - regression_loss: 1.0506 - classification_loss: 0.1924 190/500 [==========>...................] 
- ETA: 1:17 - loss: 1.2427 - regression_loss: 1.0502 - classification_loss: 0.1925 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2442 - regression_loss: 1.0513 - classification_loss: 0.1930 192/500 [==========>...................] - ETA: 1:16 - loss: 1.2448 - regression_loss: 1.0519 - classification_loss: 0.1929 193/500 [==========>...................] - ETA: 1:16 - loss: 1.2453 - regression_loss: 1.0523 - classification_loss: 0.1930 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2464 - regression_loss: 1.0532 - classification_loss: 0.1932 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2474 - regression_loss: 1.0541 - classification_loss: 0.1933 196/500 [==========>...................] - ETA: 1:15 - loss: 1.2528 - regression_loss: 1.0572 - classification_loss: 0.1955 197/500 [==========>...................] - ETA: 1:15 - loss: 1.2506 - regression_loss: 1.0555 - classification_loss: 0.1951 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2507 - regression_loss: 1.0558 - classification_loss: 0.1949 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2500 - regression_loss: 1.0554 - classification_loss: 0.1945 200/500 [===========>..................] - ETA: 1:14 - loss: 1.2496 - regression_loss: 1.0551 - classification_loss: 0.1944 201/500 [===========>..................] - ETA: 1:14 - loss: 1.2472 - regression_loss: 1.0534 - classification_loss: 0.1938 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2458 - regression_loss: 1.0526 - classification_loss: 0.1933 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2458 - regression_loss: 1.0527 - classification_loss: 0.1931 204/500 [===========>..................] - ETA: 1:13 - loss: 1.2471 - regression_loss: 1.0536 - classification_loss: 0.1935 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2456 - regression_loss: 1.0523 - classification_loss: 0.1933 206/500 [===========>..................] 
- ETA: 1:13 - loss: 1.2425 - regression_loss: 1.0497 - classification_loss: 0.1928 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2442 - regression_loss: 1.0513 - classification_loss: 0.1929 208/500 [===========>..................] - ETA: 1:12 - loss: 1.2482 - regression_loss: 1.0546 - classification_loss: 0.1935 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2526 - regression_loss: 1.0583 - classification_loss: 0.1944 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2493 - regression_loss: 1.0556 - classification_loss: 0.1937 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2464 - regression_loss: 1.0531 - classification_loss: 0.1933 212/500 [===========>..................] - ETA: 1:11 - loss: 1.2487 - regression_loss: 1.0550 - classification_loss: 0.1936 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2504 - regression_loss: 1.0566 - classification_loss: 0.1938 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2523 - regression_loss: 1.0583 - classification_loss: 0.1940 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2503 - regression_loss: 1.0564 - classification_loss: 0.1939 216/500 [===========>..................] - ETA: 1:10 - loss: 1.2500 - regression_loss: 1.0563 - classification_loss: 0.1937 217/500 [============>.................] - ETA: 1:10 - loss: 1.2508 - regression_loss: 1.0568 - classification_loss: 0.1940 218/500 [============>.................] - ETA: 1:10 - loss: 1.2523 - regression_loss: 1.0578 - classification_loss: 0.1945 219/500 [============>.................] - ETA: 1:10 - loss: 1.2514 - regression_loss: 1.0571 - classification_loss: 0.1943 220/500 [============>.................] - ETA: 1:09 - loss: 1.2513 - regression_loss: 1.0570 - classification_loss: 0.1943 221/500 [============>.................] - ETA: 1:09 - loss: 1.2526 - regression_loss: 1.0581 - classification_loss: 0.1945 222/500 [============>.................] 
- ETA: 1:09 - loss: 1.2535 - regression_loss: 1.0590 - classification_loss: 0.1945 223/500 [============>.................] - ETA: 1:09 - loss: 1.2548 - regression_loss: 1.0601 - classification_loss: 0.1947 224/500 [============>.................] - ETA: 1:08 - loss: 1.2539 - regression_loss: 1.0596 - classification_loss: 0.1943 225/500 [============>.................] - ETA: 1:08 - loss: 1.2538 - regression_loss: 1.0598 - classification_loss: 0.1941 226/500 [============>.................] - ETA: 1:08 - loss: 1.2543 - regression_loss: 1.0600 - classification_loss: 0.1943 227/500 [============>.................] - ETA: 1:08 - loss: 1.2542 - regression_loss: 1.0600 - classification_loss: 0.1942 228/500 [============>.................] - ETA: 1:07 - loss: 1.2510 - regression_loss: 1.0574 - classification_loss: 0.1936 229/500 [============>.................] - ETA: 1:07 - loss: 1.2531 - regression_loss: 1.0597 - classification_loss: 0.1934 230/500 [============>.................] - ETA: 1:07 - loss: 1.2537 - regression_loss: 1.0602 - classification_loss: 0.1934 231/500 [============>.................] - ETA: 1:07 - loss: 1.2565 - regression_loss: 1.0619 - classification_loss: 0.1946 232/500 [============>.................] - ETA: 1:06 - loss: 1.2572 - regression_loss: 1.0625 - classification_loss: 0.1947 233/500 [============>.................] - ETA: 1:06 - loss: 1.2557 - regression_loss: 1.0614 - classification_loss: 0.1943 234/500 [=============>................] - ETA: 1:06 - loss: 1.2566 - regression_loss: 1.0620 - classification_loss: 0.1945 235/500 [=============>................] - ETA: 1:06 - loss: 1.2573 - regression_loss: 1.0628 - classification_loss: 0.1946 236/500 [=============>................] - ETA: 1:05 - loss: 1.2583 - regression_loss: 1.0633 - classification_loss: 0.1949 237/500 [=============>................] - ETA: 1:05 - loss: 1.2585 - regression_loss: 1.0638 - classification_loss: 0.1947 238/500 [=============>................] 
- ETA: 1:05 - loss: 1.2566 - regression_loss: 1.0623 - classification_loss: 0.1943 239/500 [=============>................] - ETA: 1:05 - loss: 1.2578 - regression_loss: 1.0632 - classification_loss: 0.1945 240/500 [=============>................] - ETA: 1:04 - loss: 1.2584 - regression_loss: 1.0638 - classification_loss: 0.1946 241/500 [=============>................] - ETA: 1:04 - loss: 1.2582 - regression_loss: 1.0636 - classification_loss: 0.1946 242/500 [=============>................] - ETA: 1:04 - loss: 1.2572 - regression_loss: 1.0626 - classification_loss: 0.1946 243/500 [=============>................] - ETA: 1:04 - loss: 1.2580 - regression_loss: 1.0631 - classification_loss: 0.1948 244/500 [=============>................] - ETA: 1:03 - loss: 1.2597 - regression_loss: 1.0647 - classification_loss: 0.1950 245/500 [=============>................] - ETA: 1:03 - loss: 1.2593 - regression_loss: 1.0642 - classification_loss: 0.1951 246/500 [=============>................] - ETA: 1:03 - loss: 1.2620 - regression_loss: 1.0664 - classification_loss: 0.1956 247/500 [=============>................] - ETA: 1:03 - loss: 1.2629 - regression_loss: 1.0674 - classification_loss: 0.1956 248/500 [=============>................] - ETA: 1:02 - loss: 1.2631 - regression_loss: 1.0675 - classification_loss: 0.1956 249/500 [=============>................] - ETA: 1:02 - loss: 1.2635 - regression_loss: 1.0681 - classification_loss: 0.1954 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2644 - regression_loss: 1.0687 - classification_loss: 0.1956 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2628 - regression_loss: 1.0675 - classification_loss: 0.1953 252/500 [==============>...............] - ETA: 1:01 - loss: 1.2629 - regression_loss: 1.0677 - classification_loss: 0.1952 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2616 - regression_loss: 1.0667 - classification_loss: 0.1950 254/500 [==============>...............] 
- ETA: 1:01 - loss: 1.2628 - regression_loss: 1.0678 - classification_loss: 0.1951 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2629 - regression_loss: 1.0678 - classification_loss: 0.1951 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2601 - regression_loss: 1.0655 - classification_loss: 0.1946 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2621 - regression_loss: 1.0671 - classification_loss: 0.1950 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2635 - regression_loss: 1.0683 - classification_loss: 0.1952 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2627 - regression_loss: 1.0677 - classification_loss: 0.1950 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2633 - regression_loss: 1.0683 - classification_loss: 0.1950 261/500 [==============>...............] - ETA: 59s - loss: 1.2648 - regression_loss: 1.0696 - classification_loss: 0.1952  262/500 [==============>...............] - ETA: 59s - loss: 1.2626 - regression_loss: 1.0678 - classification_loss: 0.1948 263/500 [==============>...............] - ETA: 59s - loss: 1.2633 - regression_loss: 1.0685 - classification_loss: 0.1948 264/500 [==============>...............] - ETA: 59s - loss: 1.2625 - regression_loss: 1.0679 - classification_loss: 0.1946 265/500 [==============>...............] - ETA: 58s - loss: 1.2622 - regression_loss: 1.0678 - classification_loss: 0.1943 266/500 [==============>...............] - ETA: 58s - loss: 1.2625 - regression_loss: 1.0681 - classification_loss: 0.1944 267/500 [===============>..............] - ETA: 58s - loss: 1.2646 - regression_loss: 1.0698 - classification_loss: 0.1948 268/500 [===============>..............] - ETA: 58s - loss: 1.2671 - regression_loss: 1.0717 - classification_loss: 0.1954 269/500 [===============>..............] - ETA: 57s - loss: 1.2674 - regression_loss: 1.0721 - classification_loss: 0.1954 270/500 [===============>..............] 
- ETA: 57s - loss: 1.2679 - regression_loss: 1.0726 - classification_loss: 0.1953 271/500 [===============>..............] - ETA: 57s - loss: 1.2691 - regression_loss: 1.0735 - classification_loss: 0.1956 272/500 [===============>..............] - ETA: 57s - loss: 1.2713 - regression_loss: 1.0753 - classification_loss: 0.1960 273/500 [===============>..............] - ETA: 56s - loss: 1.2681 - regression_loss: 1.0727 - classification_loss: 0.1954 274/500 [===============>..............] - ETA: 56s - loss: 1.2676 - regression_loss: 1.0722 - classification_loss: 0.1954 275/500 [===============>..............] - ETA: 56s - loss: 1.2668 - regression_loss: 1.0716 - classification_loss: 0.1952 276/500 [===============>..............] - ETA: 56s - loss: 1.2670 - regression_loss: 1.0718 - classification_loss: 0.1952 277/500 [===============>..............] - ETA: 55s - loss: 1.2687 - regression_loss: 1.0733 - classification_loss: 0.1954 278/500 [===============>..............] - ETA: 55s - loss: 1.2663 - regression_loss: 1.0713 - classification_loss: 0.1950 279/500 [===============>..............] - ETA: 55s - loss: 1.2683 - regression_loss: 1.0728 - classification_loss: 0.1956 280/500 [===============>..............] - ETA: 55s - loss: 1.2658 - regression_loss: 1.0708 - classification_loss: 0.1950 281/500 [===============>..............] - ETA: 54s - loss: 1.2668 - regression_loss: 1.0722 - classification_loss: 0.1946 282/500 [===============>..............] - ETA: 54s - loss: 1.2672 - regression_loss: 1.0727 - classification_loss: 0.1945 283/500 [===============>..............] - ETA: 54s - loss: 1.2660 - regression_loss: 1.0716 - classification_loss: 0.1944 284/500 [================>.............] - ETA: 54s - loss: 1.2641 - regression_loss: 1.0699 - classification_loss: 0.1943 285/500 [================>.............] - ETA: 53s - loss: 1.2646 - regression_loss: 1.0703 - classification_loss: 0.1944 286/500 [================>.............] 
- ETA: 53s - loss: 1.2661 - regression_loss: 1.0714 - classification_loss: 0.1947 287/500 [================>.............] - ETA: 53s - loss: 1.2677 - regression_loss: 1.0727 - classification_loss: 0.1950 288/500 [================>.............] - ETA: 53s - loss: 1.2678 - regression_loss: 1.0728 - classification_loss: 0.1950 289/500 [================>.............] - ETA: 52s - loss: 1.2670 - regression_loss: 1.0722 - classification_loss: 0.1947 290/500 [================>.............] - ETA: 52s - loss: 1.2679 - regression_loss: 1.0730 - classification_loss: 0.1949 291/500 [================>.............] - ETA: 52s - loss: 1.2647 - regression_loss: 1.0703 - classification_loss: 0.1944 292/500 [================>.............] - ETA: 52s - loss: 1.2663 - regression_loss: 1.0717 - classification_loss: 0.1947 293/500 [================>.............] - ETA: 51s - loss: 1.2690 - regression_loss: 1.0737 - classification_loss: 0.1953 294/500 [================>.............] - ETA: 51s - loss: 1.2674 - regression_loss: 1.0723 - classification_loss: 0.1951 295/500 [================>.............] - ETA: 51s - loss: 1.2673 - regression_loss: 1.0721 - classification_loss: 0.1951 296/500 [================>.............] - ETA: 51s - loss: 1.2657 - regression_loss: 1.0709 - classification_loss: 0.1948 297/500 [================>.............] - ETA: 50s - loss: 1.2636 - regression_loss: 1.0691 - classification_loss: 0.1946 298/500 [================>.............] - ETA: 50s - loss: 1.2665 - regression_loss: 1.0711 - classification_loss: 0.1954 299/500 [================>.............] - ETA: 50s - loss: 1.2652 - regression_loss: 1.0699 - classification_loss: 0.1953 300/500 [=================>............] - ETA: 50s - loss: 1.2628 - regression_loss: 1.0678 - classification_loss: 0.1950 301/500 [=================>............] - ETA: 49s - loss: 1.2644 - regression_loss: 1.0689 - classification_loss: 0.1955 302/500 [=================>............] 
- ETA: 49s - loss: 1.2621 - regression_loss: 1.0671 - classification_loss: 0.1950 303/500 [=================>............] - ETA: 49s - loss: 1.2658 - regression_loss: 1.0697 - classification_loss: 0.1961 304/500 [=================>............] - ETA: 49s - loss: 1.2660 - regression_loss: 1.0698 - classification_loss: 0.1962 305/500 [=================>............] - ETA: 48s - loss: 1.2659 - regression_loss: 1.0696 - classification_loss: 0.1962 306/500 [=================>............] - ETA: 48s - loss: 1.2660 - regression_loss: 1.0697 - classification_loss: 0.1962 307/500 [=================>............] - ETA: 48s - loss: 1.2650 - regression_loss: 1.0690 - classification_loss: 0.1960 308/500 [=================>............] - ETA: 48s - loss: 1.2630 - regression_loss: 1.0674 - classification_loss: 0.1956 309/500 [=================>............] - ETA: 47s - loss: 1.2635 - regression_loss: 1.0677 - classification_loss: 0.1957 310/500 [=================>............] - ETA: 47s - loss: 1.2630 - regression_loss: 1.0674 - classification_loss: 0.1956 311/500 [=================>............] - ETA: 47s - loss: 1.2618 - regression_loss: 1.0664 - classification_loss: 0.1954 312/500 [=================>............] - ETA: 47s - loss: 1.2605 - regression_loss: 1.0655 - classification_loss: 0.1949 313/500 [=================>............] - ETA: 46s - loss: 1.2612 - regression_loss: 1.0662 - classification_loss: 0.1950 314/500 [=================>............] - ETA: 46s - loss: 1.2620 - regression_loss: 1.0669 - classification_loss: 0.1951 315/500 [=================>............] - ETA: 46s - loss: 1.2600 - regression_loss: 1.0653 - classification_loss: 0.1947 316/500 [=================>............] - ETA: 46s - loss: 1.2594 - regression_loss: 1.0649 - classification_loss: 0.1945 317/500 [==================>...........] - ETA: 45s - loss: 1.2590 - regression_loss: 1.0646 - classification_loss: 0.1944 318/500 [==================>...........] 
- ETA: 45s - loss: 1.2577 - regression_loss: 1.0635 - classification_loss: 0.1942 319/500 [==================>...........] - ETA: 45s - loss: 1.2582 - regression_loss: 1.0641 - classification_loss: 0.1941 320/500 [==================>...........] - ETA: 45s - loss: 1.2579 - regression_loss: 1.0637 - classification_loss: 0.1941 321/500 [==================>...........] - ETA: 44s - loss: 1.2563 - regression_loss: 1.0626 - classification_loss: 0.1937 322/500 [==================>...........] - ETA: 44s - loss: 1.2562 - regression_loss: 1.0626 - classification_loss: 0.1936 323/500 [==================>...........] - ETA: 44s - loss: 1.2533 - regression_loss: 1.0603 - classification_loss: 0.1931 324/500 [==================>...........] - ETA: 44s - loss: 1.2514 - regression_loss: 1.0586 - classification_loss: 0.1928 325/500 [==================>...........] - ETA: 43s - loss: 1.2507 - regression_loss: 1.0581 - classification_loss: 0.1926 326/500 [==================>...........] - ETA: 43s - loss: 1.2519 - regression_loss: 1.0591 - classification_loss: 0.1928 327/500 [==================>...........] - ETA: 43s - loss: 1.2503 - regression_loss: 1.0579 - classification_loss: 0.1924 328/500 [==================>...........] - ETA: 43s - loss: 1.2495 - regression_loss: 1.0573 - classification_loss: 0.1922 329/500 [==================>...........] - ETA: 42s - loss: 1.2472 - regression_loss: 1.0555 - classification_loss: 0.1918 330/500 [==================>...........] - ETA: 42s - loss: 1.2509 - regression_loss: 1.0586 - classification_loss: 0.1923 331/500 [==================>...........] - ETA: 42s - loss: 1.2515 - regression_loss: 1.0591 - classification_loss: 0.1924 332/500 [==================>...........] - ETA: 42s - loss: 1.2526 - regression_loss: 1.0600 - classification_loss: 0.1925 333/500 [==================>...........] - ETA: 41s - loss: 1.2529 - regression_loss: 1.0600 - classification_loss: 0.1928 334/500 [===================>..........] 
- ETA: 41s - loss: 1.2534 - regression_loss: 1.0603 - classification_loss: 0.1931 335/500 [===================>..........] - ETA: 41s - loss: 1.2529 - regression_loss: 1.0598 - classification_loss: 0.1931 336/500 [===================>..........] - ETA: 41s - loss: 1.2542 - regression_loss: 1.0609 - classification_loss: 0.1933 337/500 [===================>..........] - ETA: 40s - loss: 1.2543 - regression_loss: 1.0611 - classification_loss: 0.1932 338/500 [===================>..........] - ETA: 40s - loss: 1.2538 - regression_loss: 1.0607 - classification_loss: 0.1931 339/500 [===================>..........] - ETA: 40s - loss: 1.2545 - regression_loss: 1.0613 - classification_loss: 0.1932 340/500 [===================>..........] - ETA: 40s - loss: 1.2538 - regression_loss: 1.0606 - classification_loss: 0.1932 341/500 [===================>..........] - ETA: 39s - loss: 1.2558 - regression_loss: 1.0623 - classification_loss: 0.1935 342/500 [===================>..........] - ETA: 39s - loss: 1.2543 - regression_loss: 1.0610 - classification_loss: 0.1933 343/500 [===================>..........] - ETA: 39s - loss: 1.2520 - regression_loss: 1.0591 - classification_loss: 0.1929 344/500 [===================>..........] - ETA: 39s - loss: 1.2527 - regression_loss: 1.0595 - classification_loss: 0.1932 345/500 [===================>..........] - ETA: 38s - loss: 1.2508 - regression_loss: 1.0580 - classification_loss: 0.1928 346/500 [===================>..........] - ETA: 38s - loss: 1.2486 - regression_loss: 1.0562 - classification_loss: 0.1923 347/500 [===================>..........] - ETA: 38s - loss: 1.2509 - regression_loss: 1.0575 - classification_loss: 0.1934 348/500 [===================>..........] - ETA: 38s - loss: 1.2488 - regression_loss: 1.0557 - classification_loss: 0.1931 349/500 [===================>..........] - ETA: 37s - loss: 1.2487 - regression_loss: 1.0555 - classification_loss: 0.1932 350/500 [====================>.........] 
- ETA: 37s - loss: 1.2497 - regression_loss: 1.0563 - classification_loss: 0.1934 351/500 [====================>.........] - ETA: 37s - loss: 1.2495 - regression_loss: 1.0561 - classification_loss: 0.1934 352/500 [====================>.........] - ETA: 37s - loss: 1.2488 - regression_loss: 1.0555 - classification_loss: 0.1933 353/500 [====================>.........] - ETA: 36s - loss: 1.2488 - regression_loss: 1.0555 - classification_loss: 0.1932 354/500 [====================>.........] - ETA: 36s - loss: 1.2493 - regression_loss: 1.0560 - classification_loss: 0.1933 355/500 [====================>.........] - ETA: 36s - loss: 1.2510 - regression_loss: 1.0574 - classification_loss: 0.1935 356/500 [====================>.........] - ETA: 36s - loss: 1.2522 - regression_loss: 1.0585 - classification_loss: 0.1937 357/500 [====================>.........] - ETA: 35s - loss: 1.2510 - regression_loss: 1.0576 - classification_loss: 0.1934 358/500 [====================>.........] - ETA: 35s - loss: 1.2505 - regression_loss: 1.0572 - classification_loss: 0.1933 359/500 [====================>.........] - ETA: 35s - loss: 1.2533 - regression_loss: 1.0593 - classification_loss: 0.1939 360/500 [====================>.........] - ETA: 35s - loss: 1.2537 - regression_loss: 1.0597 - classification_loss: 0.1940 361/500 [====================>.........] - ETA: 34s - loss: 1.2514 - regression_loss: 1.0579 - classification_loss: 0.1935 362/500 [====================>.........] - ETA: 34s - loss: 1.2528 - regression_loss: 1.0589 - classification_loss: 0.1939 363/500 [====================>.........] - ETA: 34s - loss: 1.2557 - regression_loss: 1.0613 - classification_loss: 0.1945 364/500 [====================>.........] - ETA: 34s - loss: 1.2558 - regression_loss: 1.0613 - classification_loss: 0.1946 365/500 [====================>.........] - ETA: 33s - loss: 1.2569 - regression_loss: 1.0620 - classification_loss: 0.1949 366/500 [====================>.........] 
- ETA: 33s - loss: 1.2553 - regression_loss: 1.0607 - classification_loss: 0.1946 367/500 [=====================>........] - ETA: 33s - loss: 1.2536 - regression_loss: 1.0593 - classification_loss: 0.1943 368/500 [=====================>........] - ETA: 33s - loss: 1.2546 - regression_loss: 1.0601 - classification_loss: 0.1944 369/500 [=====================>........] - ETA: 32s - loss: 1.2528 - regression_loss: 1.0587 - classification_loss: 0.1941 370/500 [=====================>........] - ETA: 32s - loss: 1.2527 - regression_loss: 1.0588 - classification_loss: 0.1939 371/500 [=====================>........] - ETA: 32s - loss: 1.2530 - regression_loss: 1.0591 - classification_loss: 0.1939 372/500 [=====================>........] - ETA: 32s - loss: 1.2534 - regression_loss: 1.0595 - classification_loss: 0.1939 373/500 [=====================>........] - ETA: 31s - loss: 1.2541 - regression_loss: 1.0601 - classification_loss: 0.1940 374/500 [=====================>........] - ETA: 31s - loss: 1.2529 - regression_loss: 1.0594 - classification_loss: 0.1935 375/500 [=====================>........] - ETA: 31s - loss: 1.2516 - regression_loss: 1.0584 - classification_loss: 0.1932 376/500 [=====================>........] - ETA: 31s - loss: 1.2525 - regression_loss: 1.0591 - classification_loss: 0.1934 377/500 [=====================>........] - ETA: 30s - loss: 1.2532 - regression_loss: 1.0597 - classification_loss: 0.1935 378/500 [=====================>........] - ETA: 30s - loss: 1.2544 - regression_loss: 1.0607 - classification_loss: 0.1937 379/500 [=====================>........] - ETA: 30s - loss: 1.2551 - regression_loss: 1.0612 - classification_loss: 0.1939 380/500 [=====================>........] - ETA: 30s - loss: 1.2542 - regression_loss: 1.0605 - classification_loss: 0.1937 381/500 [=====================>........] - ETA: 29s - loss: 1.2525 - regression_loss: 1.0591 - classification_loss: 0.1933 382/500 [=====================>........] 
... [per-batch progress updates for steps 383-499 of epoch 127 elided] ...
500/500 [==============================] - 125s 251ms/step - loss: 1.2648 - regression_loss: 1.0686 - classification_loss: 0.1962
1172 instances of class plum with average precision: 0.7037
mAP: 0.7037
Epoch 00127: saving model to ./training/snapshots/resnet50_pascal_127.h5
Epoch 128/150
... [per-batch progress updates for steps 1-217 of epoch 128 elided] ...
- ETA: 1:10 - loss: 1.2271 - regression_loss: 1.0432 - classification_loss: 0.1839 218/500 [============>.................] - ETA: 1:10 - loss: 1.2290 - regression_loss: 1.0449 - classification_loss: 0.1841 219/500 [============>.................] - ETA: 1:10 - loss: 1.2256 - regression_loss: 1.0419 - classification_loss: 0.1837 220/500 [============>.................] - ETA: 1:10 - loss: 1.2209 - regression_loss: 1.0380 - classification_loss: 0.1829 221/500 [============>.................] - ETA: 1:09 - loss: 1.2194 - regression_loss: 1.0364 - classification_loss: 0.1830 222/500 [============>.................] - ETA: 1:09 - loss: 1.2227 - regression_loss: 1.0390 - classification_loss: 0.1837 223/500 [============>.................] - ETA: 1:09 - loss: 1.2241 - regression_loss: 1.0403 - classification_loss: 0.1838 224/500 [============>.................] - ETA: 1:09 - loss: 1.2281 - regression_loss: 1.0438 - classification_loss: 0.1844 225/500 [============>.................] - ETA: 1:08 - loss: 1.2262 - regression_loss: 1.0424 - classification_loss: 0.1838 226/500 [============>.................] - ETA: 1:08 - loss: 1.2279 - regression_loss: 1.0440 - classification_loss: 0.1840 227/500 [============>.................] - ETA: 1:08 - loss: 1.2255 - regression_loss: 1.0420 - classification_loss: 0.1835 228/500 [============>.................] - ETA: 1:07 - loss: 1.2250 - regression_loss: 1.0415 - classification_loss: 0.1835 229/500 [============>.................] - ETA: 1:07 - loss: 1.2247 - regression_loss: 1.0411 - classification_loss: 0.1836 230/500 [============>.................] - ETA: 1:07 - loss: 1.2247 - regression_loss: 1.0411 - classification_loss: 0.1836 231/500 [============>.................] - ETA: 1:07 - loss: 1.2261 - regression_loss: 1.0422 - classification_loss: 0.1838 232/500 [============>.................] - ETA: 1:06 - loss: 1.2252 - regression_loss: 1.0416 - classification_loss: 0.1836 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.2272 - regression_loss: 1.0433 - classification_loss: 0.1839 234/500 [=============>................] - ETA: 1:06 - loss: 1.2292 - regression_loss: 1.0449 - classification_loss: 0.1843 235/500 [=============>................] - ETA: 1:06 - loss: 1.2288 - regression_loss: 1.0445 - classification_loss: 0.1842 236/500 [=============>................] - ETA: 1:06 - loss: 1.2335 - regression_loss: 1.0488 - classification_loss: 0.1847 237/500 [=============>................] - ETA: 1:05 - loss: 1.2313 - regression_loss: 1.0470 - classification_loss: 0.1844 238/500 [=============>................] - ETA: 1:05 - loss: 1.2303 - regression_loss: 1.0461 - classification_loss: 0.1842 239/500 [=============>................] - ETA: 1:05 - loss: 1.2269 - regression_loss: 1.0431 - classification_loss: 0.1838 240/500 [=============>................] - ETA: 1:05 - loss: 1.2262 - regression_loss: 1.0426 - classification_loss: 0.1836 241/500 [=============>................] - ETA: 1:04 - loss: 1.2284 - regression_loss: 1.0444 - classification_loss: 0.1840 242/500 [=============>................] - ETA: 1:04 - loss: 1.2298 - regression_loss: 1.0456 - classification_loss: 0.1842 243/500 [=============>................] - ETA: 1:04 - loss: 1.2268 - regression_loss: 1.0431 - classification_loss: 0.1837 244/500 [=============>................] - ETA: 1:04 - loss: 1.2278 - regression_loss: 1.0440 - classification_loss: 0.1838 245/500 [=============>................] - ETA: 1:03 - loss: 1.2277 - regression_loss: 1.0440 - classification_loss: 0.1837 246/500 [=============>................] - ETA: 1:03 - loss: 1.2301 - regression_loss: 1.0463 - classification_loss: 0.1838 247/500 [=============>................] - ETA: 1:03 - loss: 1.2292 - regression_loss: 1.0455 - classification_loss: 0.1838 248/500 [=============>................] - ETA: 1:03 - loss: 1.2306 - regression_loss: 1.0467 - classification_loss: 0.1839 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.2337 - regression_loss: 1.0493 - classification_loss: 0.1844 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2339 - regression_loss: 1.0495 - classification_loss: 0.1843 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2308 - regression_loss: 1.0471 - classification_loss: 0.1837 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2308 - regression_loss: 1.0472 - classification_loss: 0.1836 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2321 - regression_loss: 1.0482 - classification_loss: 0.1839 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2317 - regression_loss: 1.0476 - classification_loss: 0.1840 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2338 - regression_loss: 1.0495 - classification_loss: 0.1844 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2323 - regression_loss: 1.0482 - classification_loss: 0.1841 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2291 - regression_loss: 1.0456 - classification_loss: 0.1835 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2306 - regression_loss: 1.0469 - classification_loss: 0.1837 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2296 - regression_loss: 1.0461 - classification_loss: 0.1835 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2300 - regression_loss: 1.0464 - classification_loss: 0.1836 261/500 [==============>...............] - ETA: 59s - loss: 1.2294 - regression_loss: 1.0460 - classification_loss: 0.1834  262/500 [==============>...............] - ETA: 59s - loss: 1.2281 - regression_loss: 1.0452 - classification_loss: 0.1829 263/500 [==============>...............] - ETA: 59s - loss: 1.2284 - regression_loss: 1.0447 - classification_loss: 0.1837 264/500 [==============>...............] - ETA: 59s - loss: 1.2295 - regression_loss: 1.0456 - classification_loss: 0.1839 265/500 [==============>...............] 
- ETA: 58s - loss: 1.2282 - regression_loss: 1.0445 - classification_loss: 0.1837 266/500 [==============>...............] - ETA: 58s - loss: 1.2261 - regression_loss: 1.0427 - classification_loss: 0.1834 267/500 [===============>..............] - ETA: 58s - loss: 1.2289 - regression_loss: 1.0449 - classification_loss: 0.1840 268/500 [===============>..............] - ETA: 58s - loss: 1.2323 - regression_loss: 1.0475 - classification_loss: 0.1847 269/500 [===============>..............] - ETA: 57s - loss: 1.2329 - regression_loss: 1.0482 - classification_loss: 0.1847 270/500 [===============>..............] - ETA: 57s - loss: 1.2359 - regression_loss: 1.0505 - classification_loss: 0.1854 271/500 [===============>..............] - ETA: 57s - loss: 1.2327 - regression_loss: 1.0478 - classification_loss: 0.1849 272/500 [===============>..............] - ETA: 57s - loss: 1.2322 - regression_loss: 1.0475 - classification_loss: 0.1847 273/500 [===============>..............] - ETA: 56s - loss: 1.2335 - regression_loss: 1.0486 - classification_loss: 0.1849 274/500 [===============>..............] - ETA: 56s - loss: 1.2316 - regression_loss: 1.0471 - classification_loss: 0.1845 275/500 [===============>..............] - ETA: 56s - loss: 1.2285 - regression_loss: 1.0445 - classification_loss: 0.1841 276/500 [===============>..............] - ETA: 56s - loss: 1.2279 - regression_loss: 1.0438 - classification_loss: 0.1842 277/500 [===============>..............] - ETA: 55s - loss: 1.2271 - regression_loss: 1.0431 - classification_loss: 0.1840 278/500 [===============>..............] - ETA: 55s - loss: 1.2262 - regression_loss: 1.0422 - classification_loss: 0.1840 279/500 [===============>..............] - ETA: 55s - loss: 1.2277 - regression_loss: 1.0433 - classification_loss: 0.1844 280/500 [===============>..............] - ETA: 55s - loss: 1.2292 - regression_loss: 1.0445 - classification_loss: 0.1847 281/500 [===============>..............] 
- ETA: 54s - loss: 1.2294 - regression_loss: 1.0448 - classification_loss: 0.1846 282/500 [===============>..............] - ETA: 54s - loss: 1.2304 - regression_loss: 1.0457 - classification_loss: 0.1847 283/500 [===============>..............] - ETA: 54s - loss: 1.2323 - regression_loss: 1.0473 - classification_loss: 0.1850 284/500 [================>.............] - ETA: 54s - loss: 1.2326 - regression_loss: 1.0474 - classification_loss: 0.1852 285/500 [================>.............] - ETA: 53s - loss: 1.2364 - regression_loss: 1.0506 - classification_loss: 0.1859 286/500 [================>.............] - ETA: 53s - loss: 1.2373 - regression_loss: 1.0512 - classification_loss: 0.1862 287/500 [================>.............] - ETA: 53s - loss: 1.2386 - regression_loss: 1.0514 - classification_loss: 0.1872 288/500 [================>.............] - ETA: 53s - loss: 1.2388 - regression_loss: 1.0514 - classification_loss: 0.1874 289/500 [================>.............] - ETA: 52s - loss: 1.2402 - regression_loss: 1.0527 - classification_loss: 0.1876 290/500 [================>.............] - ETA: 52s - loss: 1.2414 - regression_loss: 1.0536 - classification_loss: 0.1877 291/500 [================>.............] - ETA: 52s - loss: 1.2392 - regression_loss: 1.0519 - classification_loss: 0.1874 292/500 [================>.............] - ETA: 52s - loss: 1.2396 - regression_loss: 1.0522 - classification_loss: 0.1874 293/500 [================>.............] - ETA: 51s - loss: 1.2408 - regression_loss: 1.0531 - classification_loss: 0.1877 294/500 [================>.............] - ETA: 51s - loss: 1.2380 - regression_loss: 1.0508 - classification_loss: 0.1872 295/500 [================>.............] - ETA: 51s - loss: 1.2367 - regression_loss: 1.0497 - classification_loss: 0.1870 296/500 [================>.............] - ETA: 51s - loss: 1.2359 - regression_loss: 1.0490 - classification_loss: 0.1868 297/500 [================>.............] 
- ETA: 50s - loss: 1.2343 - regression_loss: 1.0478 - classification_loss: 0.1865 298/500 [================>.............] - ETA: 50s - loss: 1.2331 - regression_loss: 1.0468 - classification_loss: 0.1863 299/500 [================>.............] - ETA: 50s - loss: 1.2337 - regression_loss: 1.0472 - classification_loss: 0.1865 300/500 [=================>............] - ETA: 50s - loss: 1.2351 - regression_loss: 1.0485 - classification_loss: 0.1867 301/500 [=================>............] - ETA: 49s - loss: 1.2371 - regression_loss: 1.0501 - classification_loss: 0.1870 302/500 [=================>............] - ETA: 49s - loss: 1.2375 - regression_loss: 1.0506 - classification_loss: 0.1869 303/500 [=================>............] - ETA: 49s - loss: 1.2390 - regression_loss: 1.0518 - classification_loss: 0.1872 304/500 [=================>............] - ETA: 49s - loss: 1.2397 - regression_loss: 1.0523 - classification_loss: 0.1874 305/500 [=================>............] - ETA: 48s - loss: 1.2402 - regression_loss: 1.0526 - classification_loss: 0.1875 306/500 [=================>............] - ETA: 48s - loss: 1.2403 - regression_loss: 1.0527 - classification_loss: 0.1876 307/500 [=================>............] - ETA: 48s - loss: 1.2398 - regression_loss: 1.0523 - classification_loss: 0.1875 308/500 [=================>............] - ETA: 48s - loss: 1.2411 - regression_loss: 1.0531 - classification_loss: 0.1879 309/500 [=================>............] - ETA: 47s - loss: 1.2402 - regression_loss: 1.0524 - classification_loss: 0.1878 310/500 [=================>............] - ETA: 47s - loss: 1.2380 - regression_loss: 1.0505 - classification_loss: 0.1875 311/500 [=================>............] - ETA: 47s - loss: 1.2400 - regression_loss: 1.0521 - classification_loss: 0.1879 312/500 [=================>............] - ETA: 47s - loss: 1.2403 - regression_loss: 1.0523 - classification_loss: 0.1880 313/500 [=================>............] 
- ETA: 46s - loss: 1.2420 - regression_loss: 1.0534 - classification_loss: 0.1886 314/500 [=================>............] - ETA: 46s - loss: 1.2452 - regression_loss: 1.0563 - classification_loss: 0.1888 315/500 [=================>............] - ETA: 46s - loss: 1.2455 - regression_loss: 1.0567 - classification_loss: 0.1888 316/500 [=================>............] - ETA: 46s - loss: 1.2458 - regression_loss: 1.0569 - classification_loss: 0.1888 317/500 [==================>...........] - ETA: 45s - loss: 1.2478 - regression_loss: 1.0585 - classification_loss: 0.1892 318/500 [==================>...........] - ETA: 45s - loss: 1.2460 - regression_loss: 1.0571 - classification_loss: 0.1889 319/500 [==================>...........] - ETA: 45s - loss: 1.2460 - regression_loss: 1.0570 - classification_loss: 0.1890 320/500 [==================>...........] - ETA: 45s - loss: 1.2469 - regression_loss: 1.0578 - classification_loss: 0.1891 321/500 [==================>...........] - ETA: 44s - loss: 1.2462 - regression_loss: 1.0570 - classification_loss: 0.1892 322/500 [==================>...........] - ETA: 44s - loss: 1.2471 - regression_loss: 1.0577 - classification_loss: 0.1894 323/500 [==================>...........] - ETA: 44s - loss: 1.2448 - regression_loss: 1.0558 - classification_loss: 0.1890 324/500 [==================>...........] - ETA: 44s - loss: 1.2441 - regression_loss: 1.0551 - classification_loss: 0.1890 325/500 [==================>...........] - ETA: 43s - loss: 1.2450 - regression_loss: 1.0558 - classification_loss: 0.1892 326/500 [==================>...........] - ETA: 43s - loss: 1.2445 - regression_loss: 1.0555 - classification_loss: 0.1890 327/500 [==================>...........] - ETA: 43s - loss: 1.2445 - regression_loss: 1.0556 - classification_loss: 0.1889 328/500 [==================>...........] - ETA: 43s - loss: 1.2444 - regression_loss: 1.0555 - classification_loss: 0.1889 329/500 [==================>...........] 
- ETA: 42s - loss: 1.2441 - regression_loss: 1.0553 - classification_loss: 0.1888 330/500 [==================>...........] - ETA: 42s - loss: 1.2426 - regression_loss: 1.0541 - classification_loss: 0.1885 331/500 [==================>...........] - ETA: 42s - loss: 1.2428 - regression_loss: 1.0542 - classification_loss: 0.1886 332/500 [==================>...........] - ETA: 42s - loss: 1.2431 - regression_loss: 1.0544 - classification_loss: 0.1887 333/500 [==================>...........] - ETA: 41s - loss: 1.2430 - regression_loss: 1.0545 - classification_loss: 0.1885 334/500 [===================>..........] - ETA: 41s - loss: 1.2451 - regression_loss: 1.0563 - classification_loss: 0.1888 335/500 [===================>..........] - ETA: 41s - loss: 1.2458 - regression_loss: 1.0570 - classification_loss: 0.1889 336/500 [===================>..........] - ETA: 41s - loss: 1.2481 - regression_loss: 1.0585 - classification_loss: 0.1896 337/500 [===================>..........] - ETA: 40s - loss: 1.2471 - regression_loss: 1.0578 - classification_loss: 0.1893 338/500 [===================>..........] - ETA: 40s - loss: 1.2490 - regression_loss: 1.0595 - classification_loss: 0.1896 339/500 [===================>..........] - ETA: 40s - loss: 1.2467 - regression_loss: 1.0576 - classification_loss: 0.1891 340/500 [===================>..........] - ETA: 40s - loss: 1.2463 - regression_loss: 1.0573 - classification_loss: 0.1890 341/500 [===================>..........] - ETA: 39s - loss: 1.2468 - regression_loss: 1.0577 - classification_loss: 0.1891 342/500 [===================>..........] - ETA: 39s - loss: 1.2477 - regression_loss: 1.0584 - classification_loss: 0.1892 343/500 [===================>..........] - ETA: 39s - loss: 1.2500 - regression_loss: 1.0605 - classification_loss: 0.1894 344/500 [===================>..........] - ETA: 39s - loss: 1.2504 - regression_loss: 1.0610 - classification_loss: 0.1895 345/500 [===================>..........] 
- ETA: 38s - loss: 1.2508 - regression_loss: 1.0613 - classification_loss: 0.1895 346/500 [===================>..........] - ETA: 38s - loss: 1.2523 - regression_loss: 1.0625 - classification_loss: 0.1898 347/500 [===================>..........] - ETA: 38s - loss: 1.2532 - regression_loss: 1.0633 - classification_loss: 0.1899 348/500 [===================>..........] - ETA: 38s - loss: 1.2537 - regression_loss: 1.0638 - classification_loss: 0.1898 349/500 [===================>..........] - ETA: 37s - loss: 1.2557 - regression_loss: 1.0659 - classification_loss: 0.1899 350/500 [====================>.........] - ETA: 37s - loss: 1.2566 - regression_loss: 1.0667 - classification_loss: 0.1900 351/500 [====================>.........] - ETA: 37s - loss: 1.2562 - regression_loss: 1.0664 - classification_loss: 0.1898 352/500 [====================>.........] - ETA: 37s - loss: 1.2543 - regression_loss: 1.0648 - classification_loss: 0.1895 353/500 [====================>.........] - ETA: 36s - loss: 1.2535 - regression_loss: 1.0642 - classification_loss: 0.1893 354/500 [====================>.........] - ETA: 36s - loss: 1.2517 - regression_loss: 1.0628 - classification_loss: 0.1889 355/500 [====================>.........] - ETA: 36s - loss: 1.2512 - regression_loss: 1.0622 - classification_loss: 0.1889 356/500 [====================>.........] - ETA: 36s - loss: 1.2537 - regression_loss: 1.0642 - classification_loss: 0.1895 357/500 [====================>.........] - ETA: 35s - loss: 1.2537 - regression_loss: 1.0641 - classification_loss: 0.1895 358/500 [====================>.........] - ETA: 35s - loss: 1.2549 - regression_loss: 1.0650 - classification_loss: 0.1899 359/500 [====================>.........] - ETA: 35s - loss: 1.2557 - regression_loss: 1.0656 - classification_loss: 0.1900 360/500 [====================>.........] - ETA: 35s - loss: 1.2554 - regression_loss: 1.0654 - classification_loss: 0.1900 361/500 [====================>.........] 
- ETA: 34s - loss: 1.2547 - regression_loss: 1.0647 - classification_loss: 0.1900 362/500 [====================>.........] - ETA: 34s - loss: 1.2540 - regression_loss: 1.0641 - classification_loss: 0.1899 363/500 [====================>.........] - ETA: 34s - loss: 1.2545 - regression_loss: 1.0646 - classification_loss: 0.1899 364/500 [====================>.........] - ETA: 34s - loss: 1.2534 - regression_loss: 1.0638 - classification_loss: 0.1896 365/500 [====================>.........] - ETA: 33s - loss: 1.2543 - regression_loss: 1.0645 - classification_loss: 0.1898 366/500 [====================>.........] - ETA: 33s - loss: 1.2551 - regression_loss: 1.0653 - classification_loss: 0.1898 367/500 [=====================>........] - ETA: 33s - loss: 1.2546 - regression_loss: 1.0650 - classification_loss: 0.1896 368/500 [=====================>........] - ETA: 33s - loss: 1.2559 - regression_loss: 1.0661 - classification_loss: 0.1899 369/500 [=====================>........] - ETA: 32s - loss: 1.2555 - regression_loss: 1.0658 - classification_loss: 0.1897 370/500 [=====================>........] - ETA: 32s - loss: 1.2557 - regression_loss: 1.0660 - classification_loss: 0.1897 371/500 [=====================>........] - ETA: 32s - loss: 1.2558 - regression_loss: 1.0661 - classification_loss: 0.1897 372/500 [=====================>........] - ETA: 32s - loss: 1.2572 - regression_loss: 1.0673 - classification_loss: 0.1899 373/500 [=====================>........] - ETA: 31s - loss: 1.2574 - regression_loss: 1.0674 - classification_loss: 0.1899 374/500 [=====================>........] - ETA: 31s - loss: 1.2570 - regression_loss: 1.0671 - classification_loss: 0.1899 375/500 [=====================>........] - ETA: 31s - loss: 1.2565 - regression_loss: 1.0667 - classification_loss: 0.1898 376/500 [=====================>........] - ETA: 31s - loss: 1.2564 - regression_loss: 1.0665 - classification_loss: 0.1898 377/500 [=====================>........] 
- ETA: 30s - loss: 1.2566 - regression_loss: 1.0666 - classification_loss: 0.1899 378/500 [=====================>........] - ETA: 30s - loss: 1.2547 - regression_loss: 1.0651 - classification_loss: 0.1896 379/500 [=====================>........] - ETA: 30s - loss: 1.2529 - regression_loss: 1.0637 - classification_loss: 0.1893 380/500 [=====================>........] - ETA: 30s - loss: 1.2521 - regression_loss: 1.0630 - classification_loss: 0.1891 381/500 [=====================>........] - ETA: 29s - loss: 1.2523 - regression_loss: 1.0633 - classification_loss: 0.1891 382/500 [=====================>........] - ETA: 29s - loss: 1.2525 - regression_loss: 1.0632 - classification_loss: 0.1893 383/500 [=====================>........] - ETA: 29s - loss: 1.2531 - regression_loss: 1.0638 - classification_loss: 0.1893 384/500 [======================>.......] - ETA: 29s - loss: 1.2539 - regression_loss: 1.0645 - classification_loss: 0.1894 385/500 [======================>.......] - ETA: 28s - loss: 1.2528 - regression_loss: 1.0634 - classification_loss: 0.1893 386/500 [======================>.......] - ETA: 28s - loss: 1.2538 - regression_loss: 1.0643 - classification_loss: 0.1895 387/500 [======================>.......] - ETA: 28s - loss: 1.2522 - regression_loss: 1.0631 - classification_loss: 0.1891 388/500 [======================>.......] - ETA: 28s - loss: 1.2546 - regression_loss: 1.0649 - classification_loss: 0.1897 389/500 [======================>.......] - ETA: 27s - loss: 1.2549 - regression_loss: 1.0652 - classification_loss: 0.1897 390/500 [======================>.......] - ETA: 27s - loss: 1.2552 - regression_loss: 1.0653 - classification_loss: 0.1898 391/500 [======================>.......] - ETA: 27s - loss: 1.2550 - regression_loss: 1.0651 - classification_loss: 0.1899 392/500 [======================>.......] - ETA: 27s - loss: 1.2560 - regression_loss: 1.0658 - classification_loss: 0.1901 393/500 [======================>.......] 
- ETA: 26s - loss: 1.2539 - regression_loss: 1.0641 - classification_loss: 0.1898 394/500 [======================>.......] - ETA: 26s - loss: 1.2549 - regression_loss: 1.0649 - classification_loss: 0.1899 395/500 [======================>.......] - ETA: 26s - loss: 1.2543 - regression_loss: 1.0646 - classification_loss: 0.1897 396/500 [======================>.......] - ETA: 26s - loss: 1.2550 - regression_loss: 1.0653 - classification_loss: 0.1898 397/500 [======================>.......] - ETA: 25s - loss: 1.2566 - regression_loss: 1.0666 - classification_loss: 0.1900 398/500 [======================>.......] - ETA: 25s - loss: 1.2574 - regression_loss: 1.0673 - classification_loss: 0.1901 399/500 [======================>.......] - ETA: 25s - loss: 1.2574 - regression_loss: 1.0673 - classification_loss: 0.1901 400/500 [=======================>......] - ETA: 25s - loss: 1.2565 - regression_loss: 1.0665 - classification_loss: 0.1900 401/500 [=======================>......] - ETA: 24s - loss: 1.2572 - regression_loss: 1.0670 - classification_loss: 0.1902 402/500 [=======================>......] - ETA: 24s - loss: 1.2585 - regression_loss: 1.0681 - classification_loss: 0.1904 403/500 [=======================>......] - ETA: 24s - loss: 1.2583 - regression_loss: 1.0679 - classification_loss: 0.1905 404/500 [=======================>......] - ETA: 24s - loss: 1.2583 - regression_loss: 1.0679 - classification_loss: 0.1904 405/500 [=======================>......] - ETA: 23s - loss: 1.2587 - regression_loss: 1.0683 - classification_loss: 0.1904 406/500 [=======================>......] - ETA: 23s - loss: 1.2588 - regression_loss: 1.0685 - classification_loss: 0.1903 407/500 [=======================>......] - ETA: 23s - loss: 1.2593 - regression_loss: 1.0688 - classification_loss: 0.1905 408/500 [=======================>......] - ETA: 23s - loss: 1.2603 - regression_loss: 1.0697 - classification_loss: 0.1906 409/500 [=======================>......] 
- ETA: 22s - loss: 1.2607 - regression_loss: 1.0700 - classification_loss: 0.1907 410/500 [=======================>......] - ETA: 22s - loss: 1.2613 - regression_loss: 1.0704 - classification_loss: 0.1909 411/500 [=======================>......] - ETA: 22s - loss: 1.2621 - regression_loss: 1.0712 - classification_loss: 0.1909 412/500 [=======================>......] - ETA: 22s - loss: 1.2612 - regression_loss: 1.0705 - classification_loss: 0.1907 413/500 [=======================>......] - ETA: 21s - loss: 1.2599 - regression_loss: 1.0695 - classification_loss: 0.1904 414/500 [=======================>......] - ETA: 21s - loss: 1.2610 - regression_loss: 1.0704 - classification_loss: 0.1906 415/500 [=======================>......] - ETA: 21s - loss: 1.2590 - regression_loss: 1.0688 - classification_loss: 0.1903 416/500 [=======================>......] - ETA: 21s - loss: 1.2591 - regression_loss: 1.0688 - classification_loss: 0.1903 417/500 [========================>.....] - ETA: 20s - loss: 1.2596 - regression_loss: 1.0692 - classification_loss: 0.1904 418/500 [========================>.....] - ETA: 20s - loss: 1.2595 - regression_loss: 1.0692 - classification_loss: 0.1903 419/500 [========================>.....] - ETA: 20s - loss: 1.2607 - regression_loss: 1.0701 - classification_loss: 0.1906 420/500 [========================>.....] - ETA: 20s - loss: 1.2622 - regression_loss: 1.0713 - classification_loss: 0.1909 421/500 [========================>.....] - ETA: 19s - loss: 1.2630 - regression_loss: 1.0720 - classification_loss: 0.1910 422/500 [========================>.....] - ETA: 19s - loss: 1.2641 - regression_loss: 1.0729 - classification_loss: 0.1912 423/500 [========================>.....] - ETA: 19s - loss: 1.2640 - regression_loss: 1.0729 - classification_loss: 0.1911 424/500 [========================>.....] - ETA: 19s - loss: 1.2629 - regression_loss: 1.0719 - classification_loss: 0.1910 425/500 [========================>.....] 
[... per-batch progress updates elided ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.2551 - regression_loss: 1.0641 - classification_loss: 0.1910
1172 instances of class plum with average precision: 0.7013
mAP: 0.7013
Epoch 00128: saving model to ./training/snapshots/resnet50_pascal_128.h5
Epoch 00128: ReduceLROnPlateau reducing learning rate to 9.999999974752428e-08.
Epoch 129/150
[... per-batch progress updates elided ...]
- ETA: 1:00 - loss: 1.2210 - regression_loss: 1.0390 - classification_loss: 0.1820 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2221 - regression_loss: 1.0398 - classification_loss: 0.1823 261/500 [==============>...............] - ETA: 59s - loss: 1.2240 - regression_loss: 1.0415 - classification_loss: 0.1825  262/500 [==============>...............] - ETA: 59s - loss: 1.2225 - regression_loss: 1.0401 - classification_loss: 0.1824 263/500 [==============>...............] - ETA: 59s - loss: 1.2251 - regression_loss: 1.0425 - classification_loss: 0.1827 264/500 [==============>...............] - ETA: 59s - loss: 1.2241 - regression_loss: 1.0418 - classification_loss: 0.1823 265/500 [==============>...............] - ETA: 58s - loss: 1.2241 - regression_loss: 1.0417 - classification_loss: 0.1824 266/500 [==============>...............] - ETA: 58s - loss: 1.2218 - regression_loss: 1.0396 - classification_loss: 0.1821 267/500 [===============>..............] - ETA: 58s - loss: 1.2213 - regression_loss: 1.0391 - classification_loss: 0.1822 268/500 [===============>..............] - ETA: 58s - loss: 1.2216 - regression_loss: 1.0394 - classification_loss: 0.1822 269/500 [===============>..............] - ETA: 57s - loss: 1.2191 - regression_loss: 1.0374 - classification_loss: 0.1817 270/500 [===============>..............] - ETA: 57s - loss: 1.2194 - regression_loss: 1.0376 - classification_loss: 0.1819 271/500 [===============>..............] - ETA: 57s - loss: 1.2202 - regression_loss: 1.0380 - classification_loss: 0.1822 272/500 [===============>..............] - ETA: 57s - loss: 1.2195 - regression_loss: 1.0375 - classification_loss: 0.1820 273/500 [===============>..............] - ETA: 56s - loss: 1.2197 - regression_loss: 1.0378 - classification_loss: 0.1819 274/500 [===============>..............] - ETA: 56s - loss: 1.2185 - regression_loss: 1.0371 - classification_loss: 0.1814 275/500 [===============>..............] 
- ETA: 56s - loss: 1.2159 - regression_loss: 1.0350 - classification_loss: 0.1810 276/500 [===============>..............] - ETA: 56s - loss: 1.2158 - regression_loss: 1.0349 - classification_loss: 0.1809 277/500 [===============>..............] - ETA: 55s - loss: 1.2167 - regression_loss: 1.0355 - classification_loss: 0.1812 278/500 [===============>..............] - ETA: 55s - loss: 1.2173 - regression_loss: 1.0360 - classification_loss: 0.1814 279/500 [===============>..............] - ETA: 55s - loss: 1.2173 - regression_loss: 1.0357 - classification_loss: 0.1816 280/500 [===============>..............] - ETA: 55s - loss: 1.2155 - regression_loss: 1.0341 - classification_loss: 0.1814 281/500 [===============>..............] - ETA: 54s - loss: 1.2173 - regression_loss: 1.0353 - classification_loss: 0.1820 282/500 [===============>..............] - ETA: 54s - loss: 1.2181 - regression_loss: 1.0362 - classification_loss: 0.1819 283/500 [===============>..............] - ETA: 54s - loss: 1.2189 - regression_loss: 1.0368 - classification_loss: 0.1821 284/500 [================>.............] - ETA: 54s - loss: 1.2178 - regression_loss: 1.0361 - classification_loss: 0.1817 285/500 [================>.............] - ETA: 53s - loss: 1.2163 - regression_loss: 1.0349 - classification_loss: 0.1814 286/500 [================>.............] - ETA: 53s - loss: 1.2147 - regression_loss: 1.0337 - classification_loss: 0.1811 287/500 [================>.............] - ETA: 53s - loss: 1.2167 - regression_loss: 1.0351 - classification_loss: 0.1815 288/500 [================>.............] - ETA: 53s - loss: 1.2185 - regression_loss: 1.0366 - classification_loss: 0.1819 289/500 [================>.............] - ETA: 52s - loss: 1.2174 - regression_loss: 1.0356 - classification_loss: 0.1819 290/500 [================>.............] - ETA: 52s - loss: 1.2155 - regression_loss: 1.0341 - classification_loss: 0.1814 291/500 [================>.............] 
- ETA: 52s - loss: 1.2161 - regression_loss: 1.0349 - classification_loss: 0.1813 292/500 [================>.............] - ETA: 52s - loss: 1.2159 - regression_loss: 1.0347 - classification_loss: 0.1813 293/500 [================>.............] - ETA: 51s - loss: 1.2189 - regression_loss: 1.0367 - classification_loss: 0.1823 294/500 [================>.............] - ETA: 51s - loss: 1.2185 - regression_loss: 1.0364 - classification_loss: 0.1821 295/500 [================>.............] - ETA: 51s - loss: 1.2208 - regression_loss: 1.0375 - classification_loss: 0.1833 296/500 [================>.............] - ETA: 51s - loss: 1.2186 - regression_loss: 1.0357 - classification_loss: 0.1829 297/500 [================>.............] - ETA: 50s - loss: 1.2212 - regression_loss: 1.0377 - classification_loss: 0.1835 298/500 [================>.............] - ETA: 50s - loss: 1.2199 - regression_loss: 1.0366 - classification_loss: 0.1832 299/500 [================>.............] - ETA: 50s - loss: 1.2180 - regression_loss: 1.0350 - classification_loss: 0.1830 300/500 [=================>............] - ETA: 50s - loss: 1.2185 - regression_loss: 1.0355 - classification_loss: 0.1830 301/500 [=================>............] - ETA: 49s - loss: 1.2176 - regression_loss: 1.0347 - classification_loss: 0.1829 302/500 [=================>............] - ETA: 49s - loss: 1.2176 - regression_loss: 1.0348 - classification_loss: 0.1829 303/500 [=================>............] - ETA: 49s - loss: 1.2185 - regression_loss: 1.0355 - classification_loss: 0.1830 304/500 [=================>............] - ETA: 49s - loss: 1.2183 - regression_loss: 1.0352 - classification_loss: 0.1830 305/500 [=================>............] - ETA: 48s - loss: 1.2218 - regression_loss: 1.0382 - classification_loss: 0.1835 306/500 [=================>............] - ETA: 48s - loss: 1.2224 - regression_loss: 1.0388 - classification_loss: 0.1836 307/500 [=================>............] 
- ETA: 48s - loss: 1.2210 - regression_loss: 1.0376 - classification_loss: 0.1834 308/500 [=================>............] - ETA: 48s - loss: 1.2195 - regression_loss: 1.0364 - classification_loss: 0.1831 309/500 [=================>............] - ETA: 47s - loss: 1.2190 - regression_loss: 1.0362 - classification_loss: 0.1828 310/500 [=================>............] - ETA: 47s - loss: 1.2202 - regression_loss: 1.0373 - classification_loss: 0.1829 311/500 [=================>............] - ETA: 47s - loss: 1.2211 - regression_loss: 1.0381 - classification_loss: 0.1830 312/500 [=================>............] - ETA: 47s - loss: 1.2219 - regression_loss: 1.0387 - classification_loss: 0.1832 313/500 [=================>............] - ETA: 46s - loss: 1.2201 - regression_loss: 1.0372 - classification_loss: 0.1830 314/500 [=================>............] - ETA: 46s - loss: 1.2196 - regression_loss: 1.0364 - classification_loss: 0.1831 315/500 [=================>............] - ETA: 46s - loss: 1.2212 - regression_loss: 1.0377 - classification_loss: 0.1834 316/500 [=================>............] - ETA: 46s - loss: 1.2227 - regression_loss: 1.0391 - classification_loss: 0.1835 317/500 [==================>...........] - ETA: 45s - loss: 1.2226 - regression_loss: 1.0391 - classification_loss: 0.1835 318/500 [==================>...........] - ETA: 45s - loss: 1.2206 - regression_loss: 1.0375 - classification_loss: 0.1832 319/500 [==================>...........] - ETA: 45s - loss: 1.2213 - regression_loss: 1.0381 - classification_loss: 0.1832 320/500 [==================>...........] - ETA: 45s - loss: 1.2219 - regression_loss: 1.0388 - classification_loss: 0.1832 321/500 [==================>...........] - ETA: 44s - loss: 1.2218 - regression_loss: 1.0387 - classification_loss: 0.1831 322/500 [==================>...........] - ETA: 44s - loss: 1.2211 - regression_loss: 1.0380 - classification_loss: 0.1831 323/500 [==================>...........] 
- ETA: 44s - loss: 1.2212 - regression_loss: 1.0380 - classification_loss: 0.1832 324/500 [==================>...........] - ETA: 44s - loss: 1.2189 - regression_loss: 1.0360 - classification_loss: 0.1829 325/500 [==================>...........] - ETA: 43s - loss: 1.2190 - regression_loss: 1.0363 - classification_loss: 0.1827 326/500 [==================>...........] - ETA: 43s - loss: 1.2197 - regression_loss: 1.0368 - classification_loss: 0.1828 327/500 [==================>...........] - ETA: 43s - loss: 1.2203 - regression_loss: 1.0374 - classification_loss: 0.1830 328/500 [==================>...........] - ETA: 43s - loss: 1.2212 - regression_loss: 1.0380 - classification_loss: 0.1833 329/500 [==================>...........] - ETA: 42s - loss: 1.2212 - regression_loss: 1.0381 - classification_loss: 0.1831 330/500 [==================>...........] - ETA: 42s - loss: 1.2219 - regression_loss: 1.0386 - classification_loss: 0.1833 331/500 [==================>...........] - ETA: 42s - loss: 1.2212 - regression_loss: 1.0381 - classification_loss: 0.1831 332/500 [==================>...........] - ETA: 42s - loss: 1.2220 - regression_loss: 1.0386 - classification_loss: 0.1834 333/500 [==================>...........] - ETA: 41s - loss: 1.2227 - regression_loss: 1.0394 - classification_loss: 0.1834 334/500 [===================>..........] - ETA: 41s - loss: 1.2233 - regression_loss: 1.0398 - classification_loss: 0.1835 335/500 [===================>..........] - ETA: 41s - loss: 1.2249 - regression_loss: 1.0412 - classification_loss: 0.1838 336/500 [===================>..........] - ETA: 41s - loss: 1.2230 - regression_loss: 1.0395 - classification_loss: 0.1834 337/500 [===================>..........] - ETA: 40s - loss: 1.2242 - regression_loss: 1.0407 - classification_loss: 0.1835 338/500 [===================>..........] - ETA: 40s - loss: 1.2246 - regression_loss: 1.0411 - classification_loss: 0.1835 339/500 [===================>..........] 
- ETA: 40s - loss: 1.2259 - regression_loss: 1.0422 - classification_loss: 0.1837 340/500 [===================>..........] - ETA: 40s - loss: 1.2252 - regression_loss: 1.0417 - classification_loss: 0.1835 341/500 [===================>..........] - ETA: 39s - loss: 1.2237 - regression_loss: 1.0403 - classification_loss: 0.1834 342/500 [===================>..........] - ETA: 39s - loss: 1.2236 - regression_loss: 1.0402 - classification_loss: 0.1834 343/500 [===================>..........] - ETA: 39s - loss: 1.2241 - regression_loss: 1.0405 - classification_loss: 0.1836 344/500 [===================>..........] - ETA: 39s - loss: 1.2249 - regression_loss: 1.0408 - classification_loss: 0.1841 345/500 [===================>..........] - ETA: 38s - loss: 1.2255 - regression_loss: 1.0413 - classification_loss: 0.1843 346/500 [===================>..........] - ETA: 38s - loss: 1.2235 - regression_loss: 1.0397 - classification_loss: 0.1838 347/500 [===================>..........] - ETA: 38s - loss: 1.2222 - regression_loss: 1.0388 - classification_loss: 0.1834 348/500 [===================>..........] - ETA: 38s - loss: 1.2223 - regression_loss: 1.0390 - classification_loss: 0.1834 349/500 [===================>..........] - ETA: 37s - loss: 1.2213 - regression_loss: 1.0382 - classification_loss: 0.1831 350/500 [====================>.........] - ETA: 37s - loss: 1.2192 - regression_loss: 1.0365 - classification_loss: 0.1827 351/500 [====================>.........] - ETA: 37s - loss: 1.2193 - regression_loss: 1.0366 - classification_loss: 0.1827 352/500 [====================>.........] - ETA: 37s - loss: 1.2199 - regression_loss: 1.0371 - classification_loss: 0.1828 353/500 [====================>.........] - ETA: 36s - loss: 1.2202 - regression_loss: 1.0375 - classification_loss: 0.1827 354/500 [====================>.........] - ETA: 36s - loss: 1.2208 - regression_loss: 1.0379 - classification_loss: 0.1830 355/500 [====================>.........] 
- ETA: 36s - loss: 1.2232 - regression_loss: 1.0397 - classification_loss: 0.1835 356/500 [====================>.........] - ETA: 36s - loss: 1.2217 - regression_loss: 1.0386 - classification_loss: 0.1831 357/500 [====================>.........] - ETA: 35s - loss: 1.2234 - regression_loss: 1.0400 - classification_loss: 0.1834 358/500 [====================>.........] - ETA: 35s - loss: 1.2235 - regression_loss: 1.0399 - classification_loss: 0.1836 359/500 [====================>.........] - ETA: 35s - loss: 1.2209 - regression_loss: 1.0377 - classification_loss: 0.1832 360/500 [====================>.........] - ETA: 35s - loss: 1.2220 - regression_loss: 1.0384 - classification_loss: 0.1836 361/500 [====================>.........] - ETA: 34s - loss: 1.2194 - regression_loss: 1.0362 - classification_loss: 0.1832 362/500 [====================>.........] - ETA: 34s - loss: 1.2197 - regression_loss: 1.0367 - classification_loss: 0.1829 363/500 [====================>.........] - ETA: 34s - loss: 1.2210 - regression_loss: 1.0378 - classification_loss: 0.1832 364/500 [====================>.........] - ETA: 34s - loss: 1.2210 - regression_loss: 1.0379 - classification_loss: 0.1831 365/500 [====================>.........] - ETA: 33s - loss: 1.2198 - regression_loss: 1.0369 - classification_loss: 0.1829 366/500 [====================>.........] - ETA: 33s - loss: 1.2207 - regression_loss: 1.0375 - classification_loss: 0.1833 367/500 [=====================>........] - ETA: 33s - loss: 1.2185 - regression_loss: 1.0356 - classification_loss: 0.1829 368/500 [=====================>........] - ETA: 33s - loss: 1.2185 - regression_loss: 1.0355 - classification_loss: 0.1829 369/500 [=====================>........] - ETA: 32s - loss: 1.2166 - regression_loss: 1.0339 - classification_loss: 0.1826 370/500 [=====================>........] - ETA: 32s - loss: 1.2183 - regression_loss: 1.0353 - classification_loss: 0.1830 371/500 [=====================>........] 
- ETA: 32s - loss: 1.2199 - regression_loss: 1.0367 - classification_loss: 0.1833 372/500 [=====================>........] - ETA: 32s - loss: 1.2201 - regression_loss: 1.0368 - classification_loss: 0.1833 373/500 [=====================>........] - ETA: 31s - loss: 1.2182 - regression_loss: 1.0352 - classification_loss: 0.1831 374/500 [=====================>........] - ETA: 31s - loss: 1.2184 - regression_loss: 1.0353 - classification_loss: 0.1831 375/500 [=====================>........] - ETA: 31s - loss: 1.2184 - regression_loss: 1.0354 - classification_loss: 0.1830 376/500 [=====================>........] - ETA: 31s - loss: 1.2174 - regression_loss: 1.0346 - classification_loss: 0.1828 377/500 [=====================>........] - ETA: 30s - loss: 1.2166 - regression_loss: 1.0341 - classification_loss: 0.1826 378/500 [=====================>........] - ETA: 30s - loss: 1.2181 - regression_loss: 1.0353 - classification_loss: 0.1828 379/500 [=====================>........] - ETA: 30s - loss: 1.2192 - regression_loss: 1.0360 - classification_loss: 0.1832 380/500 [=====================>........] - ETA: 30s - loss: 1.2197 - regression_loss: 1.0365 - classification_loss: 0.1831 381/500 [=====================>........] - ETA: 29s - loss: 1.2198 - regression_loss: 1.0367 - classification_loss: 0.1831 382/500 [=====================>........] - ETA: 29s - loss: 1.2201 - regression_loss: 1.0370 - classification_loss: 0.1831 383/500 [=====================>........] - ETA: 29s - loss: 1.2202 - regression_loss: 1.0371 - classification_loss: 0.1831 384/500 [======================>.......] - ETA: 29s - loss: 1.2213 - regression_loss: 1.0378 - classification_loss: 0.1834 385/500 [======================>.......] - ETA: 28s - loss: 1.2223 - regression_loss: 1.0389 - classification_loss: 0.1834 386/500 [======================>.......] - ETA: 28s - loss: 1.2239 - regression_loss: 1.0402 - classification_loss: 0.1837 387/500 [======================>.......] 
- ETA: 28s - loss: 1.2246 - regression_loss: 1.0408 - classification_loss: 0.1838 388/500 [======================>.......] - ETA: 28s - loss: 1.2255 - regression_loss: 1.0417 - classification_loss: 0.1839 389/500 [======================>.......] - ETA: 27s - loss: 1.2268 - regression_loss: 1.0426 - classification_loss: 0.1842 390/500 [======================>.......] - ETA: 27s - loss: 1.2273 - regression_loss: 1.0430 - classification_loss: 0.1843 391/500 [======================>.......] - ETA: 27s - loss: 1.2274 - regression_loss: 1.0430 - classification_loss: 0.1844 392/500 [======================>.......] - ETA: 27s - loss: 1.2280 - regression_loss: 1.0436 - classification_loss: 0.1844 393/500 [======================>.......] - ETA: 26s - loss: 1.2274 - regression_loss: 1.0430 - classification_loss: 0.1843 394/500 [======================>.......] - ETA: 26s - loss: 1.2272 - regression_loss: 1.0429 - classification_loss: 0.1843 395/500 [======================>.......] - ETA: 26s - loss: 1.2274 - regression_loss: 1.0432 - classification_loss: 0.1843 396/500 [======================>.......] - ETA: 26s - loss: 1.2264 - regression_loss: 1.0423 - classification_loss: 0.1841 397/500 [======================>.......] - ETA: 25s - loss: 1.2251 - regression_loss: 1.0413 - classification_loss: 0.1838 398/500 [======================>.......] - ETA: 25s - loss: 1.2236 - regression_loss: 1.0401 - classification_loss: 0.1834 399/500 [======================>.......] - ETA: 25s - loss: 1.2240 - regression_loss: 1.0405 - classification_loss: 0.1835 400/500 [=======================>......] - ETA: 25s - loss: 1.2249 - regression_loss: 1.0412 - classification_loss: 0.1836 401/500 [=======================>......] - ETA: 24s - loss: 1.2232 - regression_loss: 1.0398 - classification_loss: 0.1833 402/500 [=======================>......] - ETA: 24s - loss: 1.2222 - regression_loss: 1.0391 - classification_loss: 0.1831 403/500 [=======================>......] 
- ETA: 24s - loss: 1.2225 - regression_loss: 1.0395 - classification_loss: 0.1830 404/500 [=======================>......] - ETA: 24s - loss: 1.2231 - regression_loss: 1.0400 - classification_loss: 0.1831 405/500 [=======================>......] - ETA: 23s - loss: 1.2237 - regression_loss: 1.0401 - classification_loss: 0.1835 406/500 [=======================>......] - ETA: 23s - loss: 1.2246 - regression_loss: 1.0409 - classification_loss: 0.1837 407/500 [=======================>......] - ETA: 23s - loss: 1.2246 - regression_loss: 1.0410 - classification_loss: 0.1837 408/500 [=======================>......] - ETA: 23s - loss: 1.2239 - regression_loss: 1.0404 - classification_loss: 0.1835 409/500 [=======================>......] - ETA: 22s - loss: 1.2225 - regression_loss: 1.0392 - classification_loss: 0.1833 410/500 [=======================>......] - ETA: 22s - loss: 1.2236 - regression_loss: 1.0401 - classification_loss: 0.1834 411/500 [=======================>......] - ETA: 22s - loss: 1.2246 - regression_loss: 1.0410 - classification_loss: 0.1836 412/500 [=======================>......] - ETA: 22s - loss: 1.2237 - regression_loss: 1.0402 - classification_loss: 0.1835 413/500 [=======================>......] - ETA: 21s - loss: 1.2241 - regression_loss: 1.0402 - classification_loss: 0.1839 414/500 [=======================>......] - ETA: 21s - loss: 1.2247 - regression_loss: 1.0408 - classification_loss: 0.1839 415/500 [=======================>......] - ETA: 21s - loss: 1.2246 - regression_loss: 1.0405 - classification_loss: 0.1841 416/500 [=======================>......] - ETA: 21s - loss: 1.2238 - regression_loss: 1.0399 - classification_loss: 0.1839 417/500 [========================>.....] - ETA: 20s - loss: 1.2228 - regression_loss: 1.0390 - classification_loss: 0.1838 418/500 [========================>.....] - ETA: 20s - loss: 1.2214 - regression_loss: 1.0379 - classification_loss: 0.1835 419/500 [========================>.....] 
- ETA: 20s - loss: 1.2220 - regression_loss: 1.0385 - classification_loss: 0.1835 420/500 [========================>.....] - ETA: 20s - loss: 1.2216 - regression_loss: 1.0381 - classification_loss: 0.1835 421/500 [========================>.....] - ETA: 19s - loss: 1.2220 - regression_loss: 1.0384 - classification_loss: 0.1836 422/500 [========================>.....] - ETA: 19s - loss: 1.2212 - regression_loss: 1.0377 - classification_loss: 0.1835 423/500 [========================>.....] - ETA: 19s - loss: 1.2211 - regression_loss: 1.0376 - classification_loss: 0.1835 424/500 [========================>.....] - ETA: 19s - loss: 1.2201 - regression_loss: 1.0368 - classification_loss: 0.1833 425/500 [========================>.....] - ETA: 18s - loss: 1.2203 - regression_loss: 1.0371 - classification_loss: 0.1832 426/500 [========================>.....] - ETA: 18s - loss: 1.2210 - regression_loss: 1.0378 - classification_loss: 0.1832 427/500 [========================>.....] - ETA: 18s - loss: 1.2207 - regression_loss: 1.0377 - classification_loss: 0.1831 428/500 [========================>.....] - ETA: 18s - loss: 1.2222 - regression_loss: 1.0388 - classification_loss: 0.1834 429/500 [========================>.....] - ETA: 17s - loss: 1.2226 - regression_loss: 1.0391 - classification_loss: 0.1834 430/500 [========================>.....] - ETA: 17s - loss: 1.2233 - regression_loss: 1.0397 - classification_loss: 0.1836 431/500 [========================>.....] - ETA: 17s - loss: 1.2249 - regression_loss: 1.0411 - classification_loss: 0.1838 432/500 [========================>.....] - ETA: 17s - loss: 1.2251 - regression_loss: 1.0412 - classification_loss: 0.1839 433/500 [========================>.....] - ETA: 16s - loss: 1.2248 - regression_loss: 1.0410 - classification_loss: 0.1838 434/500 [=========================>....] - ETA: 16s - loss: 1.2259 - regression_loss: 1.0421 - classification_loss: 0.1839 435/500 [=========================>....] 
- ETA: 16s - loss: 1.2267 - regression_loss: 1.0428 - classification_loss: 0.1839 436/500 [=========================>....] - ETA: 16s - loss: 1.2266 - regression_loss: 1.0428 - classification_loss: 0.1839 437/500 [=========================>....] - ETA: 15s - loss: 1.2248 - regression_loss: 1.0413 - classification_loss: 0.1835 438/500 [=========================>....] - ETA: 15s - loss: 1.2250 - regression_loss: 1.0415 - classification_loss: 0.1835 439/500 [=========================>....] - ETA: 15s - loss: 1.2255 - regression_loss: 1.0420 - classification_loss: 0.1835 440/500 [=========================>....] - ETA: 15s - loss: 1.2261 - regression_loss: 1.0426 - classification_loss: 0.1835 441/500 [=========================>....] - ETA: 14s - loss: 1.2272 - regression_loss: 1.0434 - classification_loss: 0.1838 442/500 [=========================>....] - ETA: 14s - loss: 1.2263 - regression_loss: 1.0427 - classification_loss: 0.1836 443/500 [=========================>....] - ETA: 14s - loss: 1.2263 - regression_loss: 1.0427 - classification_loss: 0.1836 444/500 [=========================>....] - ETA: 14s - loss: 1.2259 - regression_loss: 1.0423 - classification_loss: 0.1836 445/500 [=========================>....] - ETA: 13s - loss: 1.2252 - regression_loss: 1.0418 - classification_loss: 0.1834 446/500 [=========================>....] - ETA: 13s - loss: 1.2252 - regression_loss: 1.0418 - classification_loss: 0.1833 447/500 [=========================>....] - ETA: 13s - loss: 1.2259 - regression_loss: 1.0425 - classification_loss: 0.1834 448/500 [=========================>....] - ETA: 13s - loss: 1.2261 - regression_loss: 1.0427 - classification_loss: 0.1834 449/500 [=========================>....] - ETA: 12s - loss: 1.2270 - regression_loss: 1.0436 - classification_loss: 0.1835 450/500 [==========================>...] - ETA: 12s - loss: 1.2272 - regression_loss: 1.0436 - classification_loss: 0.1836 451/500 [==========================>...] 
- ETA: 12s - loss: 1.2271 - regression_loss: 1.0434 - classification_loss: 0.1836 452/500 [==========================>...] - ETA: 12s - loss: 1.2263 - regression_loss: 1.0428 - classification_loss: 0.1835 453/500 [==========================>...] - ETA: 11s - loss: 1.2257 - regression_loss: 1.0424 - classification_loss: 0.1834 454/500 [==========================>...] - ETA: 11s - loss: 1.2254 - regression_loss: 1.0421 - classification_loss: 0.1833 455/500 [==========================>...] - ETA: 11s - loss: 1.2235 - regression_loss: 1.0404 - classification_loss: 0.1830 456/500 [==========================>...] - ETA: 11s - loss: 1.2250 - regression_loss: 1.0417 - classification_loss: 0.1834 457/500 [==========================>...] - ETA: 10s - loss: 1.2256 - regression_loss: 1.0421 - classification_loss: 0.1835 458/500 [==========================>...] - ETA: 10s - loss: 1.2252 - regression_loss: 1.0418 - classification_loss: 0.1834 459/500 [==========================>...] - ETA: 10s - loss: 1.2255 - regression_loss: 1.0422 - classification_loss: 0.1833 460/500 [==========================>...] - ETA: 10s - loss: 1.2259 - regression_loss: 1.0426 - classification_loss: 0.1833 461/500 [==========================>...] - ETA: 9s - loss: 1.2265 - regression_loss: 1.0431 - classification_loss: 0.1834  462/500 [==========================>...] - ETA: 9s - loss: 1.2275 - regression_loss: 1.0438 - classification_loss: 0.1837 463/500 [==========================>...] - ETA: 9s - loss: 1.2273 - regression_loss: 1.0436 - classification_loss: 0.1837 464/500 [==========================>...] - ETA: 9s - loss: 1.2278 - regression_loss: 1.0440 - classification_loss: 0.1838 465/500 [==========================>...] - ETA: 8s - loss: 1.2264 - regression_loss: 1.0429 - classification_loss: 0.1836 466/500 [==========================>...] - ETA: 8s - loss: 1.2279 - regression_loss: 1.0440 - classification_loss: 0.1839 467/500 [===========================>..] 
[Epoch 129/150: flattened per-batch progress updates (batches 468-499) condensed; loss held near 1.22 through the end of the epoch]
500/500 [==============================] - 125s 251ms/step - loss: 1.2220 - regression_loss: 1.0388 - classification_loss: 0.1832
1172 instances of class plum with average precision: 0.7001
mAP: 0.7001
Epoch 00129: saving model to ./training/snapshots/resnet50_pascal_129.h5
Epoch 130/150
[Epoch 130/150: flattened per-batch progress updates (batches 1-301) condensed; last reported at 301/500 - ETA: 49s - loss: 1.2617 - regression_loss: 1.0699 - classification_loss: 0.1919]
302/500 [=================>............] 
- ETA: 49s - loss: 1.2614 - regression_loss: 1.0694 - classification_loss: 0.1920 303/500 [=================>............] - ETA: 49s - loss: 1.2614 - regression_loss: 1.0694 - classification_loss: 0.1920 304/500 [=================>............] - ETA: 49s - loss: 1.2594 - regression_loss: 1.0677 - classification_loss: 0.1917 305/500 [=================>............] - ETA: 48s - loss: 1.2567 - regression_loss: 1.0655 - classification_loss: 0.1913 306/500 [=================>............] - ETA: 48s - loss: 1.2560 - regression_loss: 1.0650 - classification_loss: 0.1910 307/500 [=================>............] - ETA: 48s - loss: 1.2535 - regression_loss: 1.0630 - classification_loss: 0.1905 308/500 [=================>............] - ETA: 48s - loss: 1.2516 - regression_loss: 1.0613 - classification_loss: 0.1903 309/500 [=================>............] - ETA: 47s - loss: 1.2510 - regression_loss: 1.0608 - classification_loss: 0.1902 310/500 [=================>............] - ETA: 47s - loss: 1.2501 - regression_loss: 1.0601 - classification_loss: 0.1901 311/500 [=================>............] - ETA: 47s - loss: 1.2511 - regression_loss: 1.0607 - classification_loss: 0.1903 312/500 [=================>............] - ETA: 47s - loss: 1.2522 - regression_loss: 1.0618 - classification_loss: 0.1905 313/500 [=================>............] - ETA: 46s - loss: 1.2520 - regression_loss: 1.0617 - classification_loss: 0.1904 314/500 [=================>............] - ETA: 46s - loss: 1.2524 - regression_loss: 1.0620 - classification_loss: 0.1904 315/500 [=================>............] - ETA: 46s - loss: 1.2519 - regression_loss: 1.0617 - classification_loss: 0.1903 316/500 [=================>............] - ETA: 46s - loss: 1.2500 - regression_loss: 1.0601 - classification_loss: 0.1899 317/500 [==================>...........] - ETA: 45s - loss: 1.2480 - regression_loss: 1.0585 - classification_loss: 0.1895 318/500 [==================>...........] 
- ETA: 45s - loss: 1.2476 - regression_loss: 1.0582 - classification_loss: 0.1894 319/500 [==================>...........] - ETA: 45s - loss: 1.2471 - regression_loss: 1.0577 - classification_loss: 0.1894 320/500 [==================>...........] - ETA: 45s - loss: 1.2462 - regression_loss: 1.0565 - classification_loss: 0.1897 321/500 [==================>...........] - ETA: 44s - loss: 1.2461 - regression_loss: 1.0565 - classification_loss: 0.1896 322/500 [==================>...........] - ETA: 44s - loss: 1.2462 - regression_loss: 1.0564 - classification_loss: 0.1898 323/500 [==================>...........] - ETA: 44s - loss: 1.2482 - regression_loss: 1.0580 - classification_loss: 0.1902 324/500 [==================>...........] - ETA: 44s - loss: 1.2493 - regression_loss: 1.0588 - classification_loss: 0.1905 325/500 [==================>...........] - ETA: 43s - loss: 1.2506 - regression_loss: 1.0597 - classification_loss: 0.1908 326/500 [==================>...........] - ETA: 43s - loss: 1.2483 - regression_loss: 1.0578 - classification_loss: 0.1905 327/500 [==================>...........] - ETA: 43s - loss: 1.2487 - regression_loss: 1.0583 - classification_loss: 0.1905 328/500 [==================>...........] - ETA: 43s - loss: 1.2495 - regression_loss: 1.0590 - classification_loss: 0.1905 329/500 [==================>...........] - ETA: 42s - loss: 1.2496 - regression_loss: 1.0592 - classification_loss: 0.1904 330/500 [==================>...........] - ETA: 42s - loss: 1.2511 - regression_loss: 1.0607 - classification_loss: 0.1903 331/500 [==================>...........] - ETA: 42s - loss: 1.2483 - regression_loss: 1.0584 - classification_loss: 0.1898 332/500 [==================>...........] - ETA: 42s - loss: 1.2481 - regression_loss: 1.0584 - classification_loss: 0.1897 333/500 [==================>...........] - ETA: 41s - loss: 1.2452 - regression_loss: 1.0560 - classification_loss: 0.1893 334/500 [===================>..........] 
- ETA: 41s - loss: 1.2447 - regression_loss: 1.0557 - classification_loss: 0.1891 335/500 [===================>..........] - ETA: 41s - loss: 1.2438 - regression_loss: 1.0550 - classification_loss: 0.1888 336/500 [===================>..........] - ETA: 41s - loss: 1.2447 - regression_loss: 1.0559 - classification_loss: 0.1888 337/500 [===================>..........] - ETA: 40s - loss: 1.2451 - regression_loss: 1.0561 - classification_loss: 0.1890 338/500 [===================>..........] - ETA: 40s - loss: 1.2465 - regression_loss: 1.0574 - classification_loss: 0.1891 339/500 [===================>..........] - ETA: 40s - loss: 1.2470 - regression_loss: 1.0579 - classification_loss: 0.1891 340/500 [===================>..........] - ETA: 40s - loss: 1.2455 - regression_loss: 1.0568 - classification_loss: 0.1887 341/500 [===================>..........] - ETA: 39s - loss: 1.2456 - regression_loss: 1.0570 - classification_loss: 0.1886 342/500 [===================>..........] - ETA: 39s - loss: 1.2459 - regression_loss: 1.0571 - classification_loss: 0.1888 343/500 [===================>..........] - ETA: 39s - loss: 1.2468 - regression_loss: 1.0578 - classification_loss: 0.1889 344/500 [===================>..........] - ETA: 39s - loss: 1.2481 - regression_loss: 1.0590 - classification_loss: 0.1891 345/500 [===================>..........] - ETA: 38s - loss: 1.2472 - regression_loss: 1.0581 - classification_loss: 0.1890 346/500 [===================>..........] - ETA: 38s - loss: 1.2460 - regression_loss: 1.0573 - classification_loss: 0.1887 347/500 [===================>..........] - ETA: 38s - loss: 1.2476 - regression_loss: 1.0586 - classification_loss: 0.1890 348/500 [===================>..........] - ETA: 38s - loss: 1.2460 - regression_loss: 1.0573 - classification_loss: 0.1887 349/500 [===================>..........] - ETA: 37s - loss: 1.2459 - regression_loss: 1.0573 - classification_loss: 0.1886 350/500 [====================>.........] 
- ETA: 37s - loss: 1.2464 - regression_loss: 1.0577 - classification_loss: 0.1887 351/500 [====================>.........] - ETA: 37s - loss: 1.2446 - regression_loss: 1.0562 - classification_loss: 0.1884 352/500 [====================>.........] - ETA: 37s - loss: 1.2452 - regression_loss: 1.0568 - classification_loss: 0.1884 353/500 [====================>.........] - ETA: 36s - loss: 1.2469 - regression_loss: 1.0581 - classification_loss: 0.1887 354/500 [====================>.........] - ETA: 36s - loss: 1.2469 - regression_loss: 1.0582 - classification_loss: 0.1887 355/500 [====================>.........] - ETA: 36s - loss: 1.2456 - regression_loss: 1.0571 - classification_loss: 0.1885 356/500 [====================>.........] - ETA: 36s - loss: 1.2463 - regression_loss: 1.0576 - classification_loss: 0.1886 357/500 [====================>.........] - ETA: 35s - loss: 1.2463 - regression_loss: 1.0577 - classification_loss: 0.1886 358/500 [====================>.........] - ETA: 35s - loss: 1.2471 - regression_loss: 1.0582 - classification_loss: 0.1889 359/500 [====================>.........] - ETA: 35s - loss: 1.2458 - regression_loss: 1.0571 - classification_loss: 0.1886 360/500 [====================>.........] - ETA: 35s - loss: 1.2473 - regression_loss: 1.0583 - classification_loss: 0.1890 361/500 [====================>.........] - ETA: 34s - loss: 1.2480 - regression_loss: 1.0589 - classification_loss: 0.1891 362/500 [====================>.........] - ETA: 34s - loss: 1.2470 - regression_loss: 1.0581 - classification_loss: 0.1889 363/500 [====================>.........] - ETA: 34s - loss: 1.2472 - regression_loss: 1.0585 - classification_loss: 0.1887 364/500 [====================>.........] - ETA: 34s - loss: 1.2479 - regression_loss: 1.0591 - classification_loss: 0.1888 365/500 [====================>.........] - ETA: 33s - loss: 1.2492 - regression_loss: 1.0602 - classification_loss: 0.1890 366/500 [====================>.........] 
- ETA: 33s - loss: 1.2489 - regression_loss: 1.0600 - classification_loss: 0.1889 367/500 [=====================>........] - ETA: 33s - loss: 1.2494 - regression_loss: 1.0600 - classification_loss: 0.1894 368/500 [=====================>........] - ETA: 33s - loss: 1.2493 - regression_loss: 1.0600 - classification_loss: 0.1894 369/500 [=====================>........] - ETA: 32s - loss: 1.2495 - regression_loss: 1.0602 - classification_loss: 0.1893 370/500 [=====================>........] - ETA: 32s - loss: 1.2496 - regression_loss: 1.0602 - classification_loss: 0.1894 371/500 [=====================>........] - ETA: 32s - loss: 1.2481 - regression_loss: 1.0588 - classification_loss: 0.1892 372/500 [=====================>........] - ETA: 32s - loss: 1.2460 - regression_loss: 1.0571 - classification_loss: 0.1889 373/500 [=====================>........] - ETA: 31s - loss: 1.2468 - regression_loss: 1.0577 - classification_loss: 0.1891 374/500 [=====================>........] - ETA: 31s - loss: 1.2469 - regression_loss: 1.0577 - classification_loss: 0.1892 375/500 [=====================>........] - ETA: 31s - loss: 1.2478 - regression_loss: 1.0585 - classification_loss: 0.1892 376/500 [=====================>........] - ETA: 31s - loss: 1.2474 - regression_loss: 1.0583 - classification_loss: 0.1891 377/500 [=====================>........] - ETA: 30s - loss: 1.2486 - regression_loss: 1.0593 - classification_loss: 0.1893 378/500 [=====================>........] - ETA: 30s - loss: 1.2473 - regression_loss: 1.0582 - classification_loss: 0.1891 379/500 [=====================>........] - ETA: 30s - loss: 1.2475 - regression_loss: 1.0584 - classification_loss: 0.1891 380/500 [=====================>........] - ETA: 30s - loss: 1.2471 - regression_loss: 1.0581 - classification_loss: 0.1890 381/500 [=====================>........] - ETA: 29s - loss: 1.2477 - regression_loss: 1.0587 - classification_loss: 0.1890 382/500 [=====================>........] 
- ETA: 29s - loss: 1.2472 - regression_loss: 1.0583 - classification_loss: 0.1890 383/500 [=====================>........] - ETA: 29s - loss: 1.2478 - regression_loss: 1.0587 - classification_loss: 0.1890 384/500 [======================>.......] - ETA: 29s - loss: 1.2480 - regression_loss: 1.0590 - classification_loss: 0.1890 385/500 [======================>.......] - ETA: 28s - loss: 1.2485 - regression_loss: 1.0594 - classification_loss: 0.1891 386/500 [======================>.......] - ETA: 28s - loss: 1.2501 - regression_loss: 1.0605 - classification_loss: 0.1896 387/500 [======================>.......] - ETA: 28s - loss: 1.2504 - regression_loss: 1.0609 - classification_loss: 0.1896 388/500 [======================>.......] - ETA: 28s - loss: 1.2503 - regression_loss: 1.0607 - classification_loss: 0.1896 389/500 [======================>.......] - ETA: 27s - loss: 1.2510 - regression_loss: 1.0613 - classification_loss: 0.1897 390/500 [======================>.......] - ETA: 27s - loss: 1.2522 - regression_loss: 1.0622 - classification_loss: 0.1899 391/500 [======================>.......] - ETA: 27s - loss: 1.2533 - regression_loss: 1.0630 - classification_loss: 0.1903 392/500 [======================>.......] - ETA: 27s - loss: 1.2541 - regression_loss: 1.0638 - classification_loss: 0.1904 393/500 [======================>.......] - ETA: 26s - loss: 1.2543 - regression_loss: 1.0639 - classification_loss: 0.1904 394/500 [======================>.......] - ETA: 26s - loss: 1.2543 - regression_loss: 1.0640 - classification_loss: 0.1903 395/500 [======================>.......] - ETA: 26s - loss: 1.2545 - regression_loss: 1.0642 - classification_loss: 0.1903 396/500 [======================>.......] - ETA: 26s - loss: 1.2549 - regression_loss: 1.0645 - classification_loss: 0.1904 397/500 [======================>.......] - ETA: 25s - loss: 1.2535 - regression_loss: 1.0634 - classification_loss: 0.1902 398/500 [======================>.......] 
- ETA: 25s - loss: 1.2525 - regression_loss: 1.0625 - classification_loss: 0.1901 399/500 [======================>.......] - ETA: 25s - loss: 1.2518 - regression_loss: 1.0618 - classification_loss: 0.1900 400/500 [=======================>......] - ETA: 25s - loss: 1.2511 - regression_loss: 1.0612 - classification_loss: 0.1899 401/500 [=======================>......] - ETA: 24s - loss: 1.2523 - regression_loss: 1.0621 - classification_loss: 0.1902 402/500 [=======================>......] - ETA: 24s - loss: 1.2523 - regression_loss: 1.0620 - classification_loss: 0.1903 403/500 [=======================>......] - ETA: 24s - loss: 1.2536 - regression_loss: 1.0631 - classification_loss: 0.1905 404/500 [=======================>......] - ETA: 24s - loss: 1.2521 - regression_loss: 1.0618 - classification_loss: 0.1904 405/500 [=======================>......] - ETA: 23s - loss: 1.2523 - regression_loss: 1.0620 - classification_loss: 0.1903 406/500 [=======================>......] - ETA: 23s - loss: 1.2526 - regression_loss: 1.0623 - classification_loss: 0.1903 407/500 [=======================>......] - ETA: 23s - loss: 1.2506 - regression_loss: 1.0607 - classification_loss: 0.1900 408/500 [=======================>......] - ETA: 23s - loss: 1.2509 - regression_loss: 1.0609 - classification_loss: 0.1900 409/500 [=======================>......] - ETA: 22s - loss: 1.2523 - regression_loss: 1.0622 - classification_loss: 0.1901 410/500 [=======================>......] - ETA: 22s - loss: 1.2518 - regression_loss: 1.0616 - classification_loss: 0.1902 411/500 [=======================>......] - ETA: 22s - loss: 1.2505 - regression_loss: 1.0606 - classification_loss: 0.1899 412/500 [=======================>......] - ETA: 22s - loss: 1.2516 - regression_loss: 1.0615 - classification_loss: 0.1901 413/500 [=======================>......] - ETA: 21s - loss: 1.2513 - regression_loss: 1.0612 - classification_loss: 0.1901 414/500 [=======================>......] 
- ETA: 21s - loss: 1.2517 - regression_loss: 1.0615 - classification_loss: 0.1902 415/500 [=======================>......] - ETA: 21s - loss: 1.2520 - regression_loss: 1.0618 - classification_loss: 0.1902 416/500 [=======================>......] - ETA: 21s - loss: 1.2523 - regression_loss: 1.0621 - classification_loss: 0.1903 417/500 [========================>.....] - ETA: 20s - loss: 1.2506 - regression_loss: 1.0606 - classification_loss: 0.1900 418/500 [========================>.....] - ETA: 20s - loss: 1.2491 - regression_loss: 1.0593 - classification_loss: 0.1898 419/500 [========================>.....] - ETA: 20s - loss: 1.2494 - regression_loss: 1.0596 - classification_loss: 0.1898 420/500 [========================>.....] - ETA: 20s - loss: 1.2492 - regression_loss: 1.0594 - classification_loss: 0.1898 421/500 [========================>.....] - ETA: 19s - loss: 1.2509 - regression_loss: 1.0608 - classification_loss: 0.1901 422/500 [========================>.....] - ETA: 19s - loss: 1.2514 - regression_loss: 1.0611 - classification_loss: 0.1902 423/500 [========================>.....] - ETA: 19s - loss: 1.2508 - regression_loss: 1.0605 - classification_loss: 0.1903 424/500 [========================>.....] - ETA: 19s - loss: 1.2505 - regression_loss: 1.0603 - classification_loss: 0.1902 425/500 [========================>.....] - ETA: 18s - loss: 1.2511 - regression_loss: 1.0609 - classification_loss: 0.1902 426/500 [========================>.....] - ETA: 18s - loss: 1.2546 - regression_loss: 1.0640 - classification_loss: 0.1906 427/500 [========================>.....] - ETA: 18s - loss: 1.2556 - regression_loss: 1.0648 - classification_loss: 0.1908 428/500 [========================>.....] - ETA: 18s - loss: 1.2559 - regression_loss: 1.0650 - classification_loss: 0.1909 429/500 [========================>.....] - ETA: 17s - loss: 1.2565 - regression_loss: 1.0656 - classification_loss: 0.1909 430/500 [========================>.....] 
- ETA: 17s - loss: 1.2567 - regression_loss: 1.0658 - classification_loss: 0.1909 431/500 [========================>.....] - ETA: 17s - loss: 1.2570 - regression_loss: 1.0661 - classification_loss: 0.1908 432/500 [========================>.....] - ETA: 17s - loss: 1.2560 - regression_loss: 1.0654 - classification_loss: 0.1907 433/500 [========================>.....] - ETA: 16s - loss: 1.2569 - regression_loss: 1.0660 - classification_loss: 0.1909 434/500 [=========================>....] - ETA: 16s - loss: 1.2550 - regression_loss: 1.0645 - classification_loss: 0.1905 435/500 [=========================>....] - ETA: 16s - loss: 1.2558 - regression_loss: 1.0651 - classification_loss: 0.1907 436/500 [=========================>....] - ETA: 16s - loss: 1.2552 - regression_loss: 1.0647 - classification_loss: 0.1905 437/500 [=========================>....] - ETA: 15s - loss: 1.2548 - regression_loss: 1.0644 - classification_loss: 0.1904 438/500 [=========================>....] - ETA: 15s - loss: 1.2543 - regression_loss: 1.0640 - classification_loss: 0.1903 439/500 [=========================>....] - ETA: 15s - loss: 1.2530 - regression_loss: 1.0629 - classification_loss: 0.1900 440/500 [=========================>....] - ETA: 15s - loss: 1.2528 - regression_loss: 1.0627 - classification_loss: 0.1901 441/500 [=========================>....] - ETA: 14s - loss: 1.2511 - regression_loss: 1.0612 - classification_loss: 0.1899 442/500 [=========================>....] - ETA: 14s - loss: 1.2517 - regression_loss: 1.0618 - classification_loss: 0.1900 443/500 [=========================>....] - ETA: 14s - loss: 1.2525 - regression_loss: 1.0623 - classification_loss: 0.1902 444/500 [=========================>....] - ETA: 14s - loss: 1.2532 - regression_loss: 1.0629 - classification_loss: 0.1903 445/500 [=========================>....] - ETA: 13s - loss: 1.2528 - regression_loss: 1.0626 - classification_loss: 0.1902 446/500 [=========================>....] 
- ETA: 13s - loss: 1.2523 - regression_loss: 1.0622 - classification_loss: 0.1902 447/500 [=========================>....] - ETA: 13s - loss: 1.2518 - regression_loss: 1.0617 - classification_loss: 0.1901 448/500 [=========================>....] - ETA: 13s - loss: 1.2512 - regression_loss: 1.0612 - classification_loss: 0.1900 449/500 [=========================>....] - ETA: 12s - loss: 1.2517 - regression_loss: 1.0616 - classification_loss: 0.1900 450/500 [==========================>...] - ETA: 12s - loss: 1.2508 - regression_loss: 1.0609 - classification_loss: 0.1899 451/500 [==========================>...] - ETA: 12s - loss: 1.2508 - regression_loss: 1.0609 - classification_loss: 0.1899 452/500 [==========================>...] - ETA: 12s - loss: 1.2509 - regression_loss: 1.0611 - classification_loss: 0.1899 453/500 [==========================>...] - ETA: 11s - loss: 1.2507 - regression_loss: 1.0610 - classification_loss: 0.1897 454/500 [==========================>...] - ETA: 11s - loss: 1.2511 - regression_loss: 1.0613 - classification_loss: 0.1898 455/500 [==========================>...] - ETA: 11s - loss: 1.2520 - regression_loss: 1.0621 - classification_loss: 0.1899 456/500 [==========================>...] - ETA: 11s - loss: 1.2522 - regression_loss: 1.0623 - classification_loss: 0.1900 457/500 [==========================>...] - ETA: 10s - loss: 1.2524 - regression_loss: 1.0624 - classification_loss: 0.1900 458/500 [==========================>...] - ETA: 10s - loss: 1.2519 - regression_loss: 1.0621 - classification_loss: 0.1898 459/500 [==========================>...] - ETA: 10s - loss: 1.2508 - regression_loss: 1.0612 - classification_loss: 0.1896 460/500 [==========================>...] - ETA: 10s - loss: 1.2508 - regression_loss: 1.0613 - classification_loss: 0.1895 461/500 [==========================>...] - ETA: 9s - loss: 1.2513 - regression_loss: 1.0618 - classification_loss: 0.1895  462/500 [==========================>...] 
- ETA: 9s - loss: 1.2516 - regression_loss: 1.0620 - classification_loss: 0.1896 463/500 [==========================>...] - ETA: 9s - loss: 1.2509 - regression_loss: 1.0616 - classification_loss: 0.1894 464/500 [==========================>...] - ETA: 9s - loss: 1.2511 - regression_loss: 1.0616 - classification_loss: 0.1894 465/500 [==========================>...] - ETA: 8s - loss: 1.2516 - regression_loss: 1.0621 - classification_loss: 0.1894 466/500 [==========================>...] - ETA: 8s - loss: 1.2514 - regression_loss: 1.0620 - classification_loss: 0.1894 467/500 [===========================>..] - ETA: 8s - loss: 1.2512 - regression_loss: 1.0619 - classification_loss: 0.1892 468/500 [===========================>..] - ETA: 8s - loss: 1.2497 - regression_loss: 1.0607 - classification_loss: 0.1890 469/500 [===========================>..] - ETA: 7s - loss: 1.2497 - regression_loss: 1.0607 - classification_loss: 0.1890 470/500 [===========================>..] - ETA: 7s - loss: 1.2509 - regression_loss: 1.0616 - classification_loss: 0.1893 471/500 [===========================>..] - ETA: 7s - loss: 1.2501 - regression_loss: 1.0610 - classification_loss: 0.1892 472/500 [===========================>..] - ETA: 7s - loss: 1.2503 - regression_loss: 1.0611 - classification_loss: 0.1892 473/500 [===========================>..] - ETA: 6s - loss: 1.2505 - regression_loss: 1.0612 - classification_loss: 0.1892 474/500 [===========================>..] - ETA: 6s - loss: 1.2496 - regression_loss: 1.0606 - classification_loss: 0.1891 475/500 [===========================>..] - ETA: 6s - loss: 1.2500 - regression_loss: 1.0609 - classification_loss: 0.1892 476/500 [===========================>..] - ETA: 6s - loss: 1.2499 - regression_loss: 1.0607 - classification_loss: 0.1892 477/500 [===========================>..] - ETA: 5s - loss: 1.2508 - regression_loss: 1.0614 - classification_loss: 0.1894 478/500 [===========================>..] 
- ETA: 5s - loss: 1.2519 - regression_loss: 1.0623 - classification_loss: 0.1896 479/500 [===========================>..] - ETA: 5s - loss: 1.2529 - regression_loss: 1.0632 - classification_loss: 0.1897 480/500 [===========================>..] - ETA: 5s - loss: 1.2537 - regression_loss: 1.0642 - classification_loss: 0.1895 481/500 [===========================>..] - ETA: 4s - loss: 1.2537 - regression_loss: 1.0641 - classification_loss: 0.1896 482/500 [===========================>..] - ETA: 4s - loss: 1.2525 - regression_loss: 1.0631 - classification_loss: 0.1894 483/500 [===========================>..] - ETA: 4s - loss: 1.2521 - regression_loss: 1.0627 - classification_loss: 0.1893 484/500 [============================>.] - ETA: 4s - loss: 1.2518 - regression_loss: 1.0626 - classification_loss: 0.1893 485/500 [============================>.] - ETA: 3s - loss: 1.2523 - regression_loss: 1.0630 - classification_loss: 0.1893 486/500 [============================>.] - ETA: 3s - loss: 1.2526 - regression_loss: 1.0633 - classification_loss: 0.1892 487/500 [============================>.] - ETA: 3s - loss: 1.2534 - regression_loss: 1.0640 - classification_loss: 0.1894 488/500 [============================>.] - ETA: 3s - loss: 1.2519 - regression_loss: 1.0628 - classification_loss: 0.1891 489/500 [============================>.] - ETA: 2s - loss: 1.2511 - regression_loss: 1.0621 - classification_loss: 0.1890 490/500 [============================>.] - ETA: 2s - loss: 1.2505 - regression_loss: 1.0617 - classification_loss: 0.1888 491/500 [============================>.] - ETA: 2s - loss: 1.2512 - regression_loss: 1.0623 - classification_loss: 0.1888 492/500 [============================>.] - ETA: 2s - loss: 1.2512 - regression_loss: 1.0623 - classification_loss: 0.1889 493/500 [============================>.] - ETA: 1s - loss: 1.2520 - regression_loss: 1.0629 - classification_loss: 0.1891 494/500 [============================>.] 
- ETA: 1s - loss: 1.2522 - regression_loss: 1.0630 - classification_loss: 0.1891 495/500 [============================>.] - ETA: 1s - loss: 1.2509 - regression_loss: 1.0621 - classification_loss: 0.1888 496/500 [============================>.] - ETA: 1s - loss: 1.2506 - regression_loss: 1.0618 - classification_loss: 0.1888 497/500 [============================>.] - ETA: 0s - loss: 1.2502 - regression_loss: 1.0616 - classification_loss: 0.1887 498/500 [============================>.] - ETA: 0s - loss: 1.2485 - regression_loss: 1.0602 - classification_loss: 0.1883 499/500 [============================>.] - ETA: 0s - loss: 1.2482 - regression_loss: 1.0599 - classification_loss: 0.1883 500/500 [==============================] - 126s 252ms/step - loss: 1.2476 - regression_loss: 1.0593 - classification_loss: 0.1883 1172 instances of class plum with average precision: 0.6999 mAP: 0.6999 Epoch 00130: saving model to ./training/snapshots/resnet50_pascal_130.h5 Epoch 131/150 1/500 [..............................] - ETA: 1:59 - loss: 1.9965 - regression_loss: 1.5922 - classification_loss: 0.4043 2/500 [..............................] - ETA: 2:01 - loss: 1.2145 - regression_loss: 0.9896 - classification_loss: 0.2250 3/500 [..............................] - ETA: 2:02 - loss: 1.3834 - regression_loss: 1.1055 - classification_loss: 0.2779 4/500 [..............................] - ETA: 2:02 - loss: 1.2871 - regression_loss: 1.0484 - classification_loss: 0.2387 5/500 [..............................] - ETA: 2:02 - loss: 1.3249 - regression_loss: 1.0806 - classification_loss: 0.2442 6/500 [..............................] - ETA: 2:02 - loss: 1.2670 - regression_loss: 1.0347 - classification_loss: 0.2323 7/500 [..............................] - ETA: 2:02 - loss: 1.2907 - regression_loss: 1.0485 - classification_loss: 0.2422 8/500 [..............................] - ETA: 2:02 - loss: 1.2460 - regression_loss: 1.0098 - classification_loss: 0.2361 9/500 [..............................] 
- ETA: 2:02 - loss: 1.1660 - regression_loss: 0.9475 - classification_loss: 0.2185
...
72/500 [==>...........................] - ETA: 1:47 - loss: 1.1932 - regression_loss: 1.0070 - classification_loss: 0.1862
73/500 [===>..........................]
- ETA: 1:47 - loss: 1.1959 - regression_loss: 1.0097 - classification_loss: 0.1863 74/500 [===>..........................] - ETA: 1:47 - loss: 1.1887 - regression_loss: 1.0035 - classification_loss: 0.1852 75/500 [===>..........................] - ETA: 1:46 - loss: 1.1970 - regression_loss: 1.0105 - classification_loss: 0.1866 76/500 [===>..........................] - ETA: 1:46 - loss: 1.2035 - regression_loss: 1.0165 - classification_loss: 0.1870 77/500 [===>..........................] - ETA: 1:46 - loss: 1.2076 - regression_loss: 1.0199 - classification_loss: 0.1877 78/500 [===>..........................] - ETA: 1:46 - loss: 1.2144 - regression_loss: 1.0269 - classification_loss: 0.1875 79/500 [===>..........................] - ETA: 1:45 - loss: 1.2184 - regression_loss: 1.0301 - classification_loss: 0.1883 80/500 [===>..........................] - ETA: 1:45 - loss: 1.2207 - regression_loss: 1.0317 - classification_loss: 0.1890 81/500 [===>..........................] - ETA: 1:45 - loss: 1.2202 - regression_loss: 1.0317 - classification_loss: 0.1885 82/500 [===>..........................] - ETA: 1:45 - loss: 1.2210 - regression_loss: 1.0323 - classification_loss: 0.1887 83/500 [===>..........................] - ETA: 1:45 - loss: 1.2134 - regression_loss: 1.0260 - classification_loss: 0.1874 84/500 [====>.........................] - ETA: 1:44 - loss: 1.2031 - regression_loss: 1.0174 - classification_loss: 0.1857 85/500 [====>.........................] - ETA: 1:44 - loss: 1.2046 - regression_loss: 1.0191 - classification_loss: 0.1855 86/500 [====>.........................] - ETA: 1:44 - loss: 1.2059 - regression_loss: 1.0201 - classification_loss: 0.1858 87/500 [====>.........................] - ETA: 1:44 - loss: 1.2002 - regression_loss: 1.0152 - classification_loss: 0.1849 88/500 [====>.........................] - ETA: 1:43 - loss: 1.2017 - regression_loss: 1.0164 - classification_loss: 0.1853 89/500 [====>.........................] 
- ETA: 1:43 - loss: 1.1963 - regression_loss: 1.0118 - classification_loss: 0.1845 90/500 [====>.........................] - ETA: 1:43 - loss: 1.1961 - regression_loss: 1.0120 - classification_loss: 0.1841 91/500 [====>.........................] - ETA: 1:42 - loss: 1.1961 - regression_loss: 1.0114 - classification_loss: 0.1847 92/500 [====>.........................] - ETA: 1:42 - loss: 1.1993 - regression_loss: 1.0137 - classification_loss: 0.1857 93/500 [====>.........................] - ETA: 1:42 - loss: 1.1997 - regression_loss: 1.0142 - classification_loss: 0.1855 94/500 [====>.........................] - ETA: 1:42 - loss: 1.2012 - regression_loss: 1.0148 - classification_loss: 0.1864 95/500 [====>.........................] - ETA: 1:41 - loss: 1.2064 - regression_loss: 1.0186 - classification_loss: 0.1877 96/500 [====>.........................] - ETA: 1:41 - loss: 1.2071 - regression_loss: 1.0189 - classification_loss: 0.1882 97/500 [====>.........................] - ETA: 1:41 - loss: 1.2026 - regression_loss: 1.0150 - classification_loss: 0.1876 98/500 [====>.........................] - ETA: 1:41 - loss: 1.2085 - regression_loss: 1.0193 - classification_loss: 0.1892 99/500 [====>.........................] - ETA: 1:41 - loss: 1.2122 - regression_loss: 1.0223 - classification_loss: 0.1899 100/500 [=====>........................] - ETA: 1:40 - loss: 1.2170 - regression_loss: 1.0257 - classification_loss: 0.1913 101/500 [=====>........................] - ETA: 1:40 - loss: 1.2139 - regression_loss: 1.0235 - classification_loss: 0.1903 102/500 [=====>........................] - ETA: 1:40 - loss: 1.2127 - regression_loss: 1.0231 - classification_loss: 0.1897 103/500 [=====>........................] - ETA: 1:40 - loss: 1.2154 - regression_loss: 1.0255 - classification_loss: 0.1899 104/500 [=====>........................] - ETA: 1:39 - loss: 1.2131 - regression_loss: 1.0231 - classification_loss: 0.1900 105/500 [=====>........................] 
- ETA: 1:39 - loss: 1.2166 - regression_loss: 1.0260 - classification_loss: 0.1905 106/500 [=====>........................] - ETA: 1:39 - loss: 1.2180 - regression_loss: 1.0274 - classification_loss: 0.1906 107/500 [=====>........................] - ETA: 1:38 - loss: 1.2218 - regression_loss: 1.0312 - classification_loss: 0.1906 108/500 [=====>........................] - ETA: 1:38 - loss: 1.2236 - regression_loss: 1.0327 - classification_loss: 0.1909 109/500 [=====>........................] - ETA: 1:38 - loss: 1.2191 - regression_loss: 1.0291 - classification_loss: 0.1901 110/500 [=====>........................] - ETA: 1:38 - loss: 1.2196 - regression_loss: 1.0293 - classification_loss: 0.1902 111/500 [=====>........................] - ETA: 1:37 - loss: 1.2141 - regression_loss: 1.0252 - classification_loss: 0.1889 112/500 [=====>........................] - ETA: 1:37 - loss: 1.2121 - regression_loss: 1.0235 - classification_loss: 0.1886 113/500 [=====>........................] - ETA: 1:37 - loss: 1.2160 - regression_loss: 1.0254 - classification_loss: 0.1906 114/500 [=====>........................] - ETA: 1:37 - loss: 1.2163 - regression_loss: 1.0255 - classification_loss: 0.1908 115/500 [=====>........................] - ETA: 1:36 - loss: 1.2146 - regression_loss: 1.0244 - classification_loss: 0.1902 116/500 [=====>........................] - ETA: 1:36 - loss: 1.2168 - regression_loss: 1.0264 - classification_loss: 0.1904 117/500 [======>.......................] - ETA: 1:36 - loss: 1.2159 - regression_loss: 1.0259 - classification_loss: 0.1900 118/500 [======>.......................] - ETA: 1:36 - loss: 1.2230 - regression_loss: 1.0320 - classification_loss: 0.1910 119/500 [======>.......................] - ETA: 1:35 - loss: 1.2178 - regression_loss: 1.0280 - classification_loss: 0.1898 120/500 [======>.......................] - ETA: 1:35 - loss: 1.2166 - regression_loss: 1.0275 - classification_loss: 0.1891 121/500 [======>.......................] 
- ETA: 1:35 - loss: 1.2184 - regression_loss: 1.0288 - classification_loss: 0.1896 122/500 [======>.......................] - ETA: 1:34 - loss: 1.2187 - regression_loss: 1.0291 - classification_loss: 0.1896 123/500 [======>.......................] - ETA: 1:34 - loss: 1.2145 - regression_loss: 1.0257 - classification_loss: 0.1887 124/500 [======>.......................] - ETA: 1:34 - loss: 1.2097 - regression_loss: 1.0219 - classification_loss: 0.1877 125/500 [======>.......................] - ETA: 1:33 - loss: 1.2149 - regression_loss: 1.0262 - classification_loss: 0.1887 126/500 [======>.......................] - ETA: 1:33 - loss: 1.2203 - regression_loss: 1.0300 - classification_loss: 0.1903 127/500 [======>.......................] - ETA: 1:33 - loss: 1.2206 - regression_loss: 1.0305 - classification_loss: 0.1900 128/500 [======>.......................] - ETA: 1:33 - loss: 1.2163 - regression_loss: 1.0272 - classification_loss: 0.1891 129/500 [======>.......................] - ETA: 1:32 - loss: 1.2180 - regression_loss: 1.0290 - classification_loss: 0.1891 130/500 [======>.......................] - ETA: 1:32 - loss: 1.2222 - regression_loss: 1.0318 - classification_loss: 0.1903 131/500 [======>.......................] - ETA: 1:32 - loss: 1.2236 - regression_loss: 1.0330 - classification_loss: 0.1906 132/500 [======>.......................] - ETA: 1:32 - loss: 1.2216 - regression_loss: 1.0314 - classification_loss: 0.1902 133/500 [======>.......................] - ETA: 1:31 - loss: 1.2178 - regression_loss: 1.0285 - classification_loss: 0.1893 134/500 [=======>......................] - ETA: 1:31 - loss: 1.2158 - regression_loss: 1.0271 - classification_loss: 0.1886 135/500 [=======>......................] - ETA: 1:31 - loss: 1.2138 - regression_loss: 1.0256 - classification_loss: 0.1882 136/500 [=======>......................] - ETA: 1:31 - loss: 1.2160 - regression_loss: 1.0275 - classification_loss: 0.1885 137/500 [=======>......................] 
- ETA: 1:30 - loss: 1.2190 - regression_loss: 1.0303 - classification_loss: 0.1888 138/500 [=======>......................] - ETA: 1:30 - loss: 1.2202 - regression_loss: 1.0309 - classification_loss: 0.1893 139/500 [=======>......................] - ETA: 1:30 - loss: 1.2198 - regression_loss: 1.0307 - classification_loss: 0.1891 140/500 [=======>......................] - ETA: 1:30 - loss: 1.2215 - regression_loss: 1.0319 - classification_loss: 0.1896 141/500 [=======>......................] - ETA: 1:29 - loss: 1.2174 - regression_loss: 1.0288 - classification_loss: 0.1886 142/500 [=======>......................] - ETA: 1:29 - loss: 1.2194 - regression_loss: 1.0304 - classification_loss: 0.1890 143/500 [=======>......................] - ETA: 1:29 - loss: 1.2204 - regression_loss: 1.0315 - classification_loss: 0.1889 144/500 [=======>......................] - ETA: 1:29 - loss: 1.2206 - regression_loss: 1.0318 - classification_loss: 0.1888 145/500 [=======>......................] - ETA: 1:28 - loss: 1.2165 - regression_loss: 1.0283 - classification_loss: 0.1881 146/500 [=======>......................] - ETA: 1:28 - loss: 1.2170 - regression_loss: 1.0282 - classification_loss: 0.1887 147/500 [=======>......................] - ETA: 1:28 - loss: 1.2222 - regression_loss: 1.0319 - classification_loss: 0.1903 148/500 [=======>......................] - ETA: 1:28 - loss: 1.2207 - regression_loss: 1.0309 - classification_loss: 0.1898 149/500 [=======>......................] - ETA: 1:27 - loss: 1.2225 - regression_loss: 1.0324 - classification_loss: 0.1901 150/500 [========>.....................] - ETA: 1:27 - loss: 1.2202 - regression_loss: 1.0306 - classification_loss: 0.1896 151/500 [========>.....................] - ETA: 1:27 - loss: 1.2195 - regression_loss: 1.0302 - classification_loss: 0.1893 152/500 [========>.....................] - ETA: 1:27 - loss: 1.2238 - regression_loss: 1.0335 - classification_loss: 0.1902 153/500 [========>.....................] 
- ETA: 1:26 - loss: 1.2252 - regression_loss: 1.0348 - classification_loss: 0.1904 154/500 [========>.....................] - ETA: 1:26 - loss: 1.2289 - regression_loss: 1.0377 - classification_loss: 0.1912 155/500 [========>.....................] - ETA: 1:26 - loss: 1.2295 - regression_loss: 1.0382 - classification_loss: 0.1912 156/500 [========>.....................] - ETA: 1:26 - loss: 1.2311 - regression_loss: 1.0392 - classification_loss: 0.1919 157/500 [========>.....................] - ETA: 1:25 - loss: 1.2320 - regression_loss: 1.0402 - classification_loss: 0.1918 158/500 [========>.....................] - ETA: 1:25 - loss: 1.2328 - regression_loss: 1.0410 - classification_loss: 0.1918 159/500 [========>.....................] - ETA: 1:25 - loss: 1.2312 - regression_loss: 1.0400 - classification_loss: 0.1912 160/500 [========>.....................] - ETA: 1:25 - loss: 1.2286 - regression_loss: 1.0376 - classification_loss: 0.1911 161/500 [========>.....................] - ETA: 1:24 - loss: 1.2296 - regression_loss: 1.0385 - classification_loss: 0.1911 162/500 [========>.....................] - ETA: 1:24 - loss: 1.2297 - regression_loss: 1.0389 - classification_loss: 0.1908 163/500 [========>.....................] - ETA: 1:24 - loss: 1.2317 - regression_loss: 1.0407 - classification_loss: 0.1909 164/500 [========>.....................] - ETA: 1:24 - loss: 1.2344 - regression_loss: 1.0430 - classification_loss: 0.1914 165/500 [========>.....................] - ETA: 1:23 - loss: 1.2313 - regression_loss: 1.0407 - classification_loss: 0.1906 166/500 [========>.....................] - ETA: 1:23 - loss: 1.2314 - regression_loss: 1.0406 - classification_loss: 0.1908 167/500 [=========>....................] - ETA: 1:23 - loss: 1.2320 - regression_loss: 1.0411 - classification_loss: 0.1909 168/500 [=========>....................] - ETA: 1:23 - loss: 1.2296 - regression_loss: 1.0393 - classification_loss: 0.1903 169/500 [=========>....................] 
- ETA: 1:22 - loss: 1.2306 - regression_loss: 1.0402 - classification_loss: 0.1904 170/500 [=========>....................] - ETA: 1:22 - loss: 1.2265 - regression_loss: 1.0368 - classification_loss: 0.1897 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2299 - regression_loss: 1.0394 - classification_loss: 0.1904 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2323 - regression_loss: 1.0416 - classification_loss: 0.1907 173/500 [=========>....................] - ETA: 1:21 - loss: 1.2336 - regression_loss: 1.0428 - classification_loss: 0.1908 174/500 [=========>....................] - ETA: 1:21 - loss: 1.2295 - regression_loss: 1.0391 - classification_loss: 0.1904 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2297 - regression_loss: 1.0394 - classification_loss: 0.1903 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2270 - regression_loss: 1.0374 - classification_loss: 0.1896 177/500 [=========>....................] - ETA: 1:20 - loss: 1.2252 - regression_loss: 1.0361 - classification_loss: 0.1891 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2253 - regression_loss: 1.0362 - classification_loss: 0.1892 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2243 - regression_loss: 1.0354 - classification_loss: 0.1890 180/500 [=========>....................] - ETA: 1:20 - loss: 1.2310 - regression_loss: 1.0413 - classification_loss: 0.1897 181/500 [=========>....................] - ETA: 1:19 - loss: 1.2298 - regression_loss: 1.0404 - classification_loss: 0.1895 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2309 - regression_loss: 1.0410 - classification_loss: 0.1898 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2271 - regression_loss: 1.0380 - classification_loss: 0.1892 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2238 - regression_loss: 1.0353 - classification_loss: 0.1885 185/500 [==========>...................] 
- ETA: 1:18 - loss: 1.2243 - regression_loss: 1.0356 - classification_loss: 0.1887 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2243 - regression_loss: 1.0355 - classification_loss: 0.1887 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2283 - regression_loss: 1.0392 - classification_loss: 0.1892 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2318 - regression_loss: 1.0421 - classification_loss: 0.1897 189/500 [==========>...................] - ETA: 1:17 - loss: 1.2337 - regression_loss: 1.0429 - classification_loss: 0.1908 190/500 [==========>...................] - ETA: 1:17 - loss: 1.2346 - regression_loss: 1.0437 - classification_loss: 0.1909 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2388 - regression_loss: 1.0469 - classification_loss: 0.1918 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2359 - regression_loss: 1.0446 - classification_loss: 0.1912 193/500 [==========>...................] - ETA: 1:16 - loss: 1.2353 - regression_loss: 1.0443 - classification_loss: 0.1911 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2372 - regression_loss: 1.0459 - classification_loss: 0.1913 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2384 - regression_loss: 1.0468 - classification_loss: 0.1917 196/500 [==========>...................] - ETA: 1:16 - loss: 1.2385 - regression_loss: 1.0461 - classification_loss: 0.1924 197/500 [==========>...................] - ETA: 1:15 - loss: 1.2358 - regression_loss: 1.0439 - classification_loss: 0.1919 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2378 - regression_loss: 1.0456 - classification_loss: 0.1922 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2355 - regression_loss: 1.0438 - classification_loss: 0.1917 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2358 - regression_loss: 1.0445 - classification_loss: 0.1914 201/500 [===========>..................] 
- ETA: 1:14 - loss: 1.2374 - regression_loss: 1.0456 - classification_loss: 0.1918 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2353 - regression_loss: 1.0442 - classification_loss: 0.1911 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2363 - regression_loss: 1.0450 - classification_loss: 0.1913 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2382 - regression_loss: 1.0464 - classification_loss: 0.1917 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2389 - regression_loss: 1.0472 - classification_loss: 0.1916 206/500 [===========>..................] - ETA: 1:13 - loss: 1.2400 - regression_loss: 1.0485 - classification_loss: 0.1915 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2386 - regression_loss: 1.0475 - classification_loss: 0.1911 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2406 - regression_loss: 1.0493 - classification_loss: 0.1914 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2414 - regression_loss: 1.0501 - classification_loss: 0.1913 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2383 - regression_loss: 1.0477 - classification_loss: 0.1906 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2364 - regression_loss: 1.0459 - classification_loss: 0.1904 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2320 - regression_loss: 1.0423 - classification_loss: 0.1897 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2335 - regression_loss: 1.0434 - classification_loss: 0.1901 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2320 - regression_loss: 1.0423 - classification_loss: 0.1897 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2332 - regression_loss: 1.0434 - classification_loss: 0.1898 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2303 - regression_loss: 1.0412 - classification_loss: 0.1891 217/500 [============>.................] 
- ETA: 1:10 - loss: 1.2296 - regression_loss: 1.0407 - classification_loss: 0.1888 218/500 [============>.................] - ETA: 1:10 - loss: 1.2297 - regression_loss: 1.0409 - classification_loss: 0.1888 219/500 [============>.................] - ETA: 1:10 - loss: 1.2309 - regression_loss: 1.0420 - classification_loss: 0.1889 220/500 [============>.................] - ETA: 1:10 - loss: 1.2320 - regression_loss: 1.0429 - classification_loss: 0.1891 221/500 [============>.................] - ETA: 1:09 - loss: 1.2332 - regression_loss: 1.0438 - classification_loss: 0.1893 222/500 [============>.................] - ETA: 1:09 - loss: 1.2347 - regression_loss: 1.0451 - classification_loss: 0.1896 223/500 [============>.................] - ETA: 1:09 - loss: 1.2361 - regression_loss: 1.0463 - classification_loss: 0.1898 224/500 [============>.................] - ETA: 1:09 - loss: 1.2346 - regression_loss: 1.0453 - classification_loss: 0.1893 225/500 [============>.................] - ETA: 1:08 - loss: 1.2359 - regression_loss: 1.0461 - classification_loss: 0.1898 226/500 [============>.................] - ETA: 1:08 - loss: 1.2341 - regression_loss: 1.0448 - classification_loss: 0.1893 227/500 [============>.................] - ETA: 1:08 - loss: 1.2346 - regression_loss: 1.0453 - classification_loss: 0.1893 228/500 [============>.................] - ETA: 1:08 - loss: 1.2315 - regression_loss: 1.0426 - classification_loss: 0.1889 229/500 [============>.................] - ETA: 1:07 - loss: 1.2301 - regression_loss: 1.0411 - classification_loss: 0.1890 230/500 [============>.................] - ETA: 1:07 - loss: 1.2299 - regression_loss: 1.0411 - classification_loss: 0.1888 231/500 [============>.................] - ETA: 1:07 - loss: 1.2281 - regression_loss: 1.0397 - classification_loss: 0.1884 232/500 [============>.................] - ETA: 1:07 - loss: 1.2289 - regression_loss: 1.0404 - classification_loss: 0.1885 233/500 [============>.................] 
- ETA: 1:06 - loss: 1.2272 - regression_loss: 1.0388 - classification_loss: 0.1883 234/500 [=============>................] - ETA: 1:06 - loss: 1.2290 - regression_loss: 1.0402 - classification_loss: 0.1888 235/500 [=============>................] - ETA: 1:06 - loss: 1.2268 - regression_loss: 1.0386 - classification_loss: 0.1882 236/500 [=============>................] - ETA: 1:06 - loss: 1.2276 - regression_loss: 1.0395 - classification_loss: 0.1880 237/500 [=============>................] - ETA: 1:05 - loss: 1.2299 - regression_loss: 1.0418 - classification_loss: 0.1881 238/500 [=============>................] - ETA: 1:05 - loss: 1.2295 - regression_loss: 1.0416 - classification_loss: 0.1879 239/500 [=============>................] - ETA: 1:05 - loss: 1.2276 - regression_loss: 1.0399 - classification_loss: 0.1877 240/500 [=============>................] - ETA: 1:05 - loss: 1.2296 - regression_loss: 1.0417 - classification_loss: 0.1879 241/500 [=============>................] - ETA: 1:04 - loss: 1.2270 - regression_loss: 1.0397 - classification_loss: 0.1873 242/500 [=============>................] - ETA: 1:04 - loss: 1.2299 - regression_loss: 1.0421 - classification_loss: 0.1878 243/500 [=============>................] - ETA: 1:04 - loss: 1.2307 - regression_loss: 1.0430 - classification_loss: 0.1877 244/500 [=============>................] - ETA: 1:04 - loss: 1.2314 - regression_loss: 1.0437 - classification_loss: 0.1877 245/500 [=============>................] - ETA: 1:03 - loss: 1.2294 - regression_loss: 1.0423 - classification_loss: 0.1871 246/500 [=============>................] - ETA: 1:03 - loss: 1.2347 - regression_loss: 1.0467 - classification_loss: 0.1880 247/500 [=============>................] - ETA: 1:03 - loss: 1.2366 - regression_loss: 1.0483 - classification_loss: 0.1884 248/500 [=============>................] - ETA: 1:03 - loss: 1.2367 - regression_loss: 1.0483 - classification_loss: 0.1883 249/500 [=============>................] 
- ETA: 1:02 - loss: 1.2352 - regression_loss: 1.0471 - classification_loss: 0.1882 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2350 - regression_loss: 1.0467 - classification_loss: 0.1884 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2360 - regression_loss: 1.0476 - classification_loss: 0.1884 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2340 - regression_loss: 1.0459 - classification_loss: 0.1881 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2358 - regression_loss: 1.0476 - classification_loss: 0.1882 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2382 - regression_loss: 1.0496 - classification_loss: 0.1885 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2385 - regression_loss: 1.0498 - classification_loss: 0.1886 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2386 - regression_loss: 1.0500 - classification_loss: 0.1886 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2379 - regression_loss: 1.0496 - classification_loss: 0.1883 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2401 - regression_loss: 1.0514 - classification_loss: 0.1887 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2418 - regression_loss: 1.0525 - classification_loss: 0.1894 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2399 - regression_loss: 1.0507 - classification_loss: 0.1892 261/500 [==============>...............] - ETA: 59s - loss: 1.2406 - regression_loss: 1.0512 - classification_loss: 0.1894  262/500 [==============>...............] - ETA: 59s - loss: 1.2413 - regression_loss: 1.0518 - classification_loss: 0.1895 263/500 [==============>...............] - ETA: 59s - loss: 1.2421 - regression_loss: 1.0525 - classification_loss: 0.1896 264/500 [==============>...............] - ETA: 59s - loss: 1.2409 - regression_loss: 1.0514 - classification_loss: 0.1895 265/500 [==============>...............] 
- ETA: 58s - loss: 1.2423 - regression_loss: 1.0526 - classification_loss: 0.1897 266/500 [==============>...............] - ETA: 58s - loss: 1.2414 - regression_loss: 1.0518 - classification_loss: 0.1896 267/500 [===============>..............] - ETA: 58s - loss: 1.2402 - regression_loss: 1.0509 - classification_loss: 0.1893 268/500 [===============>..............] - ETA: 58s - loss: 1.2410 - regression_loss: 1.0516 - classification_loss: 0.1894 269/500 [===============>..............] - ETA: 57s - loss: 1.2407 - regression_loss: 1.0513 - classification_loss: 0.1894 270/500 [===============>..............] - ETA: 57s - loss: 1.2470 - regression_loss: 1.0564 - classification_loss: 0.1905 271/500 [===============>..............] - ETA: 57s - loss: 1.2477 - regression_loss: 1.0572 - classification_loss: 0.1905 272/500 [===============>..............] - ETA: 57s - loss: 1.2473 - regression_loss: 1.0569 - classification_loss: 0.1904 273/500 [===============>..............] - ETA: 56s - loss: 1.2471 - regression_loss: 1.0567 - classification_loss: 0.1904 274/500 [===============>..............] - ETA: 56s - loss: 1.2433 - regression_loss: 1.0535 - classification_loss: 0.1899 275/500 [===============>..............] - ETA: 56s - loss: 1.2434 - regression_loss: 1.0538 - classification_loss: 0.1896 276/500 [===============>..............] - ETA: 56s - loss: 1.2415 - regression_loss: 1.0521 - classification_loss: 0.1894 277/500 [===============>..............] - ETA: 55s - loss: 1.2407 - regression_loss: 1.0516 - classification_loss: 0.1891 278/500 [===============>..............] - ETA: 55s - loss: 1.2412 - regression_loss: 1.0520 - classification_loss: 0.1892 279/500 [===============>..............] - ETA: 55s - loss: 1.2420 - regression_loss: 1.0528 - classification_loss: 0.1892 280/500 [===============>..............] - ETA: 55s - loss: 1.2412 - regression_loss: 1.0520 - classification_loss: 0.1892 281/500 [===============>..............] 
[per-batch progress output for Epoch 131 (batches 282-499) elided; loss hovered around 1.22-1.24]
500/500 [==============================] - 125s 250ms/step - loss: 1.2393 - regression_loss: 1.0510 - classification_loss: 0.1883
1172 instances of class plum with average precision: 0.7013
mAP: 0.7013
Epoch 00131: saving model to ./training/snapshots/resnet50_pascal_131.h5
Epoch 132/150
[per-batch progress output for Epoch 132 (batches 1-116) elided; loss hovered around 1.20-1.26]
- ETA: 1:36 - loss: 1.2572 - regression_loss: 1.0677 - classification_loss: 0.1896 117/500 [======>.......................] - ETA: 1:36 - loss: 1.2525 - regression_loss: 1.0641 - classification_loss: 0.1885 118/500 [======>.......................] - ETA: 1:35 - loss: 1.2487 - regression_loss: 1.0612 - classification_loss: 0.1875 119/500 [======>.......................] - ETA: 1:35 - loss: 1.2455 - regression_loss: 1.0585 - classification_loss: 0.1871 120/500 [======>.......................] - ETA: 1:35 - loss: 1.2427 - regression_loss: 1.0563 - classification_loss: 0.1864 121/500 [======>.......................] - ETA: 1:35 - loss: 1.2446 - regression_loss: 1.0578 - classification_loss: 0.1868 122/500 [======>.......................] - ETA: 1:34 - loss: 1.2437 - regression_loss: 1.0571 - classification_loss: 0.1865 123/500 [======>.......................] - ETA: 1:34 - loss: 1.2463 - regression_loss: 1.0595 - classification_loss: 0.1868 124/500 [======>.......................] - ETA: 1:34 - loss: 1.2470 - regression_loss: 1.0600 - classification_loss: 0.1869 125/500 [======>.......................] - ETA: 1:34 - loss: 1.2453 - regression_loss: 1.0588 - classification_loss: 0.1865 126/500 [======>.......................] - ETA: 1:33 - loss: 1.2467 - regression_loss: 1.0600 - classification_loss: 0.1867 127/500 [======>.......................] - ETA: 1:33 - loss: 1.2385 - regression_loss: 1.0531 - classification_loss: 0.1854 128/500 [======>.......................] - ETA: 1:33 - loss: 1.2375 - regression_loss: 1.0527 - classification_loss: 0.1847 129/500 [======>.......................] - ETA: 1:32 - loss: 1.2317 - regression_loss: 1.0479 - classification_loss: 0.1838 130/500 [======>.......................] - ETA: 1:32 - loss: 1.2292 - regression_loss: 1.0461 - classification_loss: 0.1831 131/500 [======>.......................] - ETA: 1:32 - loss: 1.2333 - regression_loss: 1.0496 - classification_loss: 0.1837 132/500 [======>.......................] 
- ETA: 1:32 - loss: 1.2370 - regression_loss: 1.0529 - classification_loss: 0.1841 133/500 [======>.......................] - ETA: 1:32 - loss: 1.2366 - regression_loss: 1.0531 - classification_loss: 0.1835 134/500 [=======>......................] - ETA: 1:31 - loss: 1.2370 - regression_loss: 1.0534 - classification_loss: 0.1836 135/500 [=======>......................] - ETA: 1:31 - loss: 1.2402 - regression_loss: 1.0559 - classification_loss: 0.1843 136/500 [=======>......................] - ETA: 1:31 - loss: 1.2432 - regression_loss: 1.0581 - classification_loss: 0.1851 137/500 [=======>......................] - ETA: 1:31 - loss: 1.2375 - regression_loss: 1.0532 - classification_loss: 0.1842 138/500 [=======>......................] - ETA: 1:30 - loss: 1.2393 - regression_loss: 1.0547 - classification_loss: 0.1846 139/500 [=======>......................] - ETA: 1:30 - loss: 1.2377 - regression_loss: 1.0535 - classification_loss: 0.1842 140/500 [=======>......................] - ETA: 1:30 - loss: 1.2415 - regression_loss: 1.0552 - classification_loss: 0.1863 141/500 [=======>......................] - ETA: 1:30 - loss: 1.2383 - regression_loss: 1.0525 - classification_loss: 0.1858 142/500 [=======>......................] - ETA: 1:29 - loss: 1.2336 - regression_loss: 1.0488 - classification_loss: 0.1849 143/500 [=======>......................] - ETA: 1:29 - loss: 1.2332 - regression_loss: 1.0479 - classification_loss: 0.1853 144/500 [=======>......................] - ETA: 1:29 - loss: 1.2290 - regression_loss: 1.0445 - classification_loss: 0.1844 145/500 [=======>......................] - ETA: 1:29 - loss: 1.2296 - regression_loss: 1.0449 - classification_loss: 0.1846 146/500 [=======>......................] - ETA: 1:28 - loss: 1.2278 - regression_loss: 1.0422 - classification_loss: 0.1857 147/500 [=======>......................] - ETA: 1:28 - loss: 1.2305 - regression_loss: 1.0442 - classification_loss: 0.1863 148/500 [=======>......................] 
- ETA: 1:28 - loss: 1.2347 - regression_loss: 1.0477 - classification_loss: 0.1870 149/500 [=======>......................] - ETA: 1:27 - loss: 1.2348 - regression_loss: 1.0482 - classification_loss: 0.1866 150/500 [========>.....................] - ETA: 1:27 - loss: 1.2349 - regression_loss: 1.0482 - classification_loss: 0.1867 151/500 [========>.....................] - ETA: 1:27 - loss: 1.2381 - regression_loss: 1.0509 - classification_loss: 0.1872 152/500 [========>.....................] - ETA: 1:26 - loss: 1.2393 - regression_loss: 1.0516 - classification_loss: 0.1876 153/500 [========>.....................] - ETA: 1:26 - loss: 1.2417 - regression_loss: 1.0537 - classification_loss: 0.1879 154/500 [========>.....................] - ETA: 1:26 - loss: 1.2394 - regression_loss: 1.0518 - classification_loss: 0.1876 155/500 [========>.....................] - ETA: 1:26 - loss: 1.2398 - regression_loss: 1.0519 - classification_loss: 0.1879 156/500 [========>.....................] - ETA: 1:25 - loss: 1.2391 - regression_loss: 1.0516 - classification_loss: 0.1875 157/500 [========>.....................] - ETA: 1:25 - loss: 1.2393 - regression_loss: 1.0517 - classification_loss: 0.1876 158/500 [========>.....................] - ETA: 1:25 - loss: 1.2405 - regression_loss: 1.0527 - classification_loss: 0.1878 159/500 [========>.....................] - ETA: 1:25 - loss: 1.2409 - regression_loss: 1.0530 - classification_loss: 0.1879 160/500 [========>.....................] - ETA: 1:24 - loss: 1.2425 - regression_loss: 1.0545 - classification_loss: 0.1880 161/500 [========>.....................] - ETA: 1:24 - loss: 1.2489 - regression_loss: 1.0595 - classification_loss: 0.1893 162/500 [========>.....................] - ETA: 1:24 - loss: 1.2442 - regression_loss: 1.0556 - classification_loss: 0.1887 163/500 [========>.....................] - ETA: 1:24 - loss: 1.2453 - regression_loss: 1.0564 - classification_loss: 0.1889 164/500 [========>.....................] 
- ETA: 1:23 - loss: 1.2434 - regression_loss: 1.0552 - classification_loss: 0.1882 165/500 [========>.....................] - ETA: 1:23 - loss: 1.2417 - regression_loss: 1.0537 - classification_loss: 0.1879 166/500 [========>.....................] - ETA: 1:23 - loss: 1.2417 - regression_loss: 1.0539 - classification_loss: 0.1879 167/500 [=========>....................] - ETA: 1:23 - loss: 1.2445 - regression_loss: 1.0565 - classification_loss: 0.1880 168/500 [=========>....................] - ETA: 1:22 - loss: 1.2445 - regression_loss: 1.0565 - classification_loss: 0.1880 169/500 [=========>....................] - ETA: 1:22 - loss: 1.2405 - regression_loss: 1.0530 - classification_loss: 0.1875 170/500 [=========>....................] - ETA: 1:22 - loss: 1.2406 - regression_loss: 1.0530 - classification_loss: 0.1876 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2394 - regression_loss: 1.0521 - classification_loss: 0.1873 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2416 - regression_loss: 1.0537 - classification_loss: 0.1879 173/500 [=========>....................] - ETA: 1:21 - loss: 1.2421 - regression_loss: 1.0538 - classification_loss: 0.1883 174/500 [=========>....................] - ETA: 1:21 - loss: 1.2423 - regression_loss: 1.0537 - classification_loss: 0.1885 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2431 - regression_loss: 1.0543 - classification_loss: 0.1887 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2428 - regression_loss: 1.0540 - classification_loss: 0.1888 177/500 [=========>....................] - ETA: 1:20 - loss: 1.2438 - regression_loss: 1.0549 - classification_loss: 0.1889 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2427 - regression_loss: 1.0545 - classification_loss: 0.1882 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2434 - regression_loss: 1.0552 - classification_loss: 0.1883 180/500 [=========>....................] 
- ETA: 1:20 - loss: 1.2462 - regression_loss: 1.0567 - classification_loss: 0.1896 181/500 [=========>....................] - ETA: 1:19 - loss: 1.2459 - regression_loss: 1.0564 - classification_loss: 0.1895 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2443 - regression_loss: 1.0550 - classification_loss: 0.1892 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2448 - regression_loss: 1.0555 - classification_loss: 0.1892 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2438 - regression_loss: 1.0547 - classification_loss: 0.1891 185/500 [==========>...................] - ETA: 1:18 - loss: 1.2401 - regression_loss: 1.0514 - classification_loss: 0.1886 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2388 - regression_loss: 1.0505 - classification_loss: 0.1883 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2349 - regression_loss: 1.0475 - classification_loss: 0.1875 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2354 - regression_loss: 1.0479 - classification_loss: 0.1876 189/500 [==========>...................] - ETA: 1:17 - loss: 1.2368 - regression_loss: 1.0490 - classification_loss: 0.1878 190/500 [==========>...................] - ETA: 1:17 - loss: 1.2370 - regression_loss: 1.0492 - classification_loss: 0.1878 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2342 - regression_loss: 1.0469 - classification_loss: 0.1873 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2339 - regression_loss: 1.0467 - classification_loss: 0.1871 193/500 [==========>...................] - ETA: 1:16 - loss: 1.2334 - regression_loss: 1.0462 - classification_loss: 0.1871 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2306 - regression_loss: 1.0441 - classification_loss: 0.1865 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2308 - regression_loss: 1.0440 - classification_loss: 0.1868 196/500 [==========>...................] 
- ETA: 1:16 - loss: 1.2274 - regression_loss: 1.0412 - classification_loss: 0.1861 197/500 [==========>...................] - ETA: 1:15 - loss: 1.2229 - regression_loss: 1.0374 - classification_loss: 0.1856 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2210 - regression_loss: 1.0357 - classification_loss: 0.1854 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2195 - regression_loss: 1.0343 - classification_loss: 0.1852 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2188 - regression_loss: 1.0337 - classification_loss: 0.1851 201/500 [===========>..................] - ETA: 1:14 - loss: 1.2213 - regression_loss: 1.0359 - classification_loss: 0.1854 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2228 - regression_loss: 1.0371 - classification_loss: 0.1857 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2239 - regression_loss: 1.0379 - classification_loss: 0.1859 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2265 - regression_loss: 1.0397 - classification_loss: 0.1868 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2282 - regression_loss: 1.0412 - classification_loss: 0.1870 206/500 [===========>..................] - ETA: 1:13 - loss: 1.2307 - regression_loss: 1.0432 - classification_loss: 0.1875 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2332 - regression_loss: 1.0451 - classification_loss: 0.1881 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2287 - regression_loss: 1.0414 - classification_loss: 0.1873 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2300 - regression_loss: 1.0424 - classification_loss: 0.1876 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2311 - regression_loss: 1.0433 - classification_loss: 0.1878 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2327 - regression_loss: 1.0446 - classification_loss: 0.1881 212/500 [===========>..................] 
- ETA: 1:12 - loss: 1.2310 - regression_loss: 1.0431 - classification_loss: 0.1879 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2326 - regression_loss: 1.0445 - classification_loss: 0.1882 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2311 - regression_loss: 1.0433 - classification_loss: 0.1879 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2325 - regression_loss: 1.0444 - classification_loss: 0.1881 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2329 - regression_loss: 1.0447 - classification_loss: 0.1883 217/500 [============>.................] - ETA: 1:10 - loss: 1.2358 - regression_loss: 1.0470 - classification_loss: 0.1887 218/500 [============>.................] - ETA: 1:10 - loss: 1.2379 - regression_loss: 1.0487 - classification_loss: 0.1892 219/500 [============>.................] - ETA: 1:10 - loss: 1.2345 - regression_loss: 1.0459 - classification_loss: 0.1885 220/500 [============>.................] - ETA: 1:10 - loss: 1.2331 - regression_loss: 1.0448 - classification_loss: 0.1883 221/500 [============>.................] - ETA: 1:09 - loss: 1.2298 - regression_loss: 1.0418 - classification_loss: 0.1881 222/500 [============>.................] - ETA: 1:09 - loss: 1.2309 - regression_loss: 1.0426 - classification_loss: 0.1883 223/500 [============>.................] - ETA: 1:09 - loss: 1.2324 - regression_loss: 1.0444 - classification_loss: 0.1880 224/500 [============>.................] - ETA: 1:09 - loss: 1.2314 - regression_loss: 1.0434 - classification_loss: 0.1880 225/500 [============>.................] - ETA: 1:08 - loss: 1.2336 - regression_loss: 1.0452 - classification_loss: 0.1884 226/500 [============>.................] - ETA: 1:08 - loss: 1.2371 - regression_loss: 1.0481 - classification_loss: 0.1890 227/500 [============>.................] - ETA: 1:08 - loss: 1.2384 - regression_loss: 1.0492 - classification_loss: 0.1892 228/500 [============>.................] 
- ETA: 1:08 - loss: 1.2400 - regression_loss: 1.0506 - classification_loss: 0.1894 229/500 [============>.................] - ETA: 1:07 - loss: 1.2373 - regression_loss: 1.0484 - classification_loss: 0.1889 230/500 [============>.................] - ETA: 1:07 - loss: 1.2336 - regression_loss: 1.0454 - classification_loss: 0.1882 231/500 [============>.................] - ETA: 1:07 - loss: 1.2333 - regression_loss: 1.0451 - classification_loss: 0.1881 232/500 [============>.................] - ETA: 1:07 - loss: 1.2345 - regression_loss: 1.0463 - classification_loss: 0.1882 233/500 [============>.................] - ETA: 1:06 - loss: 1.2323 - regression_loss: 1.0447 - classification_loss: 0.1876 234/500 [=============>................] - ETA: 1:06 - loss: 1.2333 - regression_loss: 1.0457 - classification_loss: 0.1876 235/500 [=============>................] - ETA: 1:06 - loss: 1.2343 - regression_loss: 1.0466 - classification_loss: 0.1877 236/500 [=============>................] - ETA: 1:06 - loss: 1.2329 - regression_loss: 1.0454 - classification_loss: 0.1875 237/500 [=============>................] - ETA: 1:05 - loss: 1.2317 - regression_loss: 1.0444 - classification_loss: 0.1872 238/500 [=============>................] - ETA: 1:05 - loss: 1.2309 - regression_loss: 1.0436 - classification_loss: 0.1873 239/500 [=============>................] - ETA: 1:05 - loss: 1.2305 - regression_loss: 1.0433 - classification_loss: 0.1872 240/500 [=============>................] - ETA: 1:05 - loss: 1.2324 - regression_loss: 1.0450 - classification_loss: 0.1874 241/500 [=============>................] - ETA: 1:04 - loss: 1.2319 - regression_loss: 1.0447 - classification_loss: 0.1872 242/500 [=============>................] - ETA: 1:04 - loss: 1.2326 - regression_loss: 1.0453 - classification_loss: 0.1873 243/500 [=============>................] - ETA: 1:04 - loss: 1.2324 - regression_loss: 1.0452 - classification_loss: 0.1872 244/500 [=============>................] 
- ETA: 1:04 - loss: 1.2324 - regression_loss: 1.0452 - classification_loss: 0.1871 245/500 [=============>................] - ETA: 1:03 - loss: 1.2322 - regression_loss: 1.0452 - classification_loss: 0.1870 246/500 [=============>................] - ETA: 1:03 - loss: 1.2309 - regression_loss: 1.0442 - classification_loss: 0.1867 247/500 [=============>................] - ETA: 1:03 - loss: 1.2290 - regression_loss: 1.0425 - classification_loss: 0.1865 248/500 [=============>................] - ETA: 1:03 - loss: 1.2298 - regression_loss: 1.0431 - classification_loss: 0.1866 249/500 [=============>................] - ETA: 1:02 - loss: 1.2309 - regression_loss: 1.0441 - classification_loss: 0.1868 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2305 - regression_loss: 1.0436 - classification_loss: 0.1869 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2269 - regression_loss: 1.0406 - classification_loss: 0.1863 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2271 - regression_loss: 1.0408 - classification_loss: 0.1862 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2274 - regression_loss: 1.0406 - classification_loss: 0.1867 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2296 - regression_loss: 1.0424 - classification_loss: 0.1872 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2284 - regression_loss: 1.0414 - classification_loss: 0.1870 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2291 - regression_loss: 1.0422 - classification_loss: 0.1869 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2306 - regression_loss: 1.0432 - classification_loss: 0.1874 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2320 - regression_loss: 1.0441 - classification_loss: 0.1880 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2324 - regression_loss: 1.0445 - classification_loss: 0.1879 260/500 [==============>...............] 
- ETA: 1:00 - loss: 1.2335 - regression_loss: 1.0454 - classification_loss: 0.1881 261/500 [==============>...............] - ETA: 59s - loss: 1.2339 - regression_loss: 1.0458 - classification_loss: 0.1881  262/500 [==============>...............] - ETA: 59s - loss: 1.2352 - regression_loss: 1.0472 - classification_loss: 0.1881 263/500 [==============>...............] - ETA: 59s - loss: 1.2360 - regression_loss: 1.0479 - classification_loss: 0.1881 264/500 [==============>...............] - ETA: 59s - loss: 1.2385 - regression_loss: 1.0498 - classification_loss: 0.1887 265/500 [==============>...............] - ETA: 58s - loss: 1.2409 - regression_loss: 1.0517 - classification_loss: 0.1892 266/500 [==============>...............] - ETA: 58s - loss: 1.2429 - regression_loss: 1.0533 - classification_loss: 0.1896 267/500 [===============>..............] - ETA: 58s - loss: 1.2425 - regression_loss: 1.0529 - classification_loss: 0.1896 268/500 [===============>..............] - ETA: 58s - loss: 1.2429 - regression_loss: 1.0533 - classification_loss: 0.1896 269/500 [===============>..............] - ETA: 57s - loss: 1.2419 - regression_loss: 1.0526 - classification_loss: 0.1893 270/500 [===============>..............] - ETA: 57s - loss: 1.2414 - regression_loss: 1.0522 - classification_loss: 0.1893 271/500 [===============>..............] - ETA: 57s - loss: 1.2417 - regression_loss: 1.0524 - classification_loss: 0.1893 272/500 [===============>..............] - ETA: 57s - loss: 1.2430 - regression_loss: 1.0534 - classification_loss: 0.1896 273/500 [===============>..............] - ETA: 56s - loss: 1.2419 - regression_loss: 1.0527 - classification_loss: 0.1892 274/500 [===============>..............] - ETA: 56s - loss: 1.2406 - regression_loss: 1.0519 - classification_loss: 0.1887 275/500 [===============>..............] - ETA: 56s - loss: 1.2381 - regression_loss: 1.0497 - classification_loss: 0.1883 276/500 [===============>..............] 
- ETA: 56s - loss: 1.2400 - regression_loss: 1.0515 - classification_loss: 0.1885 277/500 [===============>..............] - ETA: 55s - loss: 1.2408 - regression_loss: 1.0523 - classification_loss: 0.1885 278/500 [===============>..............] - ETA: 55s - loss: 1.2417 - regression_loss: 1.0529 - classification_loss: 0.1887 279/500 [===============>..............] - ETA: 55s - loss: 1.2422 - regression_loss: 1.0530 - classification_loss: 0.1892 280/500 [===============>..............] - ETA: 55s - loss: 1.2410 - regression_loss: 1.0522 - classification_loss: 0.1888 281/500 [===============>..............] - ETA: 54s - loss: 1.2415 - regression_loss: 1.0525 - classification_loss: 0.1890 282/500 [===============>..............] - ETA: 54s - loss: 1.2424 - regression_loss: 1.0535 - classification_loss: 0.1889 283/500 [===============>..............] - ETA: 54s - loss: 1.2441 - regression_loss: 1.0549 - classification_loss: 0.1892 284/500 [================>.............] - ETA: 54s - loss: 1.2453 - regression_loss: 1.0558 - classification_loss: 0.1894 285/500 [================>.............] - ETA: 53s - loss: 1.2442 - regression_loss: 1.0550 - classification_loss: 0.1892 286/500 [================>.............] - ETA: 53s - loss: 1.2436 - regression_loss: 1.0545 - classification_loss: 0.1891 287/500 [================>.............] - ETA: 53s - loss: 1.2431 - regression_loss: 1.0542 - classification_loss: 0.1889 288/500 [================>.............] - ETA: 53s - loss: 1.2452 - regression_loss: 1.0557 - classification_loss: 0.1895 289/500 [================>.............] - ETA: 52s - loss: 1.2453 - regression_loss: 1.0558 - classification_loss: 0.1895 290/500 [================>.............] - ETA: 52s - loss: 1.2439 - regression_loss: 1.0548 - classification_loss: 0.1890 291/500 [================>.............] - ETA: 52s - loss: 1.2446 - regression_loss: 1.0552 - classification_loss: 0.1894 292/500 [================>.............] 
- ETA: 52s - loss: 1.2451 - regression_loss: 1.0558 - classification_loss: 0.1893 293/500 [================>.............] - ETA: 51s - loss: 1.2463 - regression_loss: 1.0569 - classification_loss: 0.1894 294/500 [================>.............] - ETA: 51s - loss: 1.2490 - regression_loss: 1.0594 - classification_loss: 0.1896 295/500 [================>.............] - ETA: 51s - loss: 1.2485 - regression_loss: 1.0591 - classification_loss: 0.1894 296/500 [================>.............] - ETA: 51s - loss: 1.2464 - regression_loss: 1.0573 - classification_loss: 0.1890 297/500 [================>.............] - ETA: 50s - loss: 1.2463 - regression_loss: 1.0575 - classification_loss: 0.1889 298/500 [================>.............] - ETA: 50s - loss: 1.2466 - regression_loss: 1.0573 - classification_loss: 0.1894 299/500 [================>.............] - ETA: 50s - loss: 1.2515 - regression_loss: 1.0612 - classification_loss: 0.1903 300/500 [=================>............] - ETA: 50s - loss: 1.2526 - regression_loss: 1.0622 - classification_loss: 0.1904 301/500 [=================>............] - ETA: 49s - loss: 1.2504 - regression_loss: 1.0604 - classification_loss: 0.1900 302/500 [=================>............] - ETA: 49s - loss: 1.2513 - regression_loss: 1.0611 - classification_loss: 0.1902 303/500 [=================>............] - ETA: 49s - loss: 1.2512 - regression_loss: 1.0610 - classification_loss: 0.1902 304/500 [=================>............] - ETA: 49s - loss: 1.2491 - regression_loss: 1.0593 - classification_loss: 0.1898 305/500 [=================>............] - ETA: 48s - loss: 1.2496 - regression_loss: 1.0598 - classification_loss: 0.1899 306/500 [=================>............] - ETA: 48s - loss: 1.2498 - regression_loss: 1.0598 - classification_loss: 0.1900 307/500 [=================>............] - ETA: 48s - loss: 1.2510 - regression_loss: 1.0608 - classification_loss: 0.1902 308/500 [=================>............] 
- ETA: 48s - loss: 1.2484 - regression_loss: 1.0587 - classification_loss: 0.1897 309/500 [=================>............] - ETA: 47s - loss: 1.2505 - regression_loss: 1.0603 - classification_loss: 0.1902 310/500 [=================>............] - ETA: 47s - loss: 1.2485 - regression_loss: 1.0587 - classification_loss: 0.1898 311/500 [=================>............] - ETA: 47s - loss: 1.2493 - regression_loss: 1.0593 - classification_loss: 0.1899 312/500 [=================>............] - ETA: 47s - loss: 1.2503 - regression_loss: 1.0602 - classification_loss: 0.1901 313/500 [=================>............] - ETA: 46s - loss: 1.2499 - regression_loss: 1.0598 - classification_loss: 0.1901 314/500 [=================>............] - ETA: 46s - loss: 1.2485 - regression_loss: 1.0587 - classification_loss: 0.1898 315/500 [=================>............] - ETA: 46s - loss: 1.2458 - regression_loss: 1.0564 - classification_loss: 0.1893 316/500 [=================>............] - ETA: 46s - loss: 1.2448 - regression_loss: 1.0558 - classification_loss: 0.1890 317/500 [==================>...........] - ETA: 45s - loss: 1.2428 - regression_loss: 1.0542 - classification_loss: 0.1886 318/500 [==================>...........] - ETA: 45s - loss: 1.2440 - regression_loss: 1.0551 - classification_loss: 0.1889 319/500 [==================>...........] - ETA: 45s - loss: 1.2431 - regression_loss: 1.0544 - classification_loss: 0.1887 320/500 [==================>...........] - ETA: 45s - loss: 1.2430 - regression_loss: 1.0541 - classification_loss: 0.1889 321/500 [==================>...........] - ETA: 44s - loss: 1.2432 - regression_loss: 1.0544 - classification_loss: 0.1889 322/500 [==================>...........] - ETA: 44s - loss: 1.2439 - regression_loss: 1.0550 - classification_loss: 0.1889 323/500 [==================>...........] - ETA: 44s - loss: 1.2434 - regression_loss: 1.0547 - classification_loss: 0.1888 324/500 [==================>...........] 
[... per-batch progress for steps 325/500 through 499/500 trimmed; the running loss held steady in roughly the 1.23-1.25 range (regression_loss ~1.05, classification_loss ~0.19) as the ETA counted down from 44s to 0s ...]
500/500 [==============================] - 125s 250ms/step - loss: 1.2330 - regression_loss: 1.0460 - classification_loss: 0.1870
1172 instances of class plum with average precision: 0.6983
mAP: 0.6983
Epoch 00132: saving model to ./training/snapshots/resnet50_pascal_132.h5
Epoch 00132: ReduceLROnPlateau reducing learning rate to 1.0000000116860975e-08.
Epoch 133/150
[... per-batch progress for steps 1/500 through 12/500 trimmed; the loss settled near 1.35 after a noisy first few batches ...]
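The "saving model" and "reducing learning rate" lines above are emitted by Keras callbacks (ModelCheckpoint and ReduceLROnPlateau) at the end of each epoch. As a minimal sketch of how the patience-based reduction works: the callback tracks the best monitored value, counts epochs without improvement, and multiplies the learning rate by `factor` once the count reaches `patience`. This is an illustrative reimplementation, not the actual `keras.callbacks.ReduceLROnPlateau` class; the class name, default values, and message format here are assumptions for demonstration. (The long decimal tail in the logged rate, 1.0000000116860975e-08, is consistent with the learning rate being stored in float32 inside the optimizer rather than as an exact 1e-8.)

```python
class ReduceLROnPlateauSketch:
    """Illustrative sketch of patience-based learning-rate reduction.

    Hypothetical stand-in for keras.callbacks.ReduceLROnPlateau; the
    parameter names mirror the Keras callback, but this is not its code.
    """

    def __init__(self, factor=0.1, patience=2, min_lr=0.0, initial_lr=1e-5):
        self.factor = factor      # multiply LR by this when the metric stalls
        self.patience = patience  # epochs with no improvement before reducing
        self.min_lr = min_lr      # never reduce below this floor
        self.lr = initial_lr
        self.best = float("inf")  # best (lowest) monitored loss seen so far
        self.wait = 0             # consecutive epochs without improvement

    def on_epoch_end(self, epoch, monitored_loss):
        """Update state after an epoch; return the (possibly reduced) LR."""
        if monitored_loss < self.best:
            self.best = monitored_loss
            self.wait = 0
        else:
            self.wait += 1
            if self.wait >= self.patience:
                new_lr = max(self.lr * self.factor, self.min_lr)
                if new_lr < self.lr:
                    self.lr = new_lr
                    print(f"Epoch {epoch:05d}: reducing learning rate to {self.lr}.")
                self.wait = 0
        return self.lr


# Usage: two stalled epochs after the best one trigger a reduction.
sched = ReduceLROnPlateauSketch(factor=0.1, patience=2, initial_lr=1e-5)
sched.on_epoch_end(1, 1.00)  # improvement: LR unchanged
sched.on_epoch_end(2, 1.10)  # no improvement: wait = 1
sched.on_epoch_end(3, 1.10)  # no improvement: wait = 2 -> LR reduced to ~1e-6
```

The repeated reductions visible in a long training run compound multiplicatively, which is how a run that starts around 1e-5 ends up at the 1e-8 floor seen in this log.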
[... per-batch progress for steps 13/500 through 156/500 trimmed; the running loss fluctuated between roughly 1.18 and 1.27 as the ETA counted down from 2:01 to 1:26 ...]
- ETA: 1:26 - loss: 1.2497 - regression_loss: 1.0617 - classification_loss: 0.1880 158/500 [========>.....................] - ETA: 1:26 - loss: 1.2486 - regression_loss: 1.0607 - classification_loss: 0.1879 159/500 [========>.....................] - ETA: 1:26 - loss: 1.2490 - regression_loss: 1.0610 - classification_loss: 0.1879 160/500 [========>.....................] - ETA: 1:25 - loss: 1.2463 - regression_loss: 1.0589 - classification_loss: 0.1874 161/500 [========>.....................] - ETA: 1:25 - loss: 1.2465 - regression_loss: 1.0592 - classification_loss: 0.1872 162/500 [========>.....................] - ETA: 1:25 - loss: 1.2432 - regression_loss: 1.0565 - classification_loss: 0.1867 163/500 [========>.....................] - ETA: 1:25 - loss: 1.2442 - regression_loss: 1.0574 - classification_loss: 0.1868 164/500 [========>.....................] - ETA: 1:24 - loss: 1.2400 - regression_loss: 1.0538 - classification_loss: 0.1862 165/500 [========>.....................] - ETA: 1:24 - loss: 1.2425 - regression_loss: 1.0556 - classification_loss: 0.1869 166/500 [========>.....................] - ETA: 1:24 - loss: 1.2427 - regression_loss: 1.0558 - classification_loss: 0.1870 167/500 [=========>....................] - ETA: 1:24 - loss: 1.2459 - regression_loss: 1.0583 - classification_loss: 0.1876 168/500 [=========>....................] - ETA: 1:23 - loss: 1.2458 - regression_loss: 1.0581 - classification_loss: 0.1877 169/500 [=========>....................] - ETA: 1:23 - loss: 1.2444 - regression_loss: 1.0570 - classification_loss: 0.1873 170/500 [=========>....................] - ETA: 1:23 - loss: 1.2398 - regression_loss: 1.0533 - classification_loss: 0.1865 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2413 - regression_loss: 1.0547 - classification_loss: 0.1867 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2430 - regression_loss: 1.0562 - classification_loss: 0.1868 173/500 [=========>....................] 
- ETA: 1:22 - loss: 1.2403 - regression_loss: 1.0541 - classification_loss: 0.1862 174/500 [=========>....................] - ETA: 1:22 - loss: 1.2414 - regression_loss: 1.0551 - classification_loss: 0.1863 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2370 - regression_loss: 1.0514 - classification_loss: 0.1856 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2374 - regression_loss: 1.0515 - classification_loss: 0.1859 177/500 [=========>....................] - ETA: 1:21 - loss: 1.2367 - regression_loss: 1.0508 - classification_loss: 0.1858 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2359 - regression_loss: 1.0504 - classification_loss: 0.1855 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2381 - regression_loss: 1.0524 - classification_loss: 0.1858 180/500 [=========>....................] - ETA: 1:20 - loss: 1.2370 - regression_loss: 1.0515 - classification_loss: 0.1856 181/500 [=========>....................] - ETA: 1:20 - loss: 1.2348 - regression_loss: 1.0496 - classification_loss: 0.1852 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2311 - regression_loss: 1.0467 - classification_loss: 0.1844 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2301 - regression_loss: 1.0460 - classification_loss: 0.1842 184/500 [==========>...................] - ETA: 1:19 - loss: 1.2306 - regression_loss: 1.0464 - classification_loss: 0.1842 185/500 [==========>...................] - ETA: 1:19 - loss: 1.2313 - regression_loss: 1.0469 - classification_loss: 0.1844 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2290 - regression_loss: 1.0451 - classification_loss: 0.1840 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2294 - regression_loss: 1.0453 - classification_loss: 0.1840 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2316 - regression_loss: 1.0473 - classification_loss: 0.1844 189/500 [==========>...................] 
- ETA: 1:18 - loss: 1.2321 - regression_loss: 1.0476 - classification_loss: 0.1845 190/500 [==========>...................] - ETA: 1:17 - loss: 1.2327 - regression_loss: 1.0481 - classification_loss: 0.1846 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2352 - regression_loss: 1.0503 - classification_loss: 0.1849 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2351 - regression_loss: 1.0504 - classification_loss: 0.1847 193/500 [==========>...................] - ETA: 1:17 - loss: 1.2310 - regression_loss: 1.0472 - classification_loss: 0.1838 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2320 - regression_loss: 1.0481 - classification_loss: 0.1840 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2304 - regression_loss: 1.0466 - classification_loss: 0.1839 196/500 [==========>...................] - ETA: 1:16 - loss: 1.2320 - regression_loss: 1.0478 - classification_loss: 0.1842 197/500 [==========>...................] - ETA: 1:16 - loss: 1.2335 - regression_loss: 1.0492 - classification_loss: 0.1844 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2351 - regression_loss: 1.0505 - classification_loss: 0.1846 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2368 - regression_loss: 1.0523 - classification_loss: 0.1845 200/500 [===========>..................] - ETA: 1:15 - loss: 1.2373 - regression_loss: 1.0527 - classification_loss: 0.1846 201/500 [===========>..................] - ETA: 1:15 - loss: 1.2329 - regression_loss: 1.0490 - classification_loss: 0.1839 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2302 - regression_loss: 1.0469 - classification_loss: 0.1833 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2336 - regression_loss: 1.0503 - classification_loss: 0.1833 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2329 - regression_loss: 1.0498 - classification_loss: 0.1832 205/500 [===========>..................] 
- ETA: 1:14 - loss: 1.2300 - regression_loss: 1.0470 - classification_loss: 0.1830 206/500 [===========>..................] - ETA: 1:13 - loss: 1.2300 - regression_loss: 1.0466 - classification_loss: 0.1834 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2309 - regression_loss: 1.0473 - classification_loss: 0.1836 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2324 - regression_loss: 1.0487 - classification_loss: 0.1837 209/500 [===========>..................] - ETA: 1:13 - loss: 1.2380 - regression_loss: 1.0532 - classification_loss: 0.1848 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2407 - regression_loss: 1.0555 - classification_loss: 0.1853 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2428 - regression_loss: 1.0572 - classification_loss: 0.1856 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2415 - regression_loss: 1.0562 - classification_loss: 0.1852 213/500 [===========>..................] - ETA: 1:12 - loss: 1.2375 - regression_loss: 1.0529 - classification_loss: 0.1846 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2349 - regression_loss: 1.0508 - classification_loss: 0.1841 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2374 - regression_loss: 1.0527 - classification_loss: 0.1848 216/500 [===========>..................] - ETA: 1:11 - loss: 1.2368 - regression_loss: 1.0522 - classification_loss: 0.1847 217/500 [============>.................] - ETA: 1:11 - loss: 1.2375 - regression_loss: 1.0528 - classification_loss: 0.1847 218/500 [============>.................] - ETA: 1:10 - loss: 1.2371 - regression_loss: 1.0524 - classification_loss: 0.1847 219/500 [============>.................] - ETA: 1:10 - loss: 1.2386 - regression_loss: 1.0537 - classification_loss: 0.1848 220/500 [============>.................] - ETA: 1:10 - loss: 1.2393 - regression_loss: 1.0545 - classification_loss: 0.1848 221/500 [============>.................] 
- ETA: 1:10 - loss: 1.2400 - regression_loss: 1.0550 - classification_loss: 0.1851 222/500 [============>.................] - ETA: 1:09 - loss: 1.2412 - regression_loss: 1.0558 - classification_loss: 0.1853 223/500 [============>.................] - ETA: 1:09 - loss: 1.2382 - regression_loss: 1.0535 - classification_loss: 0.1848 224/500 [============>.................] - ETA: 1:09 - loss: 1.2393 - regression_loss: 1.0543 - classification_loss: 0.1850 225/500 [============>.................] - ETA: 1:09 - loss: 1.2393 - regression_loss: 1.0547 - classification_loss: 0.1847 226/500 [============>.................] - ETA: 1:08 - loss: 1.2411 - regression_loss: 1.0562 - classification_loss: 0.1850 227/500 [============>.................] - ETA: 1:08 - loss: 1.2379 - regression_loss: 1.0532 - classification_loss: 0.1847 228/500 [============>.................] - ETA: 1:08 - loss: 1.2379 - regression_loss: 1.0532 - classification_loss: 0.1847 229/500 [============>.................] - ETA: 1:08 - loss: 1.2367 - regression_loss: 1.0521 - classification_loss: 0.1846 230/500 [============>.................] - ETA: 1:07 - loss: 1.2356 - regression_loss: 1.0513 - classification_loss: 0.1843 231/500 [============>.................] - ETA: 1:07 - loss: 1.2327 - regression_loss: 1.0486 - classification_loss: 0.1841 232/500 [============>.................] - ETA: 1:07 - loss: 1.2336 - regression_loss: 1.0495 - classification_loss: 0.1841 233/500 [============>.................] - ETA: 1:07 - loss: 1.2337 - regression_loss: 1.0495 - classification_loss: 0.1842 234/500 [=============>................] - ETA: 1:06 - loss: 1.2357 - regression_loss: 1.0510 - classification_loss: 0.1846 235/500 [=============>................] - ETA: 1:06 - loss: 1.2333 - regression_loss: 1.0490 - classification_loss: 0.1843 236/500 [=============>................] - ETA: 1:06 - loss: 1.2328 - regression_loss: 1.0486 - classification_loss: 0.1843 237/500 [=============>................] 
- ETA: 1:06 - loss: 1.2336 - regression_loss: 1.0492 - classification_loss: 0.1843 238/500 [=============>................] - ETA: 1:05 - loss: 1.2338 - regression_loss: 1.0498 - classification_loss: 0.1839 239/500 [=============>................] - ETA: 1:05 - loss: 1.2330 - regression_loss: 1.0493 - classification_loss: 0.1837 240/500 [=============>................] - ETA: 1:05 - loss: 1.2339 - regression_loss: 1.0498 - classification_loss: 0.1841 241/500 [=============>................] - ETA: 1:05 - loss: 1.2327 - regression_loss: 1.0488 - classification_loss: 0.1839 242/500 [=============>................] - ETA: 1:04 - loss: 1.2354 - regression_loss: 1.0508 - classification_loss: 0.1846 243/500 [=============>................] - ETA: 1:04 - loss: 1.2362 - regression_loss: 1.0516 - classification_loss: 0.1846 244/500 [=============>................] - ETA: 1:04 - loss: 1.2343 - regression_loss: 1.0500 - classification_loss: 0.1843 245/500 [=============>................] - ETA: 1:04 - loss: 1.2360 - regression_loss: 1.0516 - classification_loss: 0.1844 246/500 [=============>................] - ETA: 1:03 - loss: 1.2327 - regression_loss: 1.0488 - classification_loss: 0.1838 247/500 [=============>................] - ETA: 1:03 - loss: 1.2297 - regression_loss: 1.0464 - classification_loss: 0.1833 248/500 [=============>................] - ETA: 1:03 - loss: 1.2309 - regression_loss: 1.0475 - classification_loss: 0.1835 249/500 [=============>................] - ETA: 1:03 - loss: 1.2330 - regression_loss: 1.0490 - classification_loss: 0.1840 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2297 - regression_loss: 1.0464 - classification_loss: 0.1833 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2316 - regression_loss: 1.0479 - classification_loss: 0.1837 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2320 - regression_loss: 1.0486 - classification_loss: 0.1834 253/500 [==============>...............] 
- ETA: 1:02 - loss: 1.2335 - regression_loss: 1.0499 - classification_loss: 0.1836 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2352 - regression_loss: 1.0511 - classification_loss: 0.1840 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2327 - regression_loss: 1.0492 - classification_loss: 0.1835 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2350 - regression_loss: 1.0513 - classification_loss: 0.1837 257/500 [==============>...............] - ETA: 1:01 - loss: 1.2366 - regression_loss: 1.0521 - classification_loss: 0.1845 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2377 - regression_loss: 1.0524 - classification_loss: 0.1853 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2395 - regression_loss: 1.0539 - classification_loss: 0.1856 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2399 - regression_loss: 1.0542 - classification_loss: 0.1857 261/500 [==============>...............] - ETA: 1:00 - loss: 1.2408 - regression_loss: 1.0549 - classification_loss: 0.1859 262/500 [==============>...............] - ETA: 59s - loss: 1.2396 - regression_loss: 1.0541 - classification_loss: 0.1856  263/500 [==============>...............] - ETA: 59s - loss: 1.2388 - regression_loss: 1.0535 - classification_loss: 0.1854 264/500 [==============>...............] - ETA: 59s - loss: 1.2386 - regression_loss: 1.0534 - classification_loss: 0.1852 265/500 [==============>...............] - ETA: 58s - loss: 1.2384 - regression_loss: 1.0533 - classification_loss: 0.1851 266/500 [==============>...............] - ETA: 58s - loss: 1.2402 - regression_loss: 1.0545 - classification_loss: 0.1857 267/500 [===============>..............] - ETA: 58s - loss: 1.2394 - regression_loss: 1.0537 - classification_loss: 0.1857 268/500 [===============>..............] - ETA: 58s - loss: 1.2404 - regression_loss: 1.0546 - classification_loss: 0.1858 269/500 [===============>..............] 
- ETA: 57s - loss: 1.2408 - regression_loss: 1.0552 - classification_loss: 0.1856 270/500 [===============>..............] - ETA: 57s - loss: 1.2410 - regression_loss: 1.0556 - classification_loss: 0.1855 271/500 [===============>..............] - ETA: 57s - loss: 1.2379 - regression_loss: 1.0530 - classification_loss: 0.1849 272/500 [===============>..............] - ETA: 57s - loss: 1.2371 - regression_loss: 1.0522 - classification_loss: 0.1849 273/500 [===============>..............] - ETA: 56s - loss: 1.2371 - regression_loss: 1.0520 - classification_loss: 0.1851 274/500 [===============>..............] - ETA: 56s - loss: 1.2400 - regression_loss: 1.0545 - classification_loss: 0.1855 275/500 [===============>..............] - ETA: 56s - loss: 1.2382 - regression_loss: 1.0531 - classification_loss: 0.1851 276/500 [===============>..............] - ETA: 55s - loss: 1.2369 - regression_loss: 1.0515 - classification_loss: 0.1854 277/500 [===============>..............] - ETA: 55s - loss: 1.2349 - regression_loss: 1.0498 - classification_loss: 0.1851 278/500 [===============>..............] - ETA: 55s - loss: 1.2357 - regression_loss: 1.0506 - classification_loss: 0.1851 279/500 [===============>..............] - ETA: 55s - loss: 1.2351 - regression_loss: 1.0500 - classification_loss: 0.1851 280/500 [===============>..............] - ETA: 54s - loss: 1.2369 - regression_loss: 1.0514 - classification_loss: 0.1856 281/500 [===============>..............] - ETA: 54s - loss: 1.2377 - regression_loss: 1.0520 - classification_loss: 0.1857 282/500 [===============>..............] - ETA: 54s - loss: 1.2396 - regression_loss: 1.0537 - classification_loss: 0.1859 283/500 [===============>..............] - ETA: 54s - loss: 1.2383 - regression_loss: 1.0527 - classification_loss: 0.1856 284/500 [================>.............] - ETA: 53s - loss: 1.2390 - regression_loss: 1.0532 - classification_loss: 0.1857 285/500 [================>.............] 
- ETA: 53s - loss: 1.2393 - regression_loss: 1.0533 - classification_loss: 0.1859 286/500 [================>.............] - ETA: 53s - loss: 1.2385 - regression_loss: 1.0528 - classification_loss: 0.1857 287/500 [================>.............] - ETA: 53s - loss: 1.2378 - regression_loss: 1.0524 - classification_loss: 0.1854 288/500 [================>.............] - ETA: 52s - loss: 1.2376 - regression_loss: 1.0524 - classification_loss: 0.1852 289/500 [================>.............] - ETA: 52s - loss: 1.2363 - regression_loss: 1.0515 - classification_loss: 0.1848 290/500 [================>.............] - ETA: 52s - loss: 1.2366 - regression_loss: 1.0517 - classification_loss: 0.1849 291/500 [================>.............] - ETA: 52s - loss: 1.2382 - regression_loss: 1.0529 - classification_loss: 0.1853 292/500 [================>.............] - ETA: 51s - loss: 1.2409 - regression_loss: 1.0553 - classification_loss: 0.1855 293/500 [================>.............] - ETA: 51s - loss: 1.2380 - regression_loss: 1.0530 - classification_loss: 0.1850 294/500 [================>.............] - ETA: 51s - loss: 1.2365 - regression_loss: 1.0517 - classification_loss: 0.1848 295/500 [================>.............] - ETA: 51s - loss: 1.2359 - regression_loss: 1.0512 - classification_loss: 0.1846 296/500 [================>.............] - ETA: 50s - loss: 1.2364 - regression_loss: 1.0514 - classification_loss: 0.1850 297/500 [================>.............] - ETA: 50s - loss: 1.2354 - regression_loss: 1.0507 - classification_loss: 0.1848 298/500 [================>.............] - ETA: 50s - loss: 1.2343 - regression_loss: 1.0496 - classification_loss: 0.1846 299/500 [================>.............] - ETA: 50s - loss: 1.2348 - regression_loss: 1.0499 - classification_loss: 0.1849 300/500 [=================>............] - ETA: 49s - loss: 1.2333 - regression_loss: 1.0485 - classification_loss: 0.1847 301/500 [=================>............] 
- ETA: 49s - loss: 1.2310 - regression_loss: 1.0466 - classification_loss: 0.1844 302/500 [=================>............] - ETA: 49s - loss: 1.2319 - regression_loss: 1.0472 - classification_loss: 0.1847 303/500 [=================>............] - ETA: 49s - loss: 1.2316 - regression_loss: 1.0469 - classification_loss: 0.1847 304/500 [=================>............] - ETA: 48s - loss: 1.2298 - regression_loss: 1.0455 - classification_loss: 0.1843 305/500 [=================>............] - ETA: 48s - loss: 1.2317 - regression_loss: 1.0471 - classification_loss: 0.1846 306/500 [=================>............] - ETA: 48s - loss: 1.2305 - regression_loss: 1.0460 - classification_loss: 0.1845 307/500 [=================>............] - ETA: 48s - loss: 1.2288 - regression_loss: 1.0445 - classification_loss: 0.1843 308/500 [=================>............] - ETA: 47s - loss: 1.2279 - regression_loss: 1.0438 - classification_loss: 0.1841 309/500 [=================>............] - ETA: 47s - loss: 1.2273 - regression_loss: 1.0435 - classification_loss: 0.1838 310/500 [=================>............] - ETA: 47s - loss: 1.2263 - regression_loss: 1.0426 - classification_loss: 0.1836 311/500 [=================>............] - ETA: 47s - loss: 1.2240 - regression_loss: 1.0405 - classification_loss: 0.1835 312/500 [=================>............] - ETA: 46s - loss: 1.2251 - regression_loss: 1.0414 - classification_loss: 0.1836 313/500 [=================>............] - ETA: 46s - loss: 1.2259 - regression_loss: 1.0421 - classification_loss: 0.1838 314/500 [=================>............] - ETA: 46s - loss: 1.2274 - regression_loss: 1.0433 - classification_loss: 0.1842 315/500 [=================>............] - ETA: 46s - loss: 1.2272 - regression_loss: 1.0431 - classification_loss: 0.1841 316/500 [=================>............] - ETA: 45s - loss: 1.2287 - regression_loss: 1.0443 - classification_loss: 0.1844 317/500 [==================>...........] 
- ETA: 45s - loss: 1.2286 - regression_loss: 1.0442 - classification_loss: 0.1844 318/500 [==================>...........] - ETA: 45s - loss: 1.2292 - regression_loss: 1.0448 - classification_loss: 0.1844 319/500 [==================>...........] - ETA: 45s - loss: 1.2272 - regression_loss: 1.0432 - classification_loss: 0.1840 320/500 [==================>...........] - ETA: 44s - loss: 1.2265 - regression_loss: 1.0427 - classification_loss: 0.1838 321/500 [==================>...........] - ETA: 44s - loss: 1.2248 - regression_loss: 1.0414 - classification_loss: 0.1834 322/500 [==================>...........] - ETA: 44s - loss: 1.2268 - regression_loss: 1.0431 - classification_loss: 0.1837 323/500 [==================>...........] - ETA: 44s - loss: 1.2254 - regression_loss: 1.0418 - classification_loss: 0.1837 324/500 [==================>...........] - ETA: 43s - loss: 1.2265 - regression_loss: 1.0426 - classification_loss: 0.1839 325/500 [==================>...........] - ETA: 43s - loss: 1.2272 - regression_loss: 1.0430 - classification_loss: 0.1842 326/500 [==================>...........] - ETA: 43s - loss: 1.2265 - regression_loss: 1.0425 - classification_loss: 0.1840 327/500 [==================>...........] - ETA: 43s - loss: 1.2271 - regression_loss: 1.0431 - classification_loss: 0.1841 328/500 [==================>...........] - ETA: 42s - loss: 1.2257 - regression_loss: 1.0420 - classification_loss: 0.1838 329/500 [==================>...........] - ETA: 42s - loss: 1.2260 - regression_loss: 1.0421 - classification_loss: 0.1838 330/500 [==================>...........] - ETA: 42s - loss: 1.2247 - regression_loss: 1.0411 - classification_loss: 0.1836 331/500 [==================>...........] - ETA: 42s - loss: 1.2234 - regression_loss: 1.0399 - classification_loss: 0.1834 332/500 [==================>...........] - ETA: 41s - loss: 1.2228 - regression_loss: 1.0396 - classification_loss: 0.1832 333/500 [==================>...........] 
- ETA: 41s - loss: 1.2230 - regression_loss: 1.0397 - classification_loss: 0.1833 334/500 [===================>..........] - ETA: 41s - loss: 1.2237 - regression_loss: 1.0404 - classification_loss: 0.1833 335/500 [===================>..........] - ETA: 41s - loss: 1.2234 - regression_loss: 1.0401 - classification_loss: 0.1834 336/500 [===================>..........] - ETA: 40s - loss: 1.2245 - regression_loss: 1.0410 - classification_loss: 0.1835 337/500 [===================>..........] - ETA: 40s - loss: 1.2260 - regression_loss: 1.0424 - classification_loss: 0.1836 338/500 [===================>..........] - ETA: 40s - loss: 1.2253 - regression_loss: 1.0419 - classification_loss: 0.1834 339/500 [===================>..........] - ETA: 40s - loss: 1.2235 - regression_loss: 1.0404 - classification_loss: 0.1831 340/500 [===================>..........] - ETA: 39s - loss: 1.2226 - regression_loss: 1.0397 - classification_loss: 0.1829 341/500 [===================>..........] - ETA: 39s - loss: 1.2236 - regression_loss: 1.0406 - classification_loss: 0.1829 342/500 [===================>..........] - ETA: 39s - loss: 1.2244 - regression_loss: 1.0414 - classification_loss: 0.1830 343/500 [===================>..........] - ETA: 39s - loss: 1.2256 - regression_loss: 1.0424 - classification_loss: 0.1832 344/500 [===================>..........] - ETA: 38s - loss: 1.2259 - regression_loss: 1.0428 - classification_loss: 0.1831 345/500 [===================>..........] - ETA: 38s - loss: 1.2266 - regression_loss: 1.0434 - classification_loss: 0.1833 346/500 [===================>..........] - ETA: 38s - loss: 1.2284 - regression_loss: 1.0446 - classification_loss: 0.1838 347/500 [===================>..........] - ETA: 38s - loss: 1.2276 - regression_loss: 1.0438 - classification_loss: 0.1838 348/500 [===================>..........] - ETA: 37s - loss: 1.2300 - regression_loss: 1.0456 - classification_loss: 0.1845 349/500 [===================>..........] 
- ETA: 37s - loss: 1.2304 - regression_loss: 1.0459 - classification_loss: 0.1845 350/500 [====================>.........] - ETA: 37s - loss: 1.2312 - regression_loss: 1.0466 - classification_loss: 0.1846 351/500 [====================>.........] - ETA: 37s - loss: 1.2328 - regression_loss: 1.0477 - classification_loss: 0.1851 352/500 [====================>.........] - ETA: 36s - loss: 1.2334 - regression_loss: 1.0483 - classification_loss: 0.1851 353/500 [====================>.........] - ETA: 36s - loss: 1.2332 - regression_loss: 1.0482 - classification_loss: 0.1850 354/500 [====================>.........] - ETA: 36s - loss: 1.2339 - regression_loss: 1.0487 - classification_loss: 0.1852 355/500 [====================>.........] - ETA: 36s - loss: 1.2357 - regression_loss: 1.0500 - classification_loss: 0.1857 356/500 [====================>.........] - ETA: 35s - loss: 1.2366 - regression_loss: 1.0506 - classification_loss: 0.1860 357/500 [====================>.........] - ETA: 35s - loss: 1.2346 - regression_loss: 1.0490 - classification_loss: 0.1856 358/500 [====================>.........] - ETA: 35s - loss: 1.2358 - regression_loss: 1.0500 - classification_loss: 0.1858 359/500 [====================>.........] - ETA: 35s - loss: 1.2368 - regression_loss: 1.0508 - classification_loss: 0.1860 360/500 [====================>.........] - ETA: 34s - loss: 1.2370 - regression_loss: 1.0509 - classification_loss: 0.1861 361/500 [====================>.........] - ETA: 34s - loss: 1.2368 - regression_loss: 1.0508 - classification_loss: 0.1860 362/500 [====================>.........] - ETA: 34s - loss: 1.2351 - regression_loss: 1.0496 - classification_loss: 0.1856 363/500 [====================>.........] - ETA: 34s - loss: 1.2356 - regression_loss: 1.0499 - classification_loss: 0.1857 364/500 [====================>.........] - ETA: 33s - loss: 1.2352 - regression_loss: 1.0495 - classification_loss: 0.1856 365/500 [====================>.........] 
- ETA: 33s - loss: 1.2369 - regression_loss: 1.0511 - classification_loss: 0.1858
[... per-batch progress updates for batches 366-499 of Epoch 133 omitted; loss held steady around 1.23 (regression ~1.04, classification ~0.18) ...]
500/500 [==============================] - 125s 251ms/step - loss: 1.2261 - regression_loss: 1.0427 - classification_loss: 0.1834
1172 instances of class plum with average precision: 0.6994
mAP: 0.6994
Epoch 00133: saving model to ./training/snapshots/resnet50_pascal_133.h5
Epoch 134/150
[... per-batch progress updates for batches 1-199 of Epoch 134 omitted; loss settled around 1.22 (regression ~1.04, classification ~0.19) ...]
200/500 [===========>..................]
- ETA: 1:15 - loss: 1.2229 - regression_loss: 1.0369 - classification_loss: 0.1860 201/500 [===========>..................] - ETA: 1:15 - loss: 1.2256 - regression_loss: 1.0392 - classification_loss: 0.1864 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2248 - regression_loss: 1.0387 - classification_loss: 0.1861 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2269 - regression_loss: 1.0405 - classification_loss: 0.1864 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2263 - regression_loss: 1.0401 - classification_loss: 0.1862 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2252 - regression_loss: 1.0394 - classification_loss: 0.1857 206/500 [===========>..................] - ETA: 1:13 - loss: 1.2261 - regression_loss: 1.0404 - classification_loss: 0.1856 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2227 - regression_loss: 1.0376 - classification_loss: 0.1851 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2239 - regression_loss: 1.0386 - classification_loss: 0.1852 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2271 - regression_loss: 1.0412 - classification_loss: 0.1859 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2280 - regression_loss: 1.0420 - classification_loss: 0.1861 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2253 - regression_loss: 1.0396 - classification_loss: 0.1857 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2250 - regression_loss: 1.0394 - classification_loss: 0.1856 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2222 - regression_loss: 1.0367 - classification_loss: 0.1854 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2212 - regression_loss: 1.0356 - classification_loss: 0.1856 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2212 - regression_loss: 1.0354 - classification_loss: 0.1858 216/500 [===========>..................] 
- ETA: 1:11 - loss: 1.2222 - regression_loss: 1.0363 - classification_loss: 0.1859 217/500 [============>.................] - ETA: 1:10 - loss: 1.2216 - regression_loss: 1.0360 - classification_loss: 0.1857 218/500 [============>.................] - ETA: 1:10 - loss: 1.2241 - regression_loss: 1.0379 - classification_loss: 0.1862 219/500 [============>.................] - ETA: 1:10 - loss: 1.2201 - regression_loss: 1.0346 - classification_loss: 0.1856 220/500 [============>.................] - ETA: 1:10 - loss: 1.2209 - regression_loss: 1.0352 - classification_loss: 0.1857 221/500 [============>.................] - ETA: 1:09 - loss: 1.2194 - regression_loss: 1.0338 - classification_loss: 0.1856 222/500 [============>.................] - ETA: 1:09 - loss: 1.2197 - regression_loss: 1.0339 - classification_loss: 0.1858 223/500 [============>.................] - ETA: 1:09 - loss: 1.2184 - regression_loss: 1.0324 - classification_loss: 0.1859 224/500 [============>.................] - ETA: 1:09 - loss: 1.2167 - regression_loss: 1.0312 - classification_loss: 0.1855 225/500 [============>.................] - ETA: 1:08 - loss: 1.2174 - regression_loss: 1.0320 - classification_loss: 0.1854 226/500 [============>.................] - ETA: 1:08 - loss: 1.2175 - regression_loss: 1.0320 - classification_loss: 0.1855 227/500 [============>.................] - ETA: 1:08 - loss: 1.2175 - regression_loss: 1.0321 - classification_loss: 0.1854 228/500 [============>.................] - ETA: 1:08 - loss: 1.2140 - regression_loss: 1.0292 - classification_loss: 0.1848 229/500 [============>.................] - ETA: 1:07 - loss: 1.2148 - regression_loss: 1.0300 - classification_loss: 0.1848 230/500 [============>.................] - ETA: 1:07 - loss: 1.2133 - regression_loss: 1.0288 - classification_loss: 0.1845 231/500 [============>.................] - ETA: 1:07 - loss: 1.2144 - regression_loss: 1.0299 - classification_loss: 0.1845 232/500 [============>.................] 
- ETA: 1:07 - loss: 1.2128 - regression_loss: 1.0282 - classification_loss: 0.1846 233/500 [============>.................] - ETA: 1:06 - loss: 1.2101 - regression_loss: 1.0261 - classification_loss: 0.1841 234/500 [=============>................] - ETA: 1:06 - loss: 1.2123 - regression_loss: 1.0276 - classification_loss: 0.1847 235/500 [=============>................] - ETA: 1:06 - loss: 1.2147 - regression_loss: 1.0297 - classification_loss: 0.1850 236/500 [=============>................] - ETA: 1:06 - loss: 1.2128 - regression_loss: 1.0281 - classification_loss: 0.1847 237/500 [=============>................] - ETA: 1:05 - loss: 1.2178 - regression_loss: 1.0323 - classification_loss: 0.1855 238/500 [=============>................] - ETA: 1:05 - loss: 1.2181 - regression_loss: 1.0326 - classification_loss: 0.1855 239/500 [=============>................] - ETA: 1:05 - loss: 1.2211 - regression_loss: 1.0351 - classification_loss: 0.1860 240/500 [=============>................] - ETA: 1:05 - loss: 1.2214 - regression_loss: 1.0352 - classification_loss: 0.1861 241/500 [=============>................] - ETA: 1:04 - loss: 1.2206 - regression_loss: 1.0346 - classification_loss: 0.1860 242/500 [=============>................] - ETA: 1:04 - loss: 1.2191 - regression_loss: 1.0335 - classification_loss: 0.1856 243/500 [=============>................] - ETA: 1:04 - loss: 1.2161 - regression_loss: 1.0310 - classification_loss: 0.1851 244/500 [=============>................] - ETA: 1:04 - loss: 1.2154 - regression_loss: 1.0307 - classification_loss: 0.1847 245/500 [=============>................] - ETA: 1:03 - loss: 1.2165 - regression_loss: 1.0318 - classification_loss: 0.1847 246/500 [=============>................] - ETA: 1:03 - loss: 1.2180 - regression_loss: 1.0331 - classification_loss: 0.1849 247/500 [=============>................] - ETA: 1:03 - loss: 1.2209 - regression_loss: 1.0353 - classification_loss: 0.1856 248/500 [=============>................] 
- ETA: 1:03 - loss: 1.2242 - regression_loss: 1.0377 - classification_loss: 0.1865 249/500 [=============>................] - ETA: 1:02 - loss: 1.2247 - regression_loss: 1.0381 - classification_loss: 0.1866 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2246 - regression_loss: 1.0380 - classification_loss: 0.1866 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2242 - regression_loss: 1.0378 - classification_loss: 0.1864 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2262 - regression_loss: 1.0391 - classification_loss: 0.1870 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2238 - regression_loss: 1.0372 - classification_loss: 0.1866 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2261 - regression_loss: 1.0391 - classification_loss: 0.1870 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2235 - regression_loss: 1.0372 - classification_loss: 0.1864 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2239 - regression_loss: 1.0375 - classification_loss: 0.1864 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2268 - regression_loss: 1.0397 - classification_loss: 0.1871 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2269 - regression_loss: 1.0398 - classification_loss: 0.1871 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2280 - regression_loss: 1.0406 - classification_loss: 0.1874 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2274 - regression_loss: 1.0399 - classification_loss: 0.1874 261/500 [==============>...............] - ETA: 59s - loss: 1.2293 - regression_loss: 1.0416 - classification_loss: 0.1878  262/500 [==============>...............] - ETA: 59s - loss: 1.2278 - regression_loss: 1.0404 - classification_loss: 0.1874 263/500 [==============>...............] - ETA: 59s - loss: 1.2271 - regression_loss: 1.0398 - classification_loss: 0.1873 264/500 [==============>...............] 
- ETA: 59s - loss: 1.2295 - regression_loss: 1.0417 - classification_loss: 0.1878 265/500 [==============>...............] - ETA: 58s - loss: 1.2302 - regression_loss: 1.0420 - classification_loss: 0.1882 266/500 [==============>...............] - ETA: 58s - loss: 1.2321 - regression_loss: 1.0436 - classification_loss: 0.1884 267/500 [===============>..............] - ETA: 58s - loss: 1.2309 - regression_loss: 1.0427 - classification_loss: 0.1881 268/500 [===============>..............] - ETA: 58s - loss: 1.2306 - regression_loss: 1.0425 - classification_loss: 0.1882 269/500 [===============>..............] - ETA: 57s - loss: 1.2303 - regression_loss: 1.0422 - classification_loss: 0.1880 270/500 [===============>..............] - ETA: 57s - loss: 1.2315 - regression_loss: 1.0435 - classification_loss: 0.1881 271/500 [===============>..............] - ETA: 57s - loss: 1.2331 - regression_loss: 1.0448 - classification_loss: 0.1883 272/500 [===============>..............] - ETA: 57s - loss: 1.2346 - regression_loss: 1.0460 - classification_loss: 0.1886 273/500 [===============>..............] - ETA: 56s - loss: 1.2353 - regression_loss: 1.0466 - classification_loss: 0.1886 274/500 [===============>..............] - ETA: 56s - loss: 1.2371 - regression_loss: 1.0481 - classification_loss: 0.1890 275/500 [===============>..............] - ETA: 56s - loss: 1.2381 - regression_loss: 1.0491 - classification_loss: 0.1890 276/500 [===============>..............] - ETA: 56s - loss: 1.2359 - regression_loss: 1.0473 - classification_loss: 0.1886 277/500 [===============>..............] - ETA: 55s - loss: 1.2386 - regression_loss: 1.0493 - classification_loss: 0.1893 278/500 [===============>..............] - ETA: 55s - loss: 1.2383 - regression_loss: 1.0493 - classification_loss: 0.1890 279/500 [===============>..............] - ETA: 55s - loss: 1.2381 - regression_loss: 1.0492 - classification_loss: 0.1889 280/500 [===============>..............] 
- ETA: 55s - loss: 1.2361 - regression_loss: 1.0475 - classification_loss: 0.1886 281/500 [===============>..............] - ETA: 54s - loss: 1.2372 - regression_loss: 1.0484 - classification_loss: 0.1888 282/500 [===============>..............] - ETA: 54s - loss: 1.2400 - regression_loss: 1.0507 - classification_loss: 0.1892 283/500 [===============>..............] - ETA: 54s - loss: 1.2415 - regression_loss: 1.0522 - classification_loss: 0.1893 284/500 [================>.............] - ETA: 54s - loss: 1.2432 - regression_loss: 1.0536 - classification_loss: 0.1896 285/500 [================>.............] - ETA: 53s - loss: 1.2432 - regression_loss: 1.0536 - classification_loss: 0.1897 286/500 [================>.............] - ETA: 53s - loss: 1.2438 - regression_loss: 1.0541 - classification_loss: 0.1896 287/500 [================>.............] - ETA: 53s - loss: 1.2425 - regression_loss: 1.0524 - classification_loss: 0.1901 288/500 [================>.............] - ETA: 53s - loss: 1.2423 - regression_loss: 1.0524 - classification_loss: 0.1899 289/500 [================>.............] - ETA: 52s - loss: 1.2393 - regression_loss: 1.0498 - classification_loss: 0.1895 290/500 [================>.............] - ETA: 52s - loss: 1.2402 - regression_loss: 1.0506 - classification_loss: 0.1896 291/500 [================>.............] - ETA: 52s - loss: 1.2408 - regression_loss: 1.0511 - classification_loss: 0.1898 292/500 [================>.............] - ETA: 52s - loss: 1.2392 - regression_loss: 1.0498 - classification_loss: 0.1894 293/500 [================>.............] - ETA: 51s - loss: 1.2398 - regression_loss: 1.0503 - classification_loss: 0.1895 294/500 [================>.............] - ETA: 51s - loss: 1.2416 - regression_loss: 1.0518 - classification_loss: 0.1898 295/500 [================>.............] - ETA: 51s - loss: 1.2420 - regression_loss: 1.0523 - classification_loss: 0.1896 296/500 [================>.............] 
- ETA: 51s - loss: 1.2411 - regression_loss: 1.0516 - classification_loss: 0.1896 297/500 [================>.............] - ETA: 50s - loss: 1.2421 - regression_loss: 1.0521 - classification_loss: 0.1899 298/500 [================>.............] - ETA: 50s - loss: 1.2423 - regression_loss: 1.0524 - classification_loss: 0.1899 299/500 [================>.............] - ETA: 50s - loss: 1.2435 - regression_loss: 1.0532 - classification_loss: 0.1903 300/500 [=================>............] - ETA: 50s - loss: 1.2436 - regression_loss: 1.0534 - classification_loss: 0.1902 301/500 [=================>............] - ETA: 49s - loss: 1.2447 - regression_loss: 1.0543 - classification_loss: 0.1904 302/500 [=================>............] - ETA: 49s - loss: 1.2428 - regression_loss: 1.0527 - classification_loss: 0.1901 303/500 [=================>............] - ETA: 49s - loss: 1.2420 - regression_loss: 1.0519 - classification_loss: 0.1901 304/500 [=================>............] - ETA: 49s - loss: 1.2411 - regression_loss: 1.0513 - classification_loss: 0.1898 305/500 [=================>............] - ETA: 48s - loss: 1.2416 - regression_loss: 1.0518 - classification_loss: 0.1899 306/500 [=================>............] - ETA: 48s - loss: 1.2427 - regression_loss: 1.0524 - classification_loss: 0.1903 307/500 [=================>............] - ETA: 48s - loss: 1.2415 - regression_loss: 1.0516 - classification_loss: 0.1900 308/500 [=================>............] - ETA: 48s - loss: 1.2424 - regression_loss: 1.0523 - classification_loss: 0.1901 309/500 [=================>............] - ETA: 47s - loss: 1.2410 - regression_loss: 1.0514 - classification_loss: 0.1897 310/500 [=================>............] - ETA: 47s - loss: 1.2396 - regression_loss: 1.0501 - classification_loss: 0.1894 311/500 [=================>............] - ETA: 47s - loss: 1.2382 - regression_loss: 1.0492 - classification_loss: 0.1890 312/500 [=================>............] 
- ETA: 47s - loss: 1.2380 - regression_loss: 1.0490 - classification_loss: 0.1890 313/500 [=================>............] - ETA: 46s - loss: 1.2376 - regression_loss: 1.0485 - classification_loss: 0.1891 314/500 [=================>............] - ETA: 46s - loss: 1.2369 - regression_loss: 1.0480 - classification_loss: 0.1888 315/500 [=================>............] - ETA: 46s - loss: 1.2369 - regression_loss: 1.0482 - classification_loss: 0.1887 316/500 [=================>............] - ETA: 46s - loss: 1.2385 - regression_loss: 1.0496 - classification_loss: 0.1889 317/500 [==================>...........] - ETA: 45s - loss: 1.2386 - regression_loss: 1.0498 - classification_loss: 0.1888 318/500 [==================>...........] - ETA: 45s - loss: 1.2367 - regression_loss: 1.0483 - classification_loss: 0.1884 319/500 [==================>...........] - ETA: 45s - loss: 1.2373 - regression_loss: 1.0488 - classification_loss: 0.1884 320/500 [==================>...........] - ETA: 45s - loss: 1.2383 - regression_loss: 1.0497 - classification_loss: 0.1886 321/500 [==================>...........] - ETA: 44s - loss: 1.2352 - regression_loss: 1.0472 - classification_loss: 0.1881 322/500 [==================>...........] - ETA: 44s - loss: 1.2362 - regression_loss: 1.0476 - classification_loss: 0.1886 323/500 [==================>...........] - ETA: 44s - loss: 1.2379 - regression_loss: 1.0490 - classification_loss: 0.1889 324/500 [==================>...........] - ETA: 44s - loss: 1.2372 - regression_loss: 1.0485 - classification_loss: 0.1887 325/500 [==================>...........] - ETA: 43s - loss: 1.2385 - regression_loss: 1.0495 - classification_loss: 0.1890 326/500 [==================>...........] - ETA: 43s - loss: 1.2388 - regression_loss: 1.0499 - classification_loss: 0.1889 327/500 [==================>...........] - ETA: 43s - loss: 1.2409 - regression_loss: 1.0515 - classification_loss: 0.1894 328/500 [==================>...........] 
- ETA: 43s - loss: 1.2393 - regression_loss: 1.0502 - classification_loss: 0.1890 329/500 [==================>...........] - ETA: 42s - loss: 1.2372 - regression_loss: 1.0484 - classification_loss: 0.1888 330/500 [==================>...........] - ETA: 42s - loss: 1.2362 - regression_loss: 1.0476 - classification_loss: 0.1886 331/500 [==================>...........] - ETA: 42s - loss: 1.2375 - regression_loss: 1.0487 - classification_loss: 0.1887 332/500 [==================>...........] - ETA: 42s - loss: 1.2385 - regression_loss: 1.0496 - classification_loss: 0.1889 333/500 [==================>...........] - ETA: 41s - loss: 1.2392 - regression_loss: 1.0502 - classification_loss: 0.1890 334/500 [===================>..........] - ETA: 41s - loss: 1.2397 - regression_loss: 1.0508 - classification_loss: 0.1889 335/500 [===================>..........] - ETA: 41s - loss: 1.2404 - regression_loss: 1.0513 - classification_loss: 0.1890 336/500 [===================>..........] - ETA: 41s - loss: 1.2401 - regression_loss: 1.0514 - classification_loss: 0.1887 337/500 [===================>..........] - ETA: 40s - loss: 1.2411 - regression_loss: 1.0521 - classification_loss: 0.1890 338/500 [===================>..........] - ETA: 40s - loss: 1.2419 - regression_loss: 1.0526 - classification_loss: 0.1892 339/500 [===================>..........] - ETA: 40s - loss: 1.2407 - regression_loss: 1.0518 - classification_loss: 0.1889 340/500 [===================>..........] - ETA: 40s - loss: 1.2407 - regression_loss: 1.0517 - classification_loss: 0.1889 341/500 [===================>..........] - ETA: 39s - loss: 1.2407 - regression_loss: 1.0517 - classification_loss: 0.1889 342/500 [===================>..........] - ETA: 39s - loss: 1.2393 - regression_loss: 1.0505 - classification_loss: 0.1888 343/500 [===================>..........] - ETA: 39s - loss: 1.2369 - regression_loss: 1.0485 - classification_loss: 0.1884 344/500 [===================>..........] 
- ETA: 39s - loss: 1.2377 - regression_loss: 1.0492 - classification_loss: 0.1886 345/500 [===================>..........] - ETA: 38s - loss: 1.2355 - regression_loss: 1.0472 - classification_loss: 0.1883 346/500 [===================>..........] - ETA: 38s - loss: 1.2341 - regression_loss: 1.0462 - classification_loss: 0.1879 347/500 [===================>..........] - ETA: 38s - loss: 1.2344 - regression_loss: 1.0464 - classification_loss: 0.1880 348/500 [===================>..........] - ETA: 38s - loss: 1.2330 - regression_loss: 1.0453 - classification_loss: 0.1878 349/500 [===================>..........] - ETA: 37s - loss: 1.2346 - regression_loss: 1.0464 - classification_loss: 0.1882 350/500 [====================>.........] - ETA: 37s - loss: 1.2360 - regression_loss: 1.0475 - classification_loss: 0.1885 351/500 [====================>.........] - ETA: 37s - loss: 1.2338 - regression_loss: 1.0457 - classification_loss: 0.1881 352/500 [====================>.........] - ETA: 37s - loss: 1.2319 - regression_loss: 1.0443 - classification_loss: 0.1876 353/500 [====================>.........] - ETA: 36s - loss: 1.2316 - regression_loss: 1.0439 - classification_loss: 0.1877 354/500 [====================>.........] - ETA: 36s - loss: 1.2323 - regression_loss: 1.0443 - classification_loss: 0.1880 355/500 [====================>.........] - ETA: 36s - loss: 1.2333 - regression_loss: 1.0451 - classification_loss: 0.1883 356/500 [====================>.........] - ETA: 36s - loss: 1.2333 - regression_loss: 1.0451 - classification_loss: 0.1882 357/500 [====================>.........] - ETA: 35s - loss: 1.2327 - regression_loss: 1.0448 - classification_loss: 0.1879 358/500 [====================>.........] - ETA: 35s - loss: 1.2336 - regression_loss: 1.0455 - classification_loss: 0.1881 359/500 [====================>.........] - ETA: 35s - loss: 1.2319 - regression_loss: 1.0440 - classification_loss: 0.1879 360/500 [====================>.........] 
- ETA: 35s - loss: 1.2333 - regression_loss: 1.0453 - classification_loss: 0.1880 361/500 [====================>.........] - ETA: 34s - loss: 1.2341 - regression_loss: 1.0459 - classification_loss: 0.1883 362/500 [====================>.........] - ETA: 34s - loss: 1.2325 - regression_loss: 1.0445 - classification_loss: 0.1880 363/500 [====================>.........] - ETA: 34s - loss: 1.2316 - regression_loss: 1.0437 - classification_loss: 0.1879 364/500 [====================>.........] - ETA: 34s - loss: 1.2319 - regression_loss: 1.0438 - classification_loss: 0.1881 365/500 [====================>.........] - ETA: 33s - loss: 1.2318 - regression_loss: 1.0437 - classification_loss: 0.1881 366/500 [====================>.........] - ETA: 33s - loss: 1.2320 - regression_loss: 1.0439 - classification_loss: 0.1881 367/500 [=====================>........] - ETA: 33s - loss: 1.2318 - regression_loss: 1.0436 - classification_loss: 0.1882 368/500 [=====================>........] - ETA: 33s - loss: 1.2305 - regression_loss: 1.0423 - classification_loss: 0.1883 369/500 [=====================>........] - ETA: 32s - loss: 1.2314 - regression_loss: 1.0429 - classification_loss: 0.1884 370/500 [=====================>........] - ETA: 32s - loss: 1.2307 - regression_loss: 1.0424 - classification_loss: 0.1883 371/500 [=====================>........] - ETA: 32s - loss: 1.2299 - regression_loss: 1.0418 - classification_loss: 0.1881 372/500 [=====================>........] - ETA: 32s - loss: 1.2290 - regression_loss: 1.0411 - classification_loss: 0.1879 373/500 [=====================>........] - ETA: 31s - loss: 1.2291 - regression_loss: 1.0411 - classification_loss: 0.1879 374/500 [=====================>........] - ETA: 31s - loss: 1.2273 - regression_loss: 1.0397 - classification_loss: 0.1876 375/500 [=====================>........] - ETA: 31s - loss: 1.2272 - regression_loss: 1.0397 - classification_loss: 0.1875 376/500 [=====================>........] 
- ETA: 31s - loss: 1.2259 - regression_loss: 1.0387 - classification_loss: 0.1872 377/500 [=====================>........] - ETA: 30s - loss: 1.2247 - regression_loss: 1.0378 - classification_loss: 0.1870 378/500 [=====================>........] - ETA: 30s - loss: 1.2269 - regression_loss: 1.0399 - classification_loss: 0.1870 379/500 [=====================>........] - ETA: 30s - loss: 1.2277 - regression_loss: 1.0406 - classification_loss: 0.1870 380/500 [=====================>........] - ETA: 30s - loss: 1.2283 - regression_loss: 1.0411 - classification_loss: 0.1872 381/500 [=====================>........] - ETA: 29s - loss: 1.2292 - regression_loss: 1.0419 - classification_loss: 0.1873 382/500 [=====================>........] - ETA: 29s - loss: 1.2303 - regression_loss: 1.0428 - classification_loss: 0.1875 383/500 [=====================>........] - ETA: 29s - loss: 1.2303 - regression_loss: 1.0428 - classification_loss: 0.1875 384/500 [======================>.......] - ETA: 29s - loss: 1.2322 - regression_loss: 1.0445 - classification_loss: 0.1877 385/500 [======================>.......] - ETA: 28s - loss: 1.2322 - regression_loss: 1.0444 - classification_loss: 0.1878 386/500 [======================>.......] - ETA: 28s - loss: 1.2320 - regression_loss: 1.0443 - classification_loss: 0.1877 387/500 [======================>.......] - ETA: 28s - loss: 1.2315 - regression_loss: 1.0440 - classification_loss: 0.1874 388/500 [======================>.......] - ETA: 28s - loss: 1.2322 - regression_loss: 1.0447 - classification_loss: 0.1876 389/500 [======================>.......] - ETA: 27s - loss: 1.2322 - regression_loss: 1.0447 - classification_loss: 0.1874 390/500 [======================>.......] - ETA: 27s - loss: 1.2327 - regression_loss: 1.0453 - classification_loss: 0.1874 391/500 [======================>.......] - ETA: 27s - loss: 1.2323 - regression_loss: 1.0451 - classification_loss: 0.1873 392/500 [======================>.......] 
- ETA: 27s - loss: 1.2323 - regression_loss: 1.0450 - classification_loss: 0.1873 393/500 [======================>.......] - ETA: 26s - loss: 1.2324 - regression_loss: 1.0451 - classification_loss: 0.1873 394/500 [======================>.......] - ETA: 26s - loss: 1.2331 - regression_loss: 1.0456 - classification_loss: 0.1874 395/500 [======================>.......] - ETA: 26s - loss: 1.2333 - regression_loss: 1.0459 - classification_loss: 0.1874 396/500 [======================>.......] - ETA: 26s - loss: 1.2312 - regression_loss: 1.0441 - classification_loss: 0.1871 397/500 [======================>.......] - ETA: 25s - loss: 1.2297 - regression_loss: 1.0430 - classification_loss: 0.1868 398/500 [======================>.......] - ETA: 25s - loss: 1.2288 - regression_loss: 1.0423 - classification_loss: 0.1865 399/500 [======================>.......] - ETA: 25s - loss: 1.2298 - regression_loss: 1.0432 - classification_loss: 0.1866 400/500 [=======================>......] - ETA: 25s - loss: 1.2292 - regression_loss: 1.0426 - classification_loss: 0.1866 401/500 [=======================>......] - ETA: 24s - loss: 1.2286 - regression_loss: 1.0421 - classification_loss: 0.1865 402/500 [=======================>......] - ETA: 24s - loss: 1.2296 - regression_loss: 1.0429 - classification_loss: 0.1866 403/500 [=======================>......] - ETA: 24s - loss: 1.2292 - regression_loss: 1.0428 - classification_loss: 0.1864 404/500 [=======================>......] - ETA: 24s - loss: 1.2284 - regression_loss: 1.0423 - classification_loss: 0.1861 405/500 [=======================>......] - ETA: 23s - loss: 1.2289 - regression_loss: 1.0427 - classification_loss: 0.1863 406/500 [=======================>......] - ETA: 23s - loss: 1.2299 - regression_loss: 1.0434 - classification_loss: 0.1865 407/500 [=======================>......] - ETA: 23s - loss: 1.2309 - regression_loss: 1.0442 - classification_loss: 0.1867 408/500 [=======================>......] 
500/500 [==============================] - 125s 251ms/step - loss: 1.2377 - regression_loss: 1.0503 - classification_loss: 0.1874
1172 instances of class plum with average precision: 0.6993
mAP: 0.6993
Epoch 00134: saving model to ./training/snapshots/resnet50_pascal_134.h5
Epoch 135/150
243/500 [=============>................]
- ETA: 1:04 - loss: 1.2303 - regression_loss: 1.0451 - classification_loss: 0.1852 244/500 [=============>................] - ETA: 1:03 - loss: 1.2285 - regression_loss: 1.0436 - classification_loss: 0.1850 245/500 [=============>................] - ETA: 1:03 - loss: 1.2277 - regression_loss: 1.0430 - classification_loss: 0.1848 246/500 [=============>................] - ETA: 1:03 - loss: 1.2276 - regression_loss: 1.0429 - classification_loss: 0.1847 247/500 [=============>................] - ETA: 1:03 - loss: 1.2296 - regression_loss: 1.0447 - classification_loss: 0.1850 248/500 [=============>................] - ETA: 1:02 - loss: 1.2289 - regression_loss: 1.0441 - classification_loss: 0.1849 249/500 [=============>................] - ETA: 1:02 - loss: 1.2320 - regression_loss: 1.0461 - classification_loss: 0.1859 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2298 - regression_loss: 1.0442 - classification_loss: 0.1856 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2314 - regression_loss: 1.0455 - classification_loss: 0.1859 252/500 [==============>...............] - ETA: 1:01 - loss: 1.2294 - regression_loss: 1.0440 - classification_loss: 0.1855 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2310 - regression_loss: 1.0452 - classification_loss: 0.1858 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2312 - regression_loss: 1.0454 - classification_loss: 0.1858 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2323 - regression_loss: 1.0460 - classification_loss: 0.1863 256/500 [==============>...............] - ETA: 1:00 - loss: 1.2316 - regression_loss: 1.0455 - classification_loss: 0.1861 257/500 [==============>...............] - ETA: 1:00 - loss: 1.2312 - regression_loss: 1.0453 - classification_loss: 0.1859 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2313 - regression_loss: 1.0454 - classification_loss: 0.1859 259/500 [==============>...............] 
- ETA: 1:00 - loss: 1.2313 - regression_loss: 1.0452 - classification_loss: 0.1861 260/500 [==============>...............] - ETA: 59s - loss: 1.2314 - regression_loss: 1.0455 - classification_loss: 0.1859  261/500 [==============>...............] - ETA: 59s - loss: 1.2321 - regression_loss: 1.0462 - classification_loss: 0.1860 262/500 [==============>...............] - ETA: 59s - loss: 1.2310 - regression_loss: 1.0451 - classification_loss: 0.1859 263/500 [==============>...............] - ETA: 59s - loss: 1.2291 - regression_loss: 1.0435 - classification_loss: 0.1856 264/500 [==============>...............] - ETA: 58s - loss: 1.2318 - regression_loss: 1.0458 - classification_loss: 0.1860 265/500 [==============>...............] - ETA: 58s - loss: 1.2336 - regression_loss: 1.0472 - classification_loss: 0.1864 266/500 [==============>...............] - ETA: 58s - loss: 1.2329 - regression_loss: 1.0467 - classification_loss: 0.1862 267/500 [===============>..............] - ETA: 58s - loss: 1.2305 - regression_loss: 1.0448 - classification_loss: 0.1857 268/500 [===============>..............] - ETA: 57s - loss: 1.2298 - regression_loss: 1.0443 - classification_loss: 0.1855 269/500 [===============>..............] - ETA: 57s - loss: 1.2307 - regression_loss: 1.0451 - classification_loss: 0.1857 270/500 [===============>..............] - ETA: 57s - loss: 1.2320 - regression_loss: 1.0463 - classification_loss: 0.1857 271/500 [===============>..............] - ETA: 57s - loss: 1.2333 - regression_loss: 1.0472 - classification_loss: 0.1861 272/500 [===============>..............] - ETA: 56s - loss: 1.2346 - regression_loss: 1.0481 - classification_loss: 0.1865 273/500 [===============>..............] - ETA: 56s - loss: 1.2331 - regression_loss: 1.0467 - classification_loss: 0.1864 274/500 [===============>..............] - ETA: 56s - loss: 1.2343 - regression_loss: 1.0478 - classification_loss: 0.1866 275/500 [===============>..............] 
- ETA: 56s - loss: 1.2357 - regression_loss: 1.0489 - classification_loss: 0.1868 276/500 [===============>..............] - ETA: 55s - loss: 1.2338 - regression_loss: 1.0475 - classification_loss: 0.1864 277/500 [===============>..............] - ETA: 55s - loss: 1.2318 - regression_loss: 1.0459 - classification_loss: 0.1859 278/500 [===============>..............] - ETA: 55s - loss: 1.2317 - regression_loss: 1.0457 - classification_loss: 0.1860 279/500 [===============>..............] - ETA: 55s - loss: 1.2300 - regression_loss: 1.0443 - classification_loss: 0.1857 280/500 [===============>..............] - ETA: 54s - loss: 1.2312 - regression_loss: 1.0454 - classification_loss: 0.1858 281/500 [===============>..............] - ETA: 54s - loss: 1.2318 - regression_loss: 1.0460 - classification_loss: 0.1858 282/500 [===============>..............] - ETA: 54s - loss: 1.2305 - regression_loss: 1.0450 - classification_loss: 0.1855 283/500 [===============>..............] - ETA: 54s - loss: 1.2315 - regression_loss: 1.0459 - classification_loss: 0.1856 284/500 [================>.............] - ETA: 53s - loss: 1.2298 - regression_loss: 1.0447 - classification_loss: 0.1851 285/500 [================>.............] - ETA: 53s - loss: 1.2319 - regression_loss: 1.0464 - classification_loss: 0.1854 286/500 [================>.............] - ETA: 53s - loss: 1.2321 - regression_loss: 1.0467 - classification_loss: 0.1854 287/500 [================>.............] - ETA: 53s - loss: 1.2330 - regression_loss: 1.0470 - classification_loss: 0.1859 288/500 [================>.............] - ETA: 52s - loss: 1.2310 - regression_loss: 1.0453 - classification_loss: 0.1857 289/500 [================>.............] - ETA: 52s - loss: 1.2306 - regression_loss: 1.0449 - classification_loss: 0.1857 290/500 [================>.............] - ETA: 52s - loss: 1.2322 - regression_loss: 1.0462 - classification_loss: 0.1860 291/500 [================>.............] 
- ETA: 52s - loss: 1.2315 - regression_loss: 1.0457 - classification_loss: 0.1858 292/500 [================>.............] - ETA: 51s - loss: 1.2313 - regression_loss: 1.0457 - classification_loss: 0.1856 293/500 [================>.............] - ETA: 51s - loss: 1.2301 - regression_loss: 1.0448 - classification_loss: 0.1852 294/500 [================>.............] - ETA: 51s - loss: 1.2281 - regression_loss: 1.0431 - classification_loss: 0.1849 295/500 [================>.............] - ETA: 51s - loss: 1.2329 - regression_loss: 1.0473 - classification_loss: 0.1856 296/500 [================>.............] - ETA: 50s - loss: 1.2325 - regression_loss: 1.0468 - classification_loss: 0.1857 297/500 [================>.............] - ETA: 50s - loss: 1.2338 - regression_loss: 1.0478 - classification_loss: 0.1860 298/500 [================>.............] - ETA: 50s - loss: 1.2360 - regression_loss: 1.0495 - classification_loss: 0.1865 299/500 [================>.............] - ETA: 50s - loss: 1.2344 - regression_loss: 1.0481 - classification_loss: 0.1862 300/500 [=================>............] - ETA: 50s - loss: 1.2350 - regression_loss: 1.0487 - classification_loss: 0.1863 301/500 [=================>............] - ETA: 49s - loss: 1.2347 - regression_loss: 1.0483 - classification_loss: 0.1863 302/500 [=================>............] - ETA: 49s - loss: 1.2353 - regression_loss: 1.0489 - classification_loss: 0.1865 303/500 [=================>............] - ETA: 49s - loss: 1.2349 - regression_loss: 1.0486 - classification_loss: 0.1863 304/500 [=================>............] - ETA: 49s - loss: 1.2319 - regression_loss: 1.0461 - classification_loss: 0.1858 305/500 [=================>............] - ETA: 48s - loss: 1.2314 - regression_loss: 1.0457 - classification_loss: 0.1857 306/500 [=================>............] - ETA: 48s - loss: 1.2331 - regression_loss: 1.0471 - classification_loss: 0.1860 307/500 [=================>............] 
- ETA: 48s - loss: 1.2346 - regression_loss: 1.0484 - classification_loss: 0.1863 308/500 [=================>............] - ETA: 48s - loss: 1.2358 - regression_loss: 1.0494 - classification_loss: 0.1864 309/500 [=================>............] - ETA: 47s - loss: 1.2360 - regression_loss: 1.0496 - classification_loss: 0.1864 310/500 [=================>............] - ETA: 47s - loss: 1.2364 - regression_loss: 1.0499 - classification_loss: 0.1865 311/500 [=================>............] - ETA: 47s - loss: 1.2348 - regression_loss: 1.0487 - classification_loss: 0.1861 312/500 [=================>............] - ETA: 47s - loss: 1.2375 - regression_loss: 1.0508 - classification_loss: 0.1867 313/500 [=================>............] - ETA: 46s - loss: 1.2365 - regression_loss: 1.0500 - classification_loss: 0.1865 314/500 [=================>............] - ETA: 46s - loss: 1.2361 - regression_loss: 1.0497 - classification_loss: 0.1864 315/500 [=================>............] - ETA: 46s - loss: 1.2376 - regression_loss: 1.0511 - classification_loss: 0.1866 316/500 [=================>............] - ETA: 46s - loss: 1.2352 - regression_loss: 1.0491 - classification_loss: 0.1862 317/500 [==================>...........] - ETA: 45s - loss: 1.2368 - regression_loss: 1.0505 - classification_loss: 0.1863 318/500 [==================>...........] - ETA: 45s - loss: 1.2374 - regression_loss: 1.0510 - classification_loss: 0.1864 319/500 [==================>...........] - ETA: 45s - loss: 1.2362 - regression_loss: 1.0500 - classification_loss: 0.1863 320/500 [==================>...........] - ETA: 45s - loss: 1.2351 - regression_loss: 1.0491 - classification_loss: 0.1860 321/500 [==================>...........] - ETA: 44s - loss: 1.2364 - regression_loss: 1.0503 - classification_loss: 0.1861 322/500 [==================>...........] - ETA: 44s - loss: 1.2369 - regression_loss: 1.0510 - classification_loss: 0.1859 323/500 [==================>...........] 
- ETA: 44s - loss: 1.2384 - regression_loss: 1.0522 - classification_loss: 0.1863 324/500 [==================>...........] - ETA: 44s - loss: 1.2392 - regression_loss: 1.0527 - classification_loss: 0.1864 325/500 [==================>...........] - ETA: 43s - loss: 1.2376 - regression_loss: 1.0515 - classification_loss: 0.1861 326/500 [==================>...........] - ETA: 43s - loss: 1.2362 - regression_loss: 1.0504 - classification_loss: 0.1859 327/500 [==================>...........] - ETA: 43s - loss: 1.2347 - regression_loss: 1.0472 - classification_loss: 0.1875 328/500 [==================>...........] - ETA: 43s - loss: 1.2330 - regression_loss: 1.0457 - classification_loss: 0.1873 329/500 [==================>...........] - ETA: 42s - loss: 1.2330 - regression_loss: 1.0458 - classification_loss: 0.1872 330/500 [==================>...........] - ETA: 42s - loss: 1.2345 - regression_loss: 1.0472 - classification_loss: 0.1874 331/500 [==================>...........] - ETA: 42s - loss: 1.2327 - regression_loss: 1.0456 - classification_loss: 0.1871 332/500 [==================>...........] - ETA: 42s - loss: 1.2335 - regression_loss: 1.0463 - classification_loss: 0.1872 333/500 [==================>...........] - ETA: 41s - loss: 1.2354 - regression_loss: 1.0479 - classification_loss: 0.1875 334/500 [===================>..........] - ETA: 41s - loss: 1.2361 - regression_loss: 1.0485 - classification_loss: 0.1876 335/500 [===================>..........] - ETA: 41s - loss: 1.2350 - regression_loss: 1.0476 - classification_loss: 0.1875 336/500 [===================>..........] - ETA: 41s - loss: 1.2344 - regression_loss: 1.0472 - classification_loss: 0.1872 337/500 [===================>..........] - ETA: 40s - loss: 1.2350 - regression_loss: 1.0477 - classification_loss: 0.1873 338/500 [===================>..........] - ETA: 40s - loss: 1.2336 - regression_loss: 1.0465 - classification_loss: 0.1871 339/500 [===================>..........] 
- ETA: 40s - loss: 1.2344 - regression_loss: 1.0472 - classification_loss: 0.1872 340/500 [===================>..........] - ETA: 40s - loss: 1.2338 - regression_loss: 1.0467 - classification_loss: 0.1871 341/500 [===================>..........] - ETA: 39s - loss: 1.2323 - regression_loss: 1.0455 - classification_loss: 0.1868 342/500 [===================>..........] - ETA: 39s - loss: 1.2311 - regression_loss: 1.0446 - classification_loss: 0.1866 343/500 [===================>..........] - ETA: 39s - loss: 1.2306 - regression_loss: 1.0442 - classification_loss: 0.1865 344/500 [===================>..........] - ETA: 39s - loss: 1.2306 - regression_loss: 1.0443 - classification_loss: 0.1863 345/500 [===================>..........] - ETA: 38s - loss: 1.2312 - regression_loss: 1.0449 - classification_loss: 0.1864 346/500 [===================>..........] - ETA: 38s - loss: 1.2293 - regression_loss: 1.0433 - classification_loss: 0.1860 347/500 [===================>..........] - ETA: 38s - loss: 1.2298 - regression_loss: 1.0436 - classification_loss: 0.1862 348/500 [===================>..........] - ETA: 38s - loss: 1.2303 - regression_loss: 1.0441 - classification_loss: 0.1862 349/500 [===================>..........] - ETA: 37s - loss: 1.2341 - regression_loss: 1.0471 - classification_loss: 0.1870 350/500 [====================>.........] - ETA: 37s - loss: 1.2336 - regression_loss: 1.0468 - classification_loss: 0.1868 351/500 [====================>.........] - ETA: 37s - loss: 1.2319 - regression_loss: 1.0455 - classification_loss: 0.1865 352/500 [====================>.........] - ETA: 37s - loss: 1.2319 - regression_loss: 1.0455 - classification_loss: 0.1865 353/500 [====================>.........] - ETA: 36s - loss: 1.2329 - regression_loss: 1.0463 - classification_loss: 0.1866 354/500 [====================>.........] - ETA: 36s - loss: 1.2344 - regression_loss: 1.0475 - classification_loss: 0.1869 355/500 [====================>.........] 
- ETA: 36s - loss: 1.2340 - regression_loss: 1.0471 - classification_loss: 0.1870 356/500 [====================>.........] - ETA: 36s - loss: 1.2340 - regression_loss: 1.0471 - classification_loss: 0.1870 357/500 [====================>.........] - ETA: 35s - loss: 1.2341 - regression_loss: 1.0472 - classification_loss: 0.1869 358/500 [====================>.........] - ETA: 35s - loss: 1.2336 - regression_loss: 1.0468 - classification_loss: 0.1868 359/500 [====================>.........] - ETA: 35s - loss: 1.2320 - regression_loss: 1.0455 - classification_loss: 0.1865 360/500 [====================>.........] - ETA: 35s - loss: 1.2308 - regression_loss: 1.0445 - classification_loss: 0.1863 361/500 [====================>.........] - ETA: 34s - loss: 1.2323 - regression_loss: 1.0458 - classification_loss: 0.1866 362/500 [====================>.........] - ETA: 34s - loss: 1.2327 - regression_loss: 1.0461 - classification_loss: 0.1866 363/500 [====================>.........] - ETA: 34s - loss: 1.2341 - regression_loss: 1.0471 - classification_loss: 0.1870 364/500 [====================>.........] - ETA: 34s - loss: 1.2348 - regression_loss: 1.0476 - classification_loss: 0.1871 365/500 [====================>.........] - ETA: 33s - loss: 1.2344 - regression_loss: 1.0473 - classification_loss: 0.1870 366/500 [====================>.........] - ETA: 33s - loss: 1.2355 - regression_loss: 1.0483 - classification_loss: 0.1872 367/500 [=====================>........] - ETA: 33s - loss: 1.2339 - regression_loss: 1.0470 - classification_loss: 0.1869 368/500 [=====================>........] - ETA: 33s - loss: 1.2340 - regression_loss: 1.0472 - classification_loss: 0.1868 369/500 [=====================>........] - ETA: 32s - loss: 1.2337 - regression_loss: 1.0469 - classification_loss: 0.1868 370/500 [=====================>........] - ETA: 32s - loss: 1.2326 - regression_loss: 1.0462 - classification_loss: 0.1864 371/500 [=====================>........] 
- ETA: 32s - loss: 1.2324 - regression_loss: 1.0459 - classification_loss: 0.1864 372/500 [=====================>........] - ETA: 32s - loss: 1.2326 - regression_loss: 1.0461 - classification_loss: 0.1866 373/500 [=====================>........] - ETA: 31s - loss: 1.2320 - regression_loss: 1.0455 - classification_loss: 0.1865 374/500 [=====================>........] - ETA: 31s - loss: 1.2303 - regression_loss: 1.0443 - classification_loss: 0.1861 375/500 [=====================>........] - ETA: 31s - loss: 1.2297 - regression_loss: 1.0439 - classification_loss: 0.1858 376/500 [=====================>........] - ETA: 31s - loss: 1.2304 - regression_loss: 1.0445 - classification_loss: 0.1859 377/500 [=====================>........] - ETA: 30s - loss: 1.2300 - regression_loss: 1.0442 - classification_loss: 0.1858 378/500 [=====================>........] - ETA: 30s - loss: 1.2297 - regression_loss: 1.0442 - classification_loss: 0.1856 379/500 [=====================>........] - ETA: 30s - loss: 1.2293 - regression_loss: 1.0439 - classification_loss: 0.1855 380/500 [=====================>........] - ETA: 30s - loss: 1.2304 - regression_loss: 1.0448 - classification_loss: 0.1856 381/500 [=====================>........] - ETA: 29s - loss: 1.2303 - regression_loss: 1.0447 - classification_loss: 0.1855 382/500 [=====================>........] - ETA: 29s - loss: 1.2282 - regression_loss: 1.0430 - classification_loss: 0.1851 383/500 [=====================>........] - ETA: 29s - loss: 1.2290 - regression_loss: 1.0437 - classification_loss: 0.1853 384/500 [======================>.......] - ETA: 29s - loss: 1.2287 - regression_loss: 1.0435 - classification_loss: 0.1852 385/500 [======================>.......] - ETA: 28s - loss: 1.2295 - regression_loss: 1.0442 - classification_loss: 0.1853 386/500 [======================>.......] - ETA: 28s - loss: 1.2314 - regression_loss: 1.0456 - classification_loss: 0.1858 387/500 [======================>.......] 
- ETA: 28s - loss: 1.2330 - regression_loss: 1.0468 - classification_loss: 0.1863 388/500 [======================>.......] - ETA: 28s - loss: 1.2311 - regression_loss: 1.0451 - classification_loss: 0.1859 389/500 [======================>.......] - ETA: 27s - loss: 1.2324 - regression_loss: 1.0465 - classification_loss: 0.1859 390/500 [======================>.......] - ETA: 27s - loss: 1.2326 - regression_loss: 1.0466 - classification_loss: 0.1860 391/500 [======================>.......] - ETA: 27s - loss: 1.2333 - regression_loss: 1.0472 - classification_loss: 0.1861 392/500 [======================>.......] - ETA: 27s - loss: 1.2340 - regression_loss: 1.0478 - classification_loss: 0.1862 393/500 [======================>.......] - ETA: 26s - loss: 1.2346 - regression_loss: 1.0483 - classification_loss: 0.1862 394/500 [======================>.......] - ETA: 26s - loss: 1.2357 - regression_loss: 1.0494 - classification_loss: 0.1863 395/500 [======================>.......] - ETA: 26s - loss: 1.2346 - regression_loss: 1.0484 - classification_loss: 0.1862 396/500 [======================>.......] - ETA: 26s - loss: 1.2356 - regression_loss: 1.0493 - classification_loss: 0.1863 397/500 [======================>.......] - ETA: 25s - loss: 1.2355 - regression_loss: 1.0493 - classification_loss: 0.1862 398/500 [======================>.......] - ETA: 25s - loss: 1.2335 - regression_loss: 1.0475 - classification_loss: 0.1860 399/500 [======================>.......] - ETA: 25s - loss: 1.2344 - regression_loss: 1.0481 - classification_loss: 0.1863 400/500 [=======================>......] - ETA: 25s - loss: 1.2354 - regression_loss: 1.0489 - classification_loss: 0.1866 401/500 [=======================>......] - ETA: 24s - loss: 1.2350 - regression_loss: 1.0484 - classification_loss: 0.1866 402/500 [=======================>......] - ETA: 24s - loss: 1.2338 - regression_loss: 1.0475 - classification_loss: 0.1863 403/500 [=======================>......] 
- ETA: 24s - loss: 1.2321 - regression_loss: 1.0461 - classification_loss: 0.1860 404/500 [=======================>......] - ETA: 24s - loss: 1.2318 - regression_loss: 1.0459 - classification_loss: 0.1859 405/500 [=======================>......] - ETA: 23s - loss: 1.2326 - regression_loss: 1.0466 - classification_loss: 0.1860 406/500 [=======================>......] - ETA: 23s - loss: 1.2337 - regression_loss: 1.0476 - classification_loss: 0.1860 407/500 [=======================>......] - ETA: 23s - loss: 1.2349 - regression_loss: 1.0485 - classification_loss: 0.1864 408/500 [=======================>......] - ETA: 23s - loss: 1.2336 - regression_loss: 1.0475 - classification_loss: 0.1861 409/500 [=======================>......] - ETA: 22s - loss: 1.2328 - regression_loss: 1.0469 - classification_loss: 0.1859 410/500 [=======================>......] - ETA: 22s - loss: 1.2337 - regression_loss: 1.0478 - classification_loss: 0.1859 411/500 [=======================>......] - ETA: 22s - loss: 1.2341 - regression_loss: 1.0481 - classification_loss: 0.1860 412/500 [=======================>......] - ETA: 22s - loss: 1.2337 - regression_loss: 1.0478 - classification_loss: 0.1859 413/500 [=======================>......] - ETA: 21s - loss: 1.2338 - regression_loss: 1.0479 - classification_loss: 0.1859 414/500 [=======================>......] - ETA: 21s - loss: 1.2337 - regression_loss: 1.0478 - classification_loss: 0.1858 415/500 [=======================>......] - ETA: 21s - loss: 1.2354 - regression_loss: 1.0493 - classification_loss: 0.1861 416/500 [=======================>......] - ETA: 21s - loss: 1.2351 - regression_loss: 1.0491 - classification_loss: 0.1860 417/500 [========================>.....] - ETA: 20s - loss: 1.2339 - regression_loss: 1.0481 - classification_loss: 0.1857 418/500 [========================>.....] - ETA: 20s - loss: 1.2329 - regression_loss: 1.0471 - classification_loss: 0.1858 419/500 [========================>.....] 
- ETA: 20s - loss: 1.2339 - regression_loss: 1.0480 - classification_loss: 0.1859 420/500 [========================>.....] - ETA: 20s - loss: 1.2341 - regression_loss: 1.0481 - classification_loss: 0.1860 421/500 [========================>.....] - ETA: 19s - loss: 1.2356 - regression_loss: 1.0489 - classification_loss: 0.1868 422/500 [========================>.....] - ETA: 19s - loss: 1.2363 - regression_loss: 1.0495 - classification_loss: 0.1868 423/500 [========================>.....] - ETA: 19s - loss: 1.2346 - regression_loss: 1.0481 - classification_loss: 0.1865 424/500 [========================>.....] - ETA: 19s - loss: 1.2343 - regression_loss: 1.0479 - classification_loss: 0.1864 425/500 [========================>.....] - ETA: 18s - loss: 1.2340 - regression_loss: 1.0478 - classification_loss: 0.1862 426/500 [========================>.....] - ETA: 18s - loss: 1.2340 - regression_loss: 1.0479 - classification_loss: 0.1862 427/500 [========================>.....] - ETA: 18s - loss: 1.2351 - regression_loss: 1.0488 - classification_loss: 0.1863 428/500 [========================>.....] - ETA: 18s - loss: 1.2352 - regression_loss: 1.0486 - classification_loss: 0.1865 429/500 [========================>.....] - ETA: 17s - loss: 1.2360 - regression_loss: 1.0493 - classification_loss: 0.1867 430/500 [========================>.....] - ETA: 17s - loss: 1.2365 - regression_loss: 1.0498 - classification_loss: 0.1867 431/500 [========================>.....] - ETA: 17s - loss: 1.2376 - regression_loss: 1.0507 - classification_loss: 0.1869 432/500 [========================>.....] - ETA: 17s - loss: 1.2390 - regression_loss: 1.0519 - classification_loss: 0.1871 433/500 [========================>.....] - ETA: 16s - loss: 1.2398 - regression_loss: 1.0523 - classification_loss: 0.1876 434/500 [=========================>....] - ETA: 16s - loss: 1.2402 - regression_loss: 1.0523 - classification_loss: 0.1878 435/500 [=========================>....] 
- ETA: 16s - loss: 1.2385 - regression_loss: 1.0510 - classification_loss: 0.1875 436/500 [=========================>....] - ETA: 16s - loss: 1.2376 - regression_loss: 1.0502 - classification_loss: 0.1874 437/500 [=========================>....] - ETA: 15s - loss: 1.2368 - regression_loss: 1.0496 - classification_loss: 0.1872 438/500 [=========================>....] - ETA: 15s - loss: 1.2372 - regression_loss: 1.0499 - classification_loss: 0.1874 439/500 [=========================>....] - ETA: 15s - loss: 1.2374 - regression_loss: 1.0500 - classification_loss: 0.1874 440/500 [=========================>....] - ETA: 15s - loss: 1.2359 - regression_loss: 1.0488 - classification_loss: 0.1871 441/500 [=========================>....] - ETA: 14s - loss: 1.2339 - regression_loss: 1.0471 - classification_loss: 0.1868 442/500 [=========================>....] - ETA: 14s - loss: 1.2352 - regression_loss: 1.0482 - classification_loss: 0.1871 443/500 [=========================>....] - ETA: 14s - loss: 1.2351 - regression_loss: 1.0480 - classification_loss: 0.1871 444/500 [=========================>....] - ETA: 14s - loss: 1.2344 - regression_loss: 1.0476 - classification_loss: 0.1868 445/500 [=========================>....] - ETA: 13s - loss: 1.2336 - regression_loss: 1.0468 - classification_loss: 0.1868 446/500 [=========================>....] - ETA: 13s - loss: 1.2349 - regression_loss: 1.0478 - classification_loss: 0.1871 447/500 [=========================>....] - ETA: 13s - loss: 1.2350 - regression_loss: 1.0478 - classification_loss: 0.1873 448/500 [=========================>....] - ETA: 13s - loss: 1.2362 - regression_loss: 1.0487 - classification_loss: 0.1875 449/500 [=========================>....] - ETA: 12s - loss: 1.2365 - regression_loss: 1.0489 - classification_loss: 0.1876 450/500 [==========================>...] - ETA: 12s - loss: 1.2375 - regression_loss: 1.0497 - classification_loss: 0.1878 451/500 [==========================>...] 
500/500 [==============================] - 125s 251ms/step - loss: 1.2310 - regression_loss: 1.0449 - classification_loss: 0.1861
1172 instances of class plum with average precision: 0.7003
mAP: 0.7003
Epoch 00135: saving model to ./training/snapshots/resnet50_pascal_135.h5
Epoch 00135: ReduceLROnPlateau reducing learning rate to 9.999999939225292e-10.
Epoch 136/150
- ETA: 53s - loss: 1.2625 - regression_loss: 1.0704 - classification_loss: 0.1920 286/500 [================>.............] - ETA: 53s - loss: 1.2605 - regression_loss: 1.0687 - classification_loss: 0.1918 287/500 [================>.............] - ETA: 53s - loss: 1.2608 - regression_loss: 1.0691 - classification_loss: 0.1917 288/500 [================>.............] - ETA: 53s - loss: 1.2610 - regression_loss: 1.0694 - classification_loss: 0.1916 289/500 [================>.............] - ETA: 52s - loss: 1.2617 - regression_loss: 1.0698 - classification_loss: 0.1919 290/500 [================>.............] - ETA: 52s - loss: 1.2618 - regression_loss: 1.0700 - classification_loss: 0.1917 291/500 [================>.............] - ETA: 52s - loss: 1.2642 - regression_loss: 1.0719 - classification_loss: 0.1923 292/500 [================>.............] - ETA: 52s - loss: 1.2632 - regression_loss: 1.0713 - classification_loss: 0.1919 293/500 [================>.............] - ETA: 51s - loss: 1.2654 - regression_loss: 1.0732 - classification_loss: 0.1922 294/500 [================>.............] - ETA: 51s - loss: 1.2645 - regression_loss: 1.0724 - classification_loss: 0.1921 295/500 [================>.............] - ETA: 51s - loss: 1.2639 - regression_loss: 1.0716 - classification_loss: 0.1922 296/500 [================>.............] - ETA: 51s - loss: 1.2627 - regression_loss: 1.0709 - classification_loss: 0.1919 297/500 [================>.............] - ETA: 50s - loss: 1.2642 - regression_loss: 1.0720 - classification_loss: 0.1921 298/500 [================>.............] - ETA: 50s - loss: 1.2652 - regression_loss: 1.0730 - classification_loss: 0.1922 299/500 [================>.............] - ETA: 50s - loss: 1.2669 - regression_loss: 1.0744 - classification_loss: 0.1925 300/500 [=================>............] - ETA: 50s - loss: 1.2674 - regression_loss: 1.0746 - classification_loss: 0.1927 301/500 [=================>............] 
- ETA: 49s - loss: 1.2683 - regression_loss: 1.0752 - classification_loss: 0.1931 302/500 [=================>............] - ETA: 49s - loss: 1.2699 - regression_loss: 1.0763 - classification_loss: 0.1935 303/500 [=================>............] - ETA: 49s - loss: 1.2688 - regression_loss: 1.0754 - classification_loss: 0.1934 304/500 [=================>............] - ETA: 49s - loss: 1.2707 - regression_loss: 1.0771 - classification_loss: 0.1937 305/500 [=================>............] - ETA: 48s - loss: 1.2721 - regression_loss: 1.0782 - classification_loss: 0.1939 306/500 [=================>............] - ETA: 48s - loss: 1.2736 - regression_loss: 1.0793 - classification_loss: 0.1943 307/500 [=================>............] - ETA: 48s - loss: 1.2738 - regression_loss: 1.0796 - classification_loss: 0.1943 308/500 [=================>............] - ETA: 48s - loss: 1.2714 - regression_loss: 1.0776 - classification_loss: 0.1938 309/500 [=================>............] - ETA: 47s - loss: 1.2718 - regression_loss: 1.0779 - classification_loss: 0.1939 310/500 [=================>............] - ETA: 47s - loss: 1.2719 - regression_loss: 1.0781 - classification_loss: 0.1938 311/500 [=================>............] - ETA: 47s - loss: 1.2718 - regression_loss: 1.0781 - classification_loss: 0.1938 312/500 [=================>............] - ETA: 47s - loss: 1.2727 - regression_loss: 1.0788 - classification_loss: 0.1939 313/500 [=================>............] - ETA: 46s - loss: 1.2738 - regression_loss: 1.0797 - classification_loss: 0.1940 314/500 [=================>............] - ETA: 46s - loss: 1.2738 - regression_loss: 1.0798 - classification_loss: 0.1940 315/500 [=================>............] - ETA: 46s - loss: 1.2758 - regression_loss: 1.0813 - classification_loss: 0.1945 316/500 [=================>............] - ETA: 46s - loss: 1.2741 - regression_loss: 1.0801 - classification_loss: 0.1940 317/500 [==================>...........] 
- ETA: 45s - loss: 1.2749 - regression_loss: 1.0809 - classification_loss: 0.1940 318/500 [==================>...........] - ETA: 45s - loss: 1.2771 - regression_loss: 1.0827 - classification_loss: 0.1945 319/500 [==================>...........] - ETA: 45s - loss: 1.2786 - regression_loss: 1.0839 - classification_loss: 0.1947 320/500 [==================>...........] - ETA: 45s - loss: 1.2784 - regression_loss: 1.0838 - classification_loss: 0.1946 321/500 [==================>...........] - ETA: 44s - loss: 1.2787 - regression_loss: 1.0838 - classification_loss: 0.1949 322/500 [==================>...........] - ETA: 44s - loss: 1.2773 - regression_loss: 1.0826 - classification_loss: 0.1947 323/500 [==================>...........] - ETA: 44s - loss: 1.2773 - regression_loss: 1.0828 - classification_loss: 0.1945 324/500 [==================>...........] - ETA: 44s - loss: 1.2771 - regression_loss: 1.0827 - classification_loss: 0.1943 325/500 [==================>...........] - ETA: 43s - loss: 1.2768 - regression_loss: 1.0824 - classification_loss: 0.1944 326/500 [==================>...........] - ETA: 43s - loss: 1.2794 - regression_loss: 1.0843 - classification_loss: 0.1951 327/500 [==================>...........] - ETA: 43s - loss: 1.2786 - regression_loss: 1.0838 - classification_loss: 0.1949 328/500 [==================>...........] - ETA: 43s - loss: 1.2795 - regression_loss: 1.0846 - classification_loss: 0.1950 329/500 [==================>...........] - ETA: 42s - loss: 1.2795 - regression_loss: 1.0844 - classification_loss: 0.1951 330/500 [==================>...........] - ETA: 42s - loss: 1.2770 - regression_loss: 1.0823 - classification_loss: 0.1946 331/500 [==================>...........] - ETA: 42s - loss: 1.2773 - regression_loss: 1.0827 - classification_loss: 0.1946 332/500 [==================>...........] - ETA: 42s - loss: 1.2756 - regression_loss: 1.0812 - classification_loss: 0.1944 333/500 [==================>...........] 
- ETA: 41s - loss: 1.2737 - regression_loss: 1.0797 - classification_loss: 0.1940 334/500 [===================>..........] - ETA: 41s - loss: 1.2715 - regression_loss: 1.0779 - classification_loss: 0.1936 335/500 [===================>..........] - ETA: 41s - loss: 1.2713 - regression_loss: 1.0778 - classification_loss: 0.1935 336/500 [===================>..........] - ETA: 41s - loss: 1.2691 - regression_loss: 1.0760 - classification_loss: 0.1931 337/500 [===================>..........] - ETA: 40s - loss: 1.2673 - regression_loss: 1.0745 - classification_loss: 0.1928 338/500 [===================>..........] - ETA: 40s - loss: 1.2655 - regression_loss: 1.0731 - classification_loss: 0.1924 339/500 [===================>..........] - ETA: 40s - loss: 1.2663 - regression_loss: 1.0736 - classification_loss: 0.1926 340/500 [===================>..........] - ETA: 40s - loss: 1.2667 - regression_loss: 1.0740 - classification_loss: 0.1927 341/500 [===================>..........] - ETA: 39s - loss: 1.2662 - regression_loss: 1.0736 - classification_loss: 0.1926 342/500 [===================>..........] - ETA: 39s - loss: 1.2661 - regression_loss: 1.0735 - classification_loss: 0.1925 343/500 [===================>..........] - ETA: 39s - loss: 1.2680 - regression_loss: 1.0751 - classification_loss: 0.1929 344/500 [===================>..........] - ETA: 39s - loss: 1.2668 - regression_loss: 1.0740 - classification_loss: 0.1928 345/500 [===================>..........] - ETA: 38s - loss: 1.2662 - regression_loss: 1.0736 - classification_loss: 0.1927 346/500 [===================>..........] - ETA: 38s - loss: 1.2684 - regression_loss: 1.0733 - classification_loss: 0.1950 347/500 [===================>..........] - ETA: 38s - loss: 1.2682 - regression_loss: 1.0733 - classification_loss: 0.1949 348/500 [===================>..........] - ETA: 38s - loss: 1.2674 - regression_loss: 1.0727 - classification_loss: 0.1947 349/500 [===================>..........] 
- ETA: 37s - loss: 1.2669 - regression_loss: 1.0722 - classification_loss: 0.1947 350/500 [====================>.........] - ETA: 37s - loss: 1.2694 - regression_loss: 1.0739 - classification_loss: 0.1955 351/500 [====================>.........] - ETA: 37s - loss: 1.2691 - regression_loss: 1.0735 - classification_loss: 0.1956 352/500 [====================>.........] - ETA: 37s - loss: 1.2697 - regression_loss: 1.0740 - classification_loss: 0.1957 353/500 [====================>.........] - ETA: 36s - loss: 1.2705 - regression_loss: 1.0745 - classification_loss: 0.1960 354/500 [====================>.........] - ETA: 36s - loss: 1.2709 - regression_loss: 1.0749 - classification_loss: 0.1960 355/500 [====================>.........] - ETA: 36s - loss: 1.2724 - regression_loss: 1.0759 - classification_loss: 0.1964 356/500 [====================>.........] - ETA: 36s - loss: 1.2724 - regression_loss: 1.0757 - classification_loss: 0.1967 357/500 [====================>.........] - ETA: 35s - loss: 1.2703 - regression_loss: 1.0741 - classification_loss: 0.1962 358/500 [====================>.........] - ETA: 35s - loss: 1.2689 - regression_loss: 1.0731 - classification_loss: 0.1958 359/500 [====================>.........] - ETA: 35s - loss: 1.2662 - regression_loss: 1.0709 - classification_loss: 0.1953 360/500 [====================>.........] - ETA: 35s - loss: 1.2650 - regression_loss: 1.0698 - classification_loss: 0.1952 361/500 [====================>.........] - ETA: 34s - loss: 1.2654 - regression_loss: 1.0702 - classification_loss: 0.1952 362/500 [====================>.........] - ETA: 34s - loss: 1.2640 - regression_loss: 1.0690 - classification_loss: 0.1950 363/500 [====================>.........] - ETA: 34s - loss: 1.2656 - regression_loss: 1.0705 - classification_loss: 0.1951 364/500 [====================>.........] - ETA: 34s - loss: 1.2646 - regression_loss: 1.0698 - classification_loss: 0.1948 365/500 [====================>.........] 
- ETA: 33s - loss: 1.2641 - regression_loss: 1.0694 - classification_loss: 0.1947 366/500 [====================>.........] - ETA: 33s - loss: 1.2649 - regression_loss: 1.0701 - classification_loss: 0.1948 367/500 [=====================>........] - ETA: 33s - loss: 1.2667 - regression_loss: 1.0716 - classification_loss: 0.1951 368/500 [=====================>........] - ETA: 33s - loss: 1.2661 - regression_loss: 1.0713 - classification_loss: 0.1948 369/500 [=====================>........] - ETA: 32s - loss: 1.2673 - regression_loss: 1.0723 - classification_loss: 0.1950 370/500 [=====================>........] - ETA: 32s - loss: 1.2675 - regression_loss: 1.0724 - classification_loss: 0.1951 371/500 [=====================>........] - ETA: 32s - loss: 1.2681 - regression_loss: 1.0729 - classification_loss: 0.1951 372/500 [=====================>........] - ETA: 32s - loss: 1.2657 - regression_loss: 1.0710 - classification_loss: 0.1947 373/500 [=====================>........] - ETA: 31s - loss: 1.2643 - regression_loss: 1.0699 - classification_loss: 0.1944 374/500 [=====================>........] - ETA: 31s - loss: 1.2652 - regression_loss: 1.0707 - classification_loss: 0.1945 375/500 [=====================>........] - ETA: 31s - loss: 1.2652 - regression_loss: 1.0705 - classification_loss: 0.1948 376/500 [=====================>........] - ETA: 31s - loss: 1.2653 - regression_loss: 1.0704 - classification_loss: 0.1948 377/500 [=====================>........] - ETA: 30s - loss: 1.2637 - regression_loss: 1.0693 - classification_loss: 0.1944 378/500 [=====================>........] - ETA: 30s - loss: 1.2640 - regression_loss: 1.0697 - classification_loss: 0.1944 379/500 [=====================>........] - ETA: 30s - loss: 1.2640 - regression_loss: 1.0697 - classification_loss: 0.1943 380/500 [=====================>........] - ETA: 30s - loss: 1.2637 - regression_loss: 1.0692 - classification_loss: 0.1944 381/500 [=====================>........] 
- ETA: 29s - loss: 1.2637 - regression_loss: 1.0691 - classification_loss: 0.1946 382/500 [=====================>........] - ETA: 29s - loss: 1.2627 - regression_loss: 1.0683 - classification_loss: 0.1944 383/500 [=====================>........] - ETA: 29s - loss: 1.2626 - regression_loss: 1.0683 - classification_loss: 0.1943 384/500 [======================>.......] - ETA: 29s - loss: 1.2629 - regression_loss: 1.0687 - classification_loss: 0.1942 385/500 [======================>.......] - ETA: 28s - loss: 1.2618 - regression_loss: 1.0679 - classification_loss: 0.1940 386/500 [======================>.......] - ETA: 28s - loss: 1.2602 - regression_loss: 1.0665 - classification_loss: 0.1938 387/500 [======================>.......] - ETA: 28s - loss: 1.2586 - regression_loss: 1.0650 - classification_loss: 0.1935 388/500 [======================>.......] - ETA: 28s - loss: 1.2601 - regression_loss: 1.0663 - classification_loss: 0.1938 389/500 [======================>.......] - ETA: 27s - loss: 1.2585 - regression_loss: 1.0649 - classification_loss: 0.1936 390/500 [======================>.......] - ETA: 27s - loss: 1.2572 - regression_loss: 1.0637 - classification_loss: 0.1934 391/500 [======================>.......] - ETA: 27s - loss: 1.2578 - regression_loss: 1.0643 - classification_loss: 0.1935 392/500 [======================>.......] - ETA: 27s - loss: 1.2574 - regression_loss: 1.0640 - classification_loss: 0.1934 393/500 [======================>.......] - ETA: 26s - loss: 1.2583 - regression_loss: 1.0648 - classification_loss: 0.1935 394/500 [======================>.......] - ETA: 26s - loss: 1.2570 - regression_loss: 1.0639 - classification_loss: 0.1931 395/500 [======================>.......] - ETA: 26s - loss: 1.2570 - regression_loss: 1.0640 - classification_loss: 0.1930 396/500 [======================>.......] - ETA: 26s - loss: 1.2570 - regression_loss: 1.0639 - classification_loss: 0.1931 397/500 [======================>.......] 
- ETA: 25s - loss: 1.2565 - regression_loss: 1.0634 - classification_loss: 0.1931 398/500 [======================>.......] - ETA: 25s - loss: 1.2548 - regression_loss: 1.0620 - classification_loss: 0.1928 399/500 [======================>.......] - ETA: 25s - loss: 1.2552 - regression_loss: 1.0624 - classification_loss: 0.1928 400/500 [=======================>......] - ETA: 25s - loss: 1.2560 - regression_loss: 1.0630 - classification_loss: 0.1929 401/500 [=======================>......] - ETA: 24s - loss: 1.2575 - regression_loss: 1.0643 - classification_loss: 0.1931 402/500 [=======================>......] - ETA: 24s - loss: 1.2607 - regression_loss: 1.0668 - classification_loss: 0.1940 403/500 [=======================>......] - ETA: 24s - loss: 1.2604 - regression_loss: 1.0664 - classification_loss: 0.1939 404/500 [=======================>......] - ETA: 24s - loss: 1.2606 - regression_loss: 1.0666 - classification_loss: 0.1939 405/500 [=======================>......] - ETA: 23s - loss: 1.2608 - regression_loss: 1.0667 - classification_loss: 0.1940 406/500 [=======================>......] - ETA: 23s - loss: 1.2592 - regression_loss: 1.0655 - classification_loss: 0.1937 407/500 [=======================>......] - ETA: 23s - loss: 1.2573 - regression_loss: 1.0639 - classification_loss: 0.1934 408/500 [=======================>......] - ETA: 23s - loss: 1.2577 - regression_loss: 1.0643 - classification_loss: 0.1934 409/500 [=======================>......] - ETA: 22s - loss: 1.2566 - regression_loss: 1.0635 - classification_loss: 0.1932 410/500 [=======================>......] - ETA: 22s - loss: 1.2545 - regression_loss: 1.0617 - classification_loss: 0.1929 411/500 [=======================>......] - ETA: 22s - loss: 1.2557 - regression_loss: 1.0626 - classification_loss: 0.1931 412/500 [=======================>......] - ETA: 22s - loss: 1.2548 - regression_loss: 1.0620 - classification_loss: 0.1929 413/500 [=======================>......] 
- ETA: 21s - loss: 1.2552 - regression_loss: 1.0623 - classification_loss: 0.1929 414/500 [=======================>......] - ETA: 21s - loss: 1.2549 - regression_loss: 1.0620 - classification_loss: 0.1930 415/500 [=======================>......] - ETA: 21s - loss: 1.2547 - regression_loss: 1.0617 - classification_loss: 0.1929 416/500 [=======================>......] - ETA: 21s - loss: 1.2545 - regression_loss: 1.0617 - classification_loss: 0.1928 417/500 [========================>.....] - ETA: 20s - loss: 1.2530 - regression_loss: 1.0604 - classification_loss: 0.1926 418/500 [========================>.....] - ETA: 20s - loss: 1.2526 - regression_loss: 1.0601 - classification_loss: 0.1925 419/500 [========================>.....] - ETA: 20s - loss: 1.2525 - regression_loss: 1.0601 - classification_loss: 0.1925 420/500 [========================>.....] - ETA: 20s - loss: 1.2502 - regression_loss: 1.0581 - classification_loss: 0.1921 421/500 [========================>.....] - ETA: 19s - loss: 1.2503 - regression_loss: 1.0583 - classification_loss: 0.1920 422/500 [========================>.....] - ETA: 19s - loss: 1.2494 - regression_loss: 1.0577 - classification_loss: 0.1918 423/500 [========================>.....] - ETA: 19s - loss: 1.2496 - regression_loss: 1.0576 - classification_loss: 0.1920 424/500 [========================>.....] - ETA: 19s - loss: 1.2495 - regression_loss: 1.0577 - classification_loss: 0.1918 425/500 [========================>.....] - ETA: 18s - loss: 1.2486 - regression_loss: 1.0569 - classification_loss: 0.1917 426/500 [========================>.....] - ETA: 18s - loss: 1.2477 - regression_loss: 1.0562 - classification_loss: 0.1915 427/500 [========================>.....] - ETA: 18s - loss: 1.2481 - regression_loss: 1.0565 - classification_loss: 0.1916 428/500 [========================>.....] - ETA: 18s - loss: 1.2478 - regression_loss: 1.0562 - classification_loss: 0.1916 429/500 [========================>.....] 
- ETA: 17s - loss: 1.2481 - regression_loss: 1.0565 - classification_loss: 0.1916 430/500 [========================>.....] - ETA: 17s - loss: 1.2490 - regression_loss: 1.0573 - classification_loss: 0.1917 431/500 [========================>.....] - ETA: 17s - loss: 1.2493 - regression_loss: 1.0573 - classification_loss: 0.1919 432/500 [========================>.....] - ETA: 17s - loss: 1.2490 - regression_loss: 1.0571 - classification_loss: 0.1919 433/500 [========================>.....] - ETA: 16s - loss: 1.2500 - regression_loss: 1.0580 - classification_loss: 0.1920 434/500 [=========================>....] - ETA: 16s - loss: 1.2488 - regression_loss: 1.0570 - classification_loss: 0.1918 435/500 [=========================>....] - ETA: 16s - loss: 1.2486 - regression_loss: 1.0568 - classification_loss: 0.1918 436/500 [=========================>....] - ETA: 16s - loss: 1.2470 - regression_loss: 1.0554 - classification_loss: 0.1915 437/500 [=========================>....] - ETA: 15s - loss: 1.2471 - regression_loss: 1.0555 - classification_loss: 0.1916 438/500 [=========================>....] - ETA: 15s - loss: 1.2474 - regression_loss: 1.0558 - classification_loss: 0.1916 439/500 [=========================>....] - ETA: 15s - loss: 1.2478 - regression_loss: 1.0562 - classification_loss: 0.1916 440/500 [=========================>....] - ETA: 15s - loss: 1.2477 - regression_loss: 1.0562 - classification_loss: 0.1915 441/500 [=========================>....] - ETA: 14s - loss: 1.2470 - regression_loss: 1.0557 - classification_loss: 0.1914 442/500 [=========================>....] - ETA: 14s - loss: 1.2477 - regression_loss: 1.0562 - classification_loss: 0.1915 443/500 [=========================>....] - ETA: 14s - loss: 1.2480 - regression_loss: 1.0565 - classification_loss: 0.1916 444/500 [=========================>....] - ETA: 14s - loss: 1.2462 - regression_loss: 1.0550 - classification_loss: 0.1912 445/500 [=========================>....] 
- ETA: 13s - loss: 1.2463 - regression_loss: 1.0551 - classification_loss: 0.1912 446/500 [=========================>....] - ETA: 13s - loss: 1.2466 - regression_loss: 1.0553 - classification_loss: 0.1913 447/500 [=========================>....] - ETA: 13s - loss: 1.2475 - regression_loss: 1.0563 - classification_loss: 0.1913 448/500 [=========================>....] - ETA: 13s - loss: 1.2466 - regression_loss: 1.0555 - classification_loss: 0.1911 449/500 [=========================>....] - ETA: 12s - loss: 1.2458 - regression_loss: 1.0549 - classification_loss: 0.1909 450/500 [==========================>...] - ETA: 12s - loss: 1.2467 - regression_loss: 1.0557 - classification_loss: 0.1910 451/500 [==========================>...] - ETA: 12s - loss: 1.2476 - regression_loss: 1.0565 - classification_loss: 0.1911 452/500 [==========================>...] - ETA: 12s - loss: 1.2478 - regression_loss: 1.0566 - classification_loss: 0.1912 453/500 [==========================>...] - ETA: 11s - loss: 1.2482 - regression_loss: 1.0570 - classification_loss: 0.1912 454/500 [==========================>...] - ETA: 11s - loss: 1.2496 - regression_loss: 1.0581 - classification_loss: 0.1915 455/500 [==========================>...] - ETA: 11s - loss: 1.2505 - regression_loss: 1.0587 - classification_loss: 0.1918 456/500 [==========================>...] - ETA: 11s - loss: 1.2518 - regression_loss: 1.0597 - classification_loss: 0.1921 457/500 [==========================>...] - ETA: 10s - loss: 1.2525 - regression_loss: 1.0603 - classification_loss: 0.1922 458/500 [==========================>...] - ETA: 10s - loss: 1.2526 - regression_loss: 1.0605 - classification_loss: 0.1921 459/500 [==========================>...] - ETA: 10s - loss: 1.2530 - regression_loss: 1.0608 - classification_loss: 0.1921 460/500 [==========================>...] - ETA: 10s - loss: 1.2520 - regression_loss: 1.0601 - classification_loss: 0.1920 461/500 [==========================>...] 
- ETA: 9s - loss: 1.2515 - regression_loss: 1.0595 - classification_loss: 0.1920  462/500 [==========================>...] - ETA: 9s - loss: 1.2499 - regression_loss: 1.0582 - classification_loss: 0.1917 463/500 [==========================>...] - ETA: 9s - loss: 1.2490 - regression_loss: 1.0574 - classification_loss: 0.1916 464/500 [==========================>...] - ETA: 9s - loss: 1.2488 - regression_loss: 1.0573 - classification_loss: 0.1915 465/500 [==========================>...] - ETA: 8s - loss: 1.2471 - regression_loss: 1.0560 - classification_loss: 0.1911 466/500 [==========================>...] - ETA: 8s - loss: 1.2476 - regression_loss: 1.0564 - classification_loss: 0.1913 467/500 [===========================>..] - ETA: 8s - loss: 1.2472 - regression_loss: 1.0559 - classification_loss: 0.1914 468/500 [===========================>..] - ETA: 8s - loss: 1.2480 - regression_loss: 1.0564 - classification_loss: 0.1916 469/500 [===========================>..] - ETA: 7s - loss: 1.2491 - regression_loss: 1.0571 - classification_loss: 0.1920 470/500 [===========================>..] - ETA: 7s - loss: 1.2495 - regression_loss: 1.0576 - classification_loss: 0.1919 471/500 [===========================>..] - ETA: 7s - loss: 1.2526 - regression_loss: 1.0602 - classification_loss: 0.1923 472/500 [===========================>..] - ETA: 7s - loss: 1.2528 - regression_loss: 1.0605 - classification_loss: 0.1923 473/500 [===========================>..] - ETA: 6s - loss: 1.2539 - regression_loss: 1.0614 - classification_loss: 0.1925 474/500 [===========================>..] - ETA: 6s - loss: 1.2542 - regression_loss: 1.0617 - classification_loss: 0.1925 475/500 [===========================>..] - ETA: 6s - loss: 1.2527 - regression_loss: 1.0606 - classification_loss: 0.1921 476/500 [===========================>..] - ETA: 6s - loss: 1.2545 - regression_loss: 1.0621 - classification_loss: 0.1925 477/500 [===========================>..] 
- ETA: 5s - loss: 1.2543 - regression_loss: 1.0618 - classification_loss: 0.1924 478/500 [===========================>..] - ETA: 5s - loss: 1.2540 - regression_loss: 1.0615 - classification_loss: 0.1924 479/500 [===========================>..] - ETA: 5s - loss: 1.2534 - regression_loss: 1.0610 - classification_loss: 0.1923 480/500 [===========================>..] - ETA: 5s - loss: 1.2539 - regression_loss: 1.0616 - classification_loss: 0.1923 481/500 [===========================>..] - ETA: 4s - loss: 1.2521 - regression_loss: 1.0601 - classification_loss: 0.1920 482/500 [===========================>..] - ETA: 4s - loss: 1.2516 - regression_loss: 1.0597 - classification_loss: 0.1919 483/500 [===========================>..] - ETA: 4s - loss: 1.2527 - regression_loss: 1.0606 - classification_loss: 0.1921 484/500 [============================>.] - ETA: 4s - loss: 1.2536 - regression_loss: 1.0612 - classification_loss: 0.1924 485/500 [============================>.] - ETA: 3s - loss: 1.2531 - regression_loss: 1.0608 - classification_loss: 0.1923 486/500 [============================>.] - ETA: 3s - loss: 1.2530 - regression_loss: 1.0607 - classification_loss: 0.1923 487/500 [============================>.] - ETA: 3s - loss: 1.2542 - regression_loss: 1.0617 - classification_loss: 0.1925 488/500 [============================>.] - ETA: 3s - loss: 1.2543 - regression_loss: 1.0617 - classification_loss: 0.1926 489/500 [============================>.] - ETA: 2s - loss: 1.2543 - regression_loss: 1.0618 - classification_loss: 0.1925 490/500 [============================>.] - ETA: 2s - loss: 1.2534 - regression_loss: 1.0611 - classification_loss: 0.1923 491/500 [============================>.] - ETA: 2s - loss: 1.2531 - regression_loss: 1.0609 - classification_loss: 0.1922 492/500 [============================>.] - ETA: 2s - loss: 1.2515 - regression_loss: 1.0596 - classification_loss: 0.1919 493/500 [============================>.] 
- ETA: 1s - loss: 1.2527 - regression_loss: 1.0606 - classification_loss: 0.1922 494/500 [============================>.] - ETA: 1s - loss: 1.2527 - regression_loss: 1.0606 - classification_loss: 0.1921 495/500 [============================>.] - ETA: 1s - loss: 1.2532 - regression_loss: 1.0611 - classification_loss: 0.1922 496/500 [============================>.] - ETA: 1s - loss: 1.2522 - regression_loss: 1.0602 - classification_loss: 0.1920 497/500 [============================>.] - ETA: 0s - loss: 1.2518 - regression_loss: 1.0599 - classification_loss: 0.1919 498/500 [============================>.] - ETA: 0s - loss: 1.2512 - regression_loss: 1.0595 - classification_loss: 0.1918 499/500 [============================>.] - ETA: 0s - loss: 1.2522 - regression_loss: 1.0602 - classification_loss: 0.1919 500/500 [==============================] - 125s 251ms/step - loss: 1.2523 - regression_loss: 1.0603 - classification_loss: 0.1920 1172 instances of class plum with average precision: 0.7003 mAP: 0.7003 Epoch 00136: saving model to ./training/snapshots/resnet50_pascal_136.h5 Epoch 137/150 1/500 [..............................] - ETA: 2:05 - loss: 1.4135 - regression_loss: 1.1897 - classification_loss: 0.2239 2/500 [..............................] - ETA: 2:04 - loss: 1.2081 - regression_loss: 1.0294 - classification_loss: 0.1787 3/500 [..............................] - ETA: 2:03 - loss: 0.9524 - regression_loss: 0.8042 - classification_loss: 0.1482 4/500 [..............................] - ETA: 2:03 - loss: 1.3040 - regression_loss: 1.1308 - classification_loss: 0.1732 5/500 [..............................] - ETA: 2:00 - loss: 1.2954 - regression_loss: 1.1314 - classification_loss: 0.1640 6/500 [..............................] - ETA: 2:00 - loss: 1.3261 - regression_loss: 1.1419 - classification_loss: 0.1842 7/500 [..............................] - ETA: 2:00 - loss: 1.2089 - regression_loss: 1.0418 - classification_loss: 0.1671 8/500 [..............................] 
- ETA: 2:01 - loss: 1.2390 - regression_loss: 1.0686 - classification_loss: 0.1704
[per-batch progress updates for batches 9-55 of epoch 137 elided; loss fluctuated between roughly 1.16 and 1.31]
56/500 [==>...........................]
- ETA: 1:51 - loss: 1.2633 - regression_loss: 1.0745 - classification_loss: 0.1888 57/500 [==>...........................] - ETA: 1:51 - loss: 1.2571 - regression_loss: 1.0701 - classification_loss: 0.1870 58/500 [==>...........................] - ETA: 1:50 - loss: 1.2602 - regression_loss: 1.0726 - classification_loss: 0.1875 59/500 [==>...........................] - ETA: 1:50 - loss: 1.2522 - regression_loss: 1.0648 - classification_loss: 0.1874 60/500 [==>...........................] - ETA: 1:50 - loss: 1.2520 - regression_loss: 1.0647 - classification_loss: 0.1873 61/500 [==>...........................] - ETA: 1:50 - loss: 1.2436 - regression_loss: 1.0575 - classification_loss: 0.1861 62/500 [==>...........................] - ETA: 1:49 - loss: 1.2371 - regression_loss: 1.0511 - classification_loss: 0.1860 63/500 [==>...........................] - ETA: 1:49 - loss: 1.2288 - regression_loss: 1.0453 - classification_loss: 0.1834 64/500 [==>...........................] - ETA: 1:49 - loss: 1.2337 - regression_loss: 1.0495 - classification_loss: 0.1842 65/500 [==>...........................] - ETA: 1:49 - loss: 1.2351 - regression_loss: 1.0506 - classification_loss: 0.1844 66/500 [==>...........................] - ETA: 1:48 - loss: 1.2461 - regression_loss: 1.0602 - classification_loss: 0.1860 67/500 [===>..........................] - ETA: 1:48 - loss: 1.2369 - regression_loss: 1.0524 - classification_loss: 0.1845 68/500 [===>..........................] - ETA: 1:48 - loss: 1.2388 - regression_loss: 1.0530 - classification_loss: 0.1858 69/500 [===>..........................] - ETA: 1:47 - loss: 1.2421 - regression_loss: 1.0556 - classification_loss: 0.1866 70/500 [===>..........................] - ETA: 1:47 - loss: 1.2416 - regression_loss: 1.0557 - classification_loss: 0.1859 71/500 [===>..........................] - ETA: 1:47 - loss: 1.2353 - regression_loss: 1.0511 - classification_loss: 0.1842 72/500 [===>..........................] 
- ETA: 1:47 - loss: 1.2271 - regression_loss: 1.0446 - classification_loss: 0.1825 73/500 [===>..........................] - ETA: 1:47 - loss: 1.2367 - regression_loss: 1.0523 - classification_loss: 0.1844 74/500 [===>..........................] - ETA: 1:46 - loss: 1.2295 - regression_loss: 1.0465 - classification_loss: 0.1830 75/500 [===>..........................] - ETA: 1:46 - loss: 1.2361 - regression_loss: 1.0519 - classification_loss: 0.1843 76/500 [===>..........................] - ETA: 1:46 - loss: 1.2245 - regression_loss: 1.0419 - classification_loss: 0.1825 77/500 [===>..........................] - ETA: 1:46 - loss: 1.2181 - regression_loss: 1.0368 - classification_loss: 0.1812 78/500 [===>..........................] - ETA: 1:45 - loss: 1.2102 - regression_loss: 1.0301 - classification_loss: 0.1801 79/500 [===>..........................] - ETA: 1:45 - loss: 1.2086 - regression_loss: 1.0290 - classification_loss: 0.1796 80/500 [===>..........................] - ETA: 1:45 - loss: 1.2153 - regression_loss: 1.0349 - classification_loss: 0.1804 81/500 [===>..........................] - ETA: 1:45 - loss: 1.2142 - regression_loss: 1.0343 - classification_loss: 0.1800 82/500 [===>..........................] - ETA: 1:44 - loss: 1.2085 - regression_loss: 1.0293 - classification_loss: 0.1791 83/500 [===>..........................] - ETA: 1:44 - loss: 1.2136 - regression_loss: 1.0337 - classification_loss: 0.1799 84/500 [====>.........................] - ETA: 1:44 - loss: 1.2087 - regression_loss: 1.0301 - classification_loss: 0.1786 85/500 [====>.........................] - ETA: 1:44 - loss: 1.2030 - regression_loss: 1.0253 - classification_loss: 0.1777 86/500 [====>.........................] - ETA: 1:43 - loss: 1.2093 - regression_loss: 1.0309 - classification_loss: 0.1784 87/500 [====>.........................] - ETA: 1:43 - loss: 1.2150 - regression_loss: 1.0358 - classification_loss: 0.1792 88/500 [====>.........................] 
- ETA: 1:43 - loss: 1.2135 - regression_loss: 1.0345 - classification_loss: 0.1789 89/500 [====>.........................] - ETA: 1:43 - loss: 1.2100 - regression_loss: 1.0299 - classification_loss: 0.1801 90/500 [====>.........................] - ETA: 1:42 - loss: 1.2147 - regression_loss: 1.0341 - classification_loss: 0.1806 91/500 [====>.........................] - ETA: 1:42 - loss: 1.2186 - regression_loss: 1.0369 - classification_loss: 0.1817 92/500 [====>.........................] - ETA: 1:42 - loss: 1.2148 - regression_loss: 1.0345 - classification_loss: 0.1803 93/500 [====>.........................] - ETA: 1:42 - loss: 1.2176 - regression_loss: 1.0367 - classification_loss: 0.1809 94/500 [====>.........................] - ETA: 1:41 - loss: 1.2221 - regression_loss: 1.0397 - classification_loss: 0.1824 95/500 [====>.........................] - ETA: 1:41 - loss: 1.2241 - regression_loss: 1.0415 - classification_loss: 0.1827 96/500 [====>.........................] - ETA: 1:41 - loss: 1.2220 - regression_loss: 1.0401 - classification_loss: 0.1820 97/500 [====>.........................] - ETA: 1:41 - loss: 1.2262 - regression_loss: 1.0433 - classification_loss: 0.1829 98/500 [====>.........................] - ETA: 1:40 - loss: 1.2278 - regression_loss: 1.0450 - classification_loss: 0.1828 99/500 [====>.........................] - ETA: 1:40 - loss: 1.2296 - regression_loss: 1.0471 - classification_loss: 0.1825 100/500 [=====>........................] - ETA: 1:40 - loss: 1.2304 - regression_loss: 1.0479 - classification_loss: 0.1826 101/500 [=====>........................] - ETA: 1:40 - loss: 1.2269 - regression_loss: 1.0449 - classification_loss: 0.1821 102/500 [=====>........................] - ETA: 1:40 - loss: 1.2317 - regression_loss: 1.0489 - classification_loss: 0.1828 103/500 [=====>........................] - ETA: 1:39 - loss: 1.2270 - regression_loss: 1.0450 - classification_loss: 0.1820 104/500 [=====>........................] 
- ETA: 1:39 - loss: 1.2277 - regression_loss: 1.0457 - classification_loss: 0.1820 105/500 [=====>........................] - ETA: 1:39 - loss: 1.2220 - regression_loss: 1.0409 - classification_loss: 0.1811 106/500 [=====>........................] - ETA: 1:39 - loss: 1.2206 - regression_loss: 1.0390 - classification_loss: 0.1816 107/500 [=====>........................] - ETA: 1:38 - loss: 1.2188 - regression_loss: 1.0367 - classification_loss: 0.1821 108/500 [=====>........................] - ETA: 1:38 - loss: 1.2200 - regression_loss: 1.0381 - classification_loss: 0.1818 109/500 [=====>........................] - ETA: 1:38 - loss: 1.2229 - regression_loss: 1.0409 - classification_loss: 0.1820 110/500 [=====>........................] - ETA: 1:38 - loss: 1.2192 - regression_loss: 1.0376 - classification_loss: 0.1817 111/500 [=====>........................] - ETA: 1:37 - loss: 1.2143 - regression_loss: 1.0330 - classification_loss: 0.1813 112/500 [=====>........................] - ETA: 1:37 - loss: 1.2144 - regression_loss: 1.0326 - classification_loss: 0.1818 113/500 [=====>........................] - ETA: 1:37 - loss: 1.2154 - regression_loss: 1.0338 - classification_loss: 0.1816 114/500 [=====>........................] - ETA: 1:36 - loss: 1.2184 - regression_loss: 1.0357 - classification_loss: 0.1827 115/500 [=====>........................] - ETA: 1:36 - loss: 1.2173 - regression_loss: 1.0352 - classification_loss: 0.1821 116/500 [=====>........................] - ETA: 1:36 - loss: 1.2180 - regression_loss: 1.0355 - classification_loss: 0.1825 117/500 [======>.......................] - ETA: 1:35 - loss: 1.2178 - regression_loss: 1.0346 - classification_loss: 0.1832 118/500 [======>.......................] - ETA: 1:35 - loss: 1.2119 - regression_loss: 1.0299 - classification_loss: 0.1820 119/500 [======>.......................] - ETA: 1:35 - loss: 1.2087 - regression_loss: 1.0270 - classification_loss: 0.1817 120/500 [======>.......................] 
- ETA: 1:35 - loss: 1.2037 - regression_loss: 1.0229 - classification_loss: 0.1809 121/500 [======>.......................] - ETA: 1:34 - loss: 1.2059 - regression_loss: 1.0246 - classification_loss: 0.1813 122/500 [======>.......................] - ETA: 1:34 - loss: 1.2123 - regression_loss: 1.0302 - classification_loss: 0.1821 123/500 [======>.......................] - ETA: 1:34 - loss: 1.2097 - regression_loss: 1.0275 - classification_loss: 0.1822 124/500 [======>.......................] - ETA: 1:34 - loss: 1.2144 - regression_loss: 1.0313 - classification_loss: 0.1831 125/500 [======>.......................] - ETA: 1:33 - loss: 1.2168 - regression_loss: 1.0333 - classification_loss: 0.1835 126/500 [======>.......................] - ETA: 1:33 - loss: 1.2191 - regression_loss: 1.0353 - classification_loss: 0.1838 127/500 [======>.......................] - ETA: 1:33 - loss: 1.2160 - regression_loss: 1.0328 - classification_loss: 0.1831 128/500 [======>.......................] - ETA: 1:32 - loss: 1.2171 - regression_loss: 1.0339 - classification_loss: 0.1832 129/500 [======>.......................] - ETA: 1:32 - loss: 1.2146 - regression_loss: 1.0319 - classification_loss: 0.1827 130/500 [======>.......................] - ETA: 1:32 - loss: 1.2113 - regression_loss: 1.0295 - classification_loss: 0.1819 131/500 [======>.......................] - ETA: 1:32 - loss: 1.2131 - regression_loss: 1.0309 - classification_loss: 0.1822 132/500 [======>.......................] - ETA: 1:31 - loss: 1.2148 - regression_loss: 1.0327 - classification_loss: 0.1821 133/500 [======>.......................] - ETA: 1:31 - loss: 1.2174 - regression_loss: 1.0348 - classification_loss: 0.1826 134/500 [=======>......................] - ETA: 1:31 - loss: 1.2154 - regression_loss: 1.0334 - classification_loss: 0.1820 135/500 [=======>......................] - ETA: 1:31 - loss: 1.2160 - regression_loss: 1.0344 - classification_loss: 0.1816 136/500 [=======>......................] 
- ETA: 1:31 - loss: 1.2170 - regression_loss: 1.0352 - classification_loss: 0.1818 137/500 [=======>......................] - ETA: 1:30 - loss: 1.2198 - regression_loss: 1.0371 - classification_loss: 0.1827 138/500 [=======>......................] - ETA: 1:30 - loss: 1.2165 - regression_loss: 1.0341 - classification_loss: 0.1824 139/500 [=======>......................] - ETA: 1:30 - loss: 1.2184 - regression_loss: 1.0355 - classification_loss: 0.1829 140/500 [=======>......................] - ETA: 1:29 - loss: 1.2207 - regression_loss: 1.0377 - classification_loss: 0.1830 141/500 [=======>......................] - ETA: 1:29 - loss: 1.2162 - regression_loss: 1.0337 - classification_loss: 0.1824 142/500 [=======>......................] - ETA: 1:29 - loss: 1.2135 - regression_loss: 1.0315 - classification_loss: 0.1820 143/500 [=======>......................] - ETA: 1:29 - loss: 1.2081 - regression_loss: 1.0269 - classification_loss: 0.1812 144/500 [=======>......................] - ETA: 1:29 - loss: 1.2078 - regression_loss: 1.0267 - classification_loss: 0.1811 145/500 [=======>......................] - ETA: 1:28 - loss: 1.2088 - regression_loss: 1.0277 - classification_loss: 0.1811 146/500 [=======>......................] - ETA: 1:28 - loss: 1.2120 - regression_loss: 1.0302 - classification_loss: 0.1818 147/500 [=======>......................] - ETA: 1:28 - loss: 1.2146 - regression_loss: 1.0323 - classification_loss: 0.1823 148/500 [=======>......................] - ETA: 1:28 - loss: 1.2164 - regression_loss: 1.0339 - classification_loss: 0.1826 149/500 [=======>......................] - ETA: 1:27 - loss: 1.2162 - regression_loss: 1.0337 - classification_loss: 0.1825 150/500 [========>.....................] - ETA: 1:27 - loss: 1.2124 - regression_loss: 1.0307 - classification_loss: 0.1817 151/500 [========>.....................] - ETA: 1:27 - loss: 1.2144 - regression_loss: 1.0325 - classification_loss: 0.1819 152/500 [========>.....................] 
- ETA: 1:27 - loss: 1.2132 - regression_loss: 1.0322 - classification_loss: 0.1811 153/500 [========>.....................] - ETA: 1:26 - loss: 1.2139 - regression_loss: 1.0329 - classification_loss: 0.1809 154/500 [========>.....................] - ETA: 1:26 - loss: 1.2088 - regression_loss: 1.0287 - classification_loss: 0.1801 155/500 [========>.....................] - ETA: 1:26 - loss: 1.2134 - regression_loss: 1.0324 - classification_loss: 0.1809 156/500 [========>.....................] - ETA: 1:26 - loss: 1.2173 - regression_loss: 1.0359 - classification_loss: 0.1815 157/500 [========>.....................] - ETA: 1:25 - loss: 1.2200 - regression_loss: 1.0379 - classification_loss: 0.1821 158/500 [========>.....................] - ETA: 1:25 - loss: 1.2199 - regression_loss: 1.0375 - classification_loss: 0.1824 159/500 [========>.....................] - ETA: 1:25 - loss: 1.2178 - regression_loss: 1.0360 - classification_loss: 0.1818 160/500 [========>.....................] - ETA: 1:25 - loss: 1.2170 - regression_loss: 1.0352 - classification_loss: 0.1818 161/500 [========>.....................] - ETA: 1:24 - loss: 1.2194 - regression_loss: 1.0374 - classification_loss: 0.1820 162/500 [========>.....................] - ETA: 1:24 - loss: 1.2147 - regression_loss: 1.0336 - classification_loss: 0.1812 163/500 [========>.....................] - ETA: 1:24 - loss: 1.2183 - regression_loss: 1.0365 - classification_loss: 0.1818 164/500 [========>.....................] - ETA: 1:24 - loss: 1.2170 - regression_loss: 1.0353 - classification_loss: 0.1817 165/500 [========>.....................] - ETA: 1:23 - loss: 1.2165 - regression_loss: 1.0350 - classification_loss: 0.1815 166/500 [========>.....................] - ETA: 1:23 - loss: 1.2189 - regression_loss: 1.0370 - classification_loss: 0.1819 167/500 [=========>....................] - ETA: 1:23 - loss: 1.2254 - regression_loss: 1.0429 - classification_loss: 0.1825 168/500 [=========>....................] 
- ETA: 1:23 - loss: 1.2265 - regression_loss: 1.0437 - classification_loss: 0.1828 169/500 [=========>....................] - ETA: 1:23 - loss: 1.2282 - regression_loss: 1.0449 - classification_loss: 0.1833 170/500 [=========>....................] - ETA: 1:22 - loss: 1.2307 - regression_loss: 1.0472 - classification_loss: 0.1835 171/500 [=========>....................] - ETA: 1:22 - loss: 1.2305 - regression_loss: 1.0471 - classification_loss: 0.1834 172/500 [=========>....................] - ETA: 1:22 - loss: 1.2303 - regression_loss: 1.0471 - classification_loss: 0.1832 173/500 [=========>....................] - ETA: 1:22 - loss: 1.2315 - regression_loss: 1.0481 - classification_loss: 0.1833 174/500 [=========>....................] - ETA: 1:21 - loss: 1.2306 - regression_loss: 1.0475 - classification_loss: 0.1831 175/500 [=========>....................] - ETA: 1:21 - loss: 1.2327 - regression_loss: 1.0495 - classification_loss: 0.1832 176/500 [=========>....................] - ETA: 1:21 - loss: 1.2336 - regression_loss: 1.0504 - classification_loss: 0.1833 177/500 [=========>....................] - ETA: 1:21 - loss: 1.2305 - regression_loss: 1.0475 - classification_loss: 0.1830 178/500 [=========>....................] - ETA: 1:20 - loss: 1.2259 - regression_loss: 1.0437 - classification_loss: 0.1822 179/500 [=========>....................] - ETA: 1:20 - loss: 1.2299 - regression_loss: 1.0468 - classification_loss: 0.1831 180/500 [=========>....................] - ETA: 1:20 - loss: 1.2314 - regression_loss: 1.0481 - classification_loss: 0.1833 181/500 [=========>....................] - ETA: 1:19 - loss: 1.2330 - regression_loss: 1.0496 - classification_loss: 0.1834 182/500 [=========>....................] - ETA: 1:19 - loss: 1.2370 - regression_loss: 1.0525 - classification_loss: 0.1844 183/500 [=========>....................] - ETA: 1:19 - loss: 1.2368 - regression_loss: 1.0527 - classification_loss: 0.1841 184/500 [==========>...................] 
- ETA: 1:19 - loss: 1.2407 - regression_loss: 1.0558 - classification_loss: 0.1848 185/500 [==========>...................] - ETA: 1:19 - loss: 1.2437 - regression_loss: 1.0583 - classification_loss: 0.1854 186/500 [==========>...................] - ETA: 1:18 - loss: 1.2455 - regression_loss: 1.0598 - classification_loss: 0.1857 187/500 [==========>...................] - ETA: 1:18 - loss: 1.2461 - regression_loss: 1.0607 - classification_loss: 0.1854 188/500 [==========>...................] - ETA: 1:18 - loss: 1.2427 - regression_loss: 1.0581 - classification_loss: 0.1846 189/500 [==========>...................] - ETA: 1:18 - loss: 1.2466 - regression_loss: 1.0609 - classification_loss: 0.1857 190/500 [==========>...................] - ETA: 1:17 - loss: 1.2488 - regression_loss: 1.0625 - classification_loss: 0.1863 191/500 [==========>...................] - ETA: 1:17 - loss: 1.2485 - regression_loss: 1.0624 - classification_loss: 0.1861 192/500 [==========>...................] - ETA: 1:17 - loss: 1.2463 - regression_loss: 1.0605 - classification_loss: 0.1858 193/500 [==========>...................] - ETA: 1:17 - loss: 1.2425 - regression_loss: 1.0572 - classification_loss: 0.1853 194/500 [==========>...................] - ETA: 1:16 - loss: 1.2418 - regression_loss: 1.0566 - classification_loss: 0.1852 195/500 [==========>...................] - ETA: 1:16 - loss: 1.2420 - regression_loss: 1.0565 - classification_loss: 0.1855 196/500 [==========>...................] - ETA: 1:16 - loss: 1.2393 - regression_loss: 1.0540 - classification_loss: 0.1853 197/500 [==========>...................] - ETA: 1:16 - loss: 1.2398 - regression_loss: 1.0545 - classification_loss: 0.1853 198/500 [==========>...................] - ETA: 1:15 - loss: 1.2409 - regression_loss: 1.0554 - classification_loss: 0.1855 199/500 [==========>...................] - ETA: 1:15 - loss: 1.2439 - regression_loss: 1.0579 - classification_loss: 0.1860 200/500 [===========>..................] 
- ETA: 1:15 - loss: 1.2449 - regression_loss: 1.0587 - classification_loss: 0.1863 201/500 [===========>..................] - ETA: 1:15 - loss: 1.2455 - regression_loss: 1.0592 - classification_loss: 0.1863 202/500 [===========>..................] - ETA: 1:14 - loss: 1.2441 - regression_loss: 1.0581 - classification_loss: 0.1860 203/500 [===========>..................] - ETA: 1:14 - loss: 1.2452 - regression_loss: 1.0592 - classification_loss: 0.1861 204/500 [===========>..................] - ETA: 1:14 - loss: 1.2447 - regression_loss: 1.0589 - classification_loss: 0.1858 205/500 [===========>..................] - ETA: 1:13 - loss: 1.2442 - regression_loss: 1.0586 - classification_loss: 0.1856 206/500 [===========>..................] - ETA: 1:13 - loss: 1.2437 - regression_loss: 1.0580 - classification_loss: 0.1857 207/500 [===========>..................] - ETA: 1:13 - loss: 1.2442 - regression_loss: 1.0584 - classification_loss: 0.1858 208/500 [===========>..................] - ETA: 1:13 - loss: 1.2426 - regression_loss: 1.0571 - classification_loss: 0.1855 209/500 [===========>..................] - ETA: 1:12 - loss: 1.2451 - regression_loss: 1.0589 - classification_loss: 0.1862 210/500 [===========>..................] - ETA: 1:12 - loss: 1.2448 - regression_loss: 1.0588 - classification_loss: 0.1860 211/500 [===========>..................] - ETA: 1:12 - loss: 1.2448 - regression_loss: 1.0589 - classification_loss: 0.1859 212/500 [===========>..................] - ETA: 1:12 - loss: 1.2488 - regression_loss: 1.0624 - classification_loss: 0.1865 213/500 [===========>..................] - ETA: 1:11 - loss: 1.2462 - regression_loss: 1.0604 - classification_loss: 0.1858 214/500 [===========>..................] - ETA: 1:11 - loss: 1.2462 - regression_loss: 1.0585 - classification_loss: 0.1877 215/500 [===========>..................] - ETA: 1:11 - loss: 1.2453 - regression_loss: 1.0576 - classification_loss: 0.1877 216/500 [===========>..................] 
- ETA: 1:11 - loss: 1.2457 - regression_loss: 1.0583 - classification_loss: 0.1874 217/500 [============>.................] - ETA: 1:10 - loss: 1.2491 - regression_loss: 1.0605 - classification_loss: 0.1886 218/500 [============>.................] - ETA: 1:10 - loss: 1.2505 - regression_loss: 1.0616 - classification_loss: 0.1889 219/500 [============>.................] - ETA: 1:10 - loss: 1.2526 - regression_loss: 1.0633 - classification_loss: 0.1893 220/500 [============>.................] - ETA: 1:10 - loss: 1.2519 - regression_loss: 1.0627 - classification_loss: 0.1892 221/500 [============>.................] - ETA: 1:10 - loss: 1.2496 - regression_loss: 1.0607 - classification_loss: 0.1890 222/500 [============>.................] - ETA: 1:09 - loss: 1.2500 - regression_loss: 1.0610 - classification_loss: 0.1890 223/500 [============>.................] - ETA: 1:09 - loss: 1.2528 - regression_loss: 1.0632 - classification_loss: 0.1896 224/500 [============>.................] - ETA: 1:09 - loss: 1.2556 - regression_loss: 1.0652 - classification_loss: 0.1904 225/500 [============>.................] - ETA: 1:09 - loss: 1.2566 - regression_loss: 1.0658 - classification_loss: 0.1908 226/500 [============>.................] - ETA: 1:08 - loss: 1.2552 - regression_loss: 1.0646 - classification_loss: 0.1906 227/500 [============>.................] - ETA: 1:08 - loss: 1.2573 - regression_loss: 1.0662 - classification_loss: 0.1911 228/500 [============>.................] - ETA: 1:08 - loss: 1.2569 - regression_loss: 1.0659 - classification_loss: 0.1909 229/500 [============>.................] - ETA: 1:08 - loss: 1.2565 - regression_loss: 1.0656 - classification_loss: 0.1909 230/500 [============>.................] - ETA: 1:07 - loss: 1.2549 - regression_loss: 1.0644 - classification_loss: 0.1905 231/500 [============>.................] - ETA: 1:07 - loss: 1.2567 - regression_loss: 1.0661 - classification_loss: 0.1906 232/500 [============>.................] 
- ETA: 1:07 - loss: 1.2568 - regression_loss: 1.0662 - classification_loss: 0.1906 233/500 [============>.................] - ETA: 1:07 - loss: 1.2559 - regression_loss: 1.0656 - classification_loss: 0.1904 234/500 [=============>................] - ETA: 1:06 - loss: 1.2533 - regression_loss: 1.0635 - classification_loss: 0.1898 235/500 [=============>................] - ETA: 1:06 - loss: 1.2503 - regression_loss: 1.0610 - classification_loss: 0.1893 236/500 [=============>................] - ETA: 1:06 - loss: 1.2531 - regression_loss: 1.0634 - classification_loss: 0.1897 237/500 [=============>................] - ETA: 1:06 - loss: 1.2530 - regression_loss: 1.0633 - classification_loss: 0.1897 238/500 [=============>................] - ETA: 1:05 - loss: 1.2556 - regression_loss: 1.0653 - classification_loss: 0.1903 239/500 [=============>................] - ETA: 1:05 - loss: 1.2529 - regression_loss: 1.0630 - classification_loss: 0.1899 240/500 [=============>................] - ETA: 1:05 - loss: 1.2520 - regression_loss: 1.0624 - classification_loss: 0.1897 241/500 [=============>................] - ETA: 1:05 - loss: 1.2517 - regression_loss: 1.0621 - classification_loss: 0.1896 242/500 [=============>................] - ETA: 1:04 - loss: 1.2496 - regression_loss: 1.0604 - classification_loss: 0.1892 243/500 [=============>................] - ETA: 1:04 - loss: 1.2469 - regression_loss: 1.0582 - classification_loss: 0.1887 244/500 [=============>................] - ETA: 1:04 - loss: 1.2447 - regression_loss: 1.0563 - classification_loss: 0.1884 245/500 [=============>................] - ETA: 1:04 - loss: 1.2415 - regression_loss: 1.0537 - classification_loss: 0.1878 246/500 [=============>................] - ETA: 1:03 - loss: 1.2426 - regression_loss: 1.0546 - classification_loss: 0.1879 247/500 [=============>................] - ETA: 1:03 - loss: 1.2397 - regression_loss: 1.0523 - classification_loss: 0.1874 248/500 [=============>................] 
- ETA: 1:03 - loss: 1.2375 - regression_loss: 1.0506 - classification_loss: 0.1869 249/500 [=============>................] - ETA: 1:03 - loss: 1.2402 - regression_loss: 1.0528 - classification_loss: 0.1874 250/500 [==============>...............] - ETA: 1:02 - loss: 1.2432 - regression_loss: 1.0550 - classification_loss: 0.1882 251/500 [==============>...............] - ETA: 1:02 - loss: 1.2404 - regression_loss: 1.0528 - classification_loss: 0.1876 252/500 [==============>...............] - ETA: 1:02 - loss: 1.2395 - regression_loss: 1.0520 - classification_loss: 0.1875 253/500 [==============>...............] - ETA: 1:01 - loss: 1.2409 - regression_loss: 1.0530 - classification_loss: 0.1879 254/500 [==============>...............] - ETA: 1:01 - loss: 1.2397 - regression_loss: 1.0522 - classification_loss: 0.1875 255/500 [==============>...............] - ETA: 1:01 - loss: 1.2381 - regression_loss: 1.0509 - classification_loss: 0.1872 256/500 [==============>...............] - ETA: 1:01 - loss: 1.2382 - regression_loss: 1.0510 - classification_loss: 0.1872 257/500 [==============>...............] - ETA: 1:01 - loss: 1.2376 - regression_loss: 1.0500 - classification_loss: 0.1877 258/500 [==============>...............] - ETA: 1:00 - loss: 1.2367 - regression_loss: 1.0494 - classification_loss: 0.1874 259/500 [==============>...............] - ETA: 1:00 - loss: 1.2372 - regression_loss: 1.0499 - classification_loss: 0.1873 260/500 [==============>...............] - ETA: 1:00 - loss: 1.2359 - regression_loss: 1.0488 - classification_loss: 0.1871 261/500 [==============>...............] - ETA: 1:00 - loss: 1.2340 - regression_loss: 1.0473 - classification_loss: 0.1866 262/500 [==============>...............] - ETA: 59s - loss: 1.2335 - regression_loss: 1.0470 - classification_loss: 0.1865  263/500 [==============>...............] - ETA: 59s - loss: 1.2327 - regression_loss: 1.0464 - classification_loss: 0.1864 264/500 [==============>...............] 
... (per-batch progress updates for steps 265-499 of epoch 137 omitted) ...
500/500 [==============================] - 126s 251ms/step - loss: 1.2407 - regression_loss: 1.0532 - classification_loss: 0.1875
1172 instances of class plum with average precision: 0.7003
mAP: 0.7003
Epoch 00137: saving model to ./training/snapshots/resnet50_pascal_137.h5